
Haptix - Movie-based haptic immersion for Android

Haptix captures audio playing on your device in real time and creates synchronized haptic feedback from environmental sounds. Human voices are filtered out to focus purely on atmospheric immersion.

Kotlin · Jetpack Compose · Material 3 · MVVM Architecture · Hilt · Coroutines · Flow · Android AudioPlaybackCapture API · Foreground Service · Android SDK

Overview

Haptix is an open-source Android application that transforms real-time audio into intelligent haptic feedback. It enhances the movie-watching experience by converting environmental sounds into synchronized vibration patterns. Instead of producing random vibrations, Haptix analyzes audio frequencies, reduces dialogue influence, and reacts primarily to cinematic sound elements such as explosions, footsteps, ambient tension, rain, and impact effects.

The purpose of Haptix is simple: to make movies feel physical.

Built with performance, privacy, and immersive design in mind, Haptix runs entirely on-device and focuses on delivering a smooth, responsive, and distraction-free experience.

Key Features

  • Real-time audio analysis and processing
  • Frequency-based vibration mapping
  • Dialogue suppression to emphasize environmental sound
  • Multiple genre-based experience modes
  • Customizable haptic intensity
  • Clean Material 3 user interface
  • Fully offline functionality
  • No tracking, no analytics, no data collection

Experience Modes

Haptix includes specialized vibration modes tailored for different content styles.

Action Mode generates strong and dynamic vibration bursts for high-intensity scenes such as explosions and combat sequences.

Horror Mode delivers sharp, sudden feedback patterns to enhance suspense and jump scares.

Cinematic Mode provides smoother and more subtle vibration layers for immersive storytelling.

Custom Mode allows users to fine-tune intensity and responsiveness for a personalized experience.
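The four modes above suggest a small set of tuning parameters per mode. The following is a minimal sketch of how such presets might be modeled; the type names and numeric values are assumptions for illustration, not taken from the Haptix codebase.

```kotlin
// Hypothetical genre presets. Each mode scales overall intensity, controls
// how fast a burst ramps up, and biases between sustained rumble and sharp hits.
enum class HapticMode(
    val intensityScale: Double,   // multiplier applied to computed vibration strength
    val attackMs: Long,           // how quickly a burst reaches full strength
    val sustainBias: Double       // 0 = favor sharp peaks, 1 = favor sustained feedback
) {
    ACTION(intensityScale = 1.0, attackMs = 10, sustainBias = 0.3),
    HORROR(intensityScale = 0.9, attackMs = 5, sustainBias = 0.1),
    CINEMATIC(intensityScale = 0.6, attackMs = 40, sustainBias = 0.7)
}

// Custom Mode modeled as user-tunable values rather than a fixed enum entry.
data class CustomMode(
    val intensityScale: Double,
    val attackMs: Long,
    val sustainBias: Double
)
```

Modeling Custom Mode as a separate data class keeps the fixed presets immutable while still letting user-adjusted values flow through the same tuning parameters.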

How It Works

Haptix uses Android’s AudioPlaybackCapture API to capture system audio while media is playing. The captured audio signal is processed in real time to detect amplitude shifts and frequency patterns.
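Capture setup aside, the amplitude-shift detection described here can be sketched in pure Kotlin over the 16-bit PCM buffers that playback capture typically delivers. The function names, thresholds, and smoothing constants are illustrative assumptions, not Haptix's actual implementation.

```kotlin
import kotlin.math.sqrt

/** Root-mean-square level of one 16-bit PCM buffer, normalized to 0.0..1.0. */
fun rmsLevel(samples: ShortArray): Double {
    if (samples.isEmpty()) return 0.0
    var sumSquares = 0.0
    for (s in samples) {
        val x = s / 32768.0            // normalize a 16-bit sample to -1.0..1.0
        sumSquares += x * x
    }
    return sqrt(sumSquares / samples.size)
}

/** A "sudden peak": the current level jumps well above a smoothed running level. */
class PeakDetector(
    private val ratio: Double = 2.5,     // how far above average counts as a peak
    private val smoothing: Double = 0.9  // exponential smoothing of the running level
) {
    private var average = 0.0
    fun isPeak(level: Double): Boolean {
        val peak = average > 0.01 && level > average * ratio
        average = smoothing * average + (1 - smoothing) * level
        return peak
    }
}
```

Each captured buffer would be reduced to one RMS value, and the detector flags the abrupt jumps that the next section maps to short vibration bursts.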

  • Low frequencies produce deeper vibration responses
  • Sudden peaks trigger short vibration bursts
  • Sustained sounds generate continuous feedback
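The three mapping rules above can be sketched as one pure function. The command type and numeric ranges here are assumptions; on Android the result would feed something like `VibrationEffect.createOneShot(durationMs, amplitude)`.

```kotlin
// Amplitude is the 1..255 range Android vibrators accept.
data class VibrationCommand(val durationMs: Long, val amplitude: Int)

/**
 * Illustrative mapping: sudden peaks win (short hard burst), then low-frequency
 * energy (deep rumble), then sustained level (continuous bed); silence maps to null.
 */
fun mapToVibration(lowBand: Double, isPeak: Boolean, sustained: Double): VibrationCommand? {
    val amp = { level: Double -> (level * 255).toInt().coerceIn(1, 255) }
    return when {
        isPeak           -> VibrationCommand(durationMs = 40, amplitude = 255)
        lowBand > 0.2    -> VibrationCommand(durationMs = 120, amplitude = amp(lowBand))
        sustained > 0.15 -> VibrationCommand(durationMs = 250, amplitude = amp(sustained * 0.6))
        else             -> null   // too quiet: no feedback
    }
}
```

Keeping the mapping a pure function of measured levels makes it easy to unit-test and to re-tune per experience mode.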

All processing occurs locally on the device. No audio is recorded, stored, or transmitted.

Tech Stack

  • Kotlin
  • Jetpack Compose with Material 3
  • MVVM Architecture
  • Hilt for Dependency Injection
  • Coroutines and Flow for asynchronous processing
  • Foreground Service for continuous audio monitoring

The architecture follows clean separation of concerns to ensure maintainability and scalability.
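One way to picture that separation, with names invented here for illustration: the audio layer sits behind an interface, the state holder owns the logic, and the UI or service layer only consumes its output. This is a sketch of the layering, not Haptix's actual classes.

```kotlin
// Audio layer behind an interface, so it can be swapped or faked in tests.
interface AudioAnalyzer {
    fun currentLevel(): Double               // 0.0..1.0, from the capture pipeline
}

// Logic layer: combines the analyzer's output with user settings.
class HapticStateHolder(private val analyzer: AudioAnalyzer) {
    var intensity: Double = 1.0              // user-set multiplier (Custom Mode)
    fun nextAmplitude(): Int =               // what the UI/service layer consumes
        (analyzer.currentLevel() * intensity * 255).toInt().coerceIn(0, 255)
}
```

Because the analyzer is an interface, the vibration logic can be tested with a fake source and no Android dependencies.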

Requirements

  • Android 10 or higher
  • Device with vibration hardware
  • Screen capture (media projection) permission enabled, which Android requires for playback capture
  • Background activity allowed for optimal performance

Installation

Install via APK

Download the latest APK release and install it on your Android device. Grant the required permissions when prompted.

Build from Source

Clone the repository:

git clone https://github.com/HasibulHasan098/Haptix.git

Open the project in Android Studio, sync dependencies, and run it on a physical Android device.

Privacy

Haptix does not collect user data.
No analytics.
No background network communication.

All audio processing is handled locally and temporarily in memory.

Vision

Haptix explores the intersection of sound and touch, pushing beyond traditional audiovisual experiences. By adding tactile feedback to cinematic content, it introduces a new sensory layer to storytelling and mobile entertainment.

This project represents experimentation in real-time signal processing and immersive mobile interaction design.

The Challenge

Building Haptix involved several technical and design challenges, especially because it operates in real time and interacts directly with system-level audio and hardware components.

One of the biggest challenges was real-time audio processing. Capturing system audio using the AudioPlaybackCapture API while maintaining low latency required careful optimization. Any delay between sound and vibration would break immersion, so processing had to be lightweight and efficient.

Another major challenge was dialogue suppression. Movies contain overlapping layers of sound, and isolating environmental effects from speech is far from straightforward. Designing frequency-based filtering that reduces vocal dominance while still preserving impactful background sounds required multiple tuning iterations.

Device hardware variability also posed issues. Different Android devices have different vibration motors with varying strengths, response times, and precision levels. Creating a consistent experience across a wide range of devices required adaptive intensity scaling and testing.

Battery consumption was another concern. Continuous foreground audio processing and vibration feedback can increase power usage. The architecture had to balance performance with efficiency to prevent excessive drain.

Handling Android permissions, especially screen capture permissions and background service execution restrictions, required careful lifecycle management to ensure stability without crashes or unexpected service termination.

Finally, designing an immersive yet minimal user interface was important. The UI needed to stay clean and modern while providing enough control for customization without overwhelming the user.

These challenges shaped Haptix into a performance-focused, privacy-respecting, and immersive Android application.

The Solution

Overcoming the technical and architectural challenges in Haptix required a performance-first, system-aware approach.

For real-time audio processing, lightweight signal analysis techniques were used instead of heavy digital signal processing libraries. By focusing on amplitude detection and frequency band mapping rather than full spectrum decomposition, latency was minimized and vibration feedback remained tightly synchronized with audio output.

To address dialogue suppression, frequency-based filtering was applied to reduce the dominance of vocal ranges while amplifying environmental sound bands. Continuous tuning and real-device testing helped balance clarity and immersion without muting important cinematic elements.
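Per-band energy of the kind this filtering needs can be measured cheaply without a full FFT, for example with the Goertzel algorithm. The sketch below weights a bass probe frequency up and a typical speech-band frequency (roughly 300 Hz–3 kHz) down; the probe frequencies and weights are illustrative tuning values, not Haptix's actual filter design.

```kotlin
import kotlin.math.PI
import kotlin.math.cos

/** Goertzel algorithm: power of a single frequency in one block of samples. */
fun goertzelPower(samples: DoubleArray, freqHz: Double, sampleRate: Double): Double {
    val w = 2.0 * PI * freqHz / sampleRate
    val coeff = 2.0 * cos(w)
    var s1 = 0.0
    var s2 = 0.0
    for (x in samples) {
        val s0 = x + coeff * s1 - s2
        s2 = s1
        s1 = s0
    }
    return s1 * s1 + s2 * s2 - coeff * s1 * s2
}

/** Weighted level: bass emphasized, the speech band attenuated. */
fun suppressedLevel(samples: DoubleArray, sampleRate: Double): Double {
    val bass = goertzelPower(samples, 80.0, sampleRate)      // rumble/impact band
    val voice = goertzelPower(samples, 1000.0, sampleRate)   // mid-speech band
    return bass * 1.0 + voice * 0.2                          // illustrative weights
}
```

A per-frequency probe like this costs a single pass per band, which keeps latency low compared with full spectrum decomposition.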

Hardware variability across Android devices was handled by introducing adaptive intensity scaling. Instead of hardcoding vibration strength, the system dynamically adjusts output levels based on detected amplitude ranges. Custom Mode also allows users to manually fine-tune intensity according to their device’s vibration motor capability.
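Adaptive scaling of this kind can be sketched as a running-peak normalizer: output is expressed relative to the loudest recent audio rather than an absolute threshold. The decay constant and floor below are invented tuning values, shown only to make the idea concrete.

```kotlin
/** Normalizes raw levels against a slowly decaying running peak. */
class AdaptiveScaler(
    private val decay: Double = 0.999,  // how slowly the running peak relaxes
    private val floor: Double = 0.05    // prevents divide-by-near-zero on quiet audio
) {
    private var runningPeak = floor

    /** Map a raw level to 0.0..1.0 relative to the loudest recent audio. */
    fun scale(level: Double): Double {
        runningPeak = maxOf(level, runningPeak * decay, floor)
        return (level / runningPeak).coerceIn(0.0, 1.0)
    }
}
```

The effect is that quiet sources still produce meaningful feedback while loud sources do not pin every vibration at maximum, and a per-device intensity multiplier (Custom Mode) can be applied on top of the normalized value.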

Battery efficiency was improved by running audio processing inside a properly managed foreground service. Coroutines and Flow were used to handle asynchronous operations efficiently, preventing unnecessary CPU wake-ups and reducing background overhead.

Permission and lifecycle management were handled carefully by integrating Android’s modern APIs and structured architecture. The app ensures screen capture permissions are requested safely, and the foreground service is managed according to system guidelines to avoid unexpected termination.

From a design perspective, Jetpack Compose with Material 3 was used to create a clean, responsive, and minimal interface. The UI focuses on clarity and usability while still offering meaningful customization options.

Together, these solutions resulted in a stable, immersive, privacy-focused application that delivers synchronized haptic feedback without compromising performance or user control.

The Outcome

Haptix successfully delivers a real-time, immersive haptic layer that enhances movie and video experiences on Android devices. The application achieves low-latency synchronization between audio and vibration, creating a tactile response that feels naturally connected to on-screen events.

Through optimized audio analysis and adaptive vibration scaling, the app performs smoothly across a wide range of devices while maintaining reasonable battery efficiency. The dialogue-aware filtering approach ensures that feedback focuses on environmental and cinematic elements rather than distracting vocal vibrations.

The clean Material 3 interface provides an intuitive user experience, allowing users to switch modes or customize intensity without complexity. The architecture remains scalable and maintainable, making future enhancements and feature additions straightforward.

Most importantly, Haptix proves that immersive sensory computing on mobile devices is practical and achievable using modern Android tools. It demonstrates how sound and touch can be combined to create a richer, more engaging entertainment experience without compromising privacy or performance.