
Wearables and Extended Reality

Timeline: 2021-2023
Focus: Research, Prototyping, User Studies
Context

Throughout my graduate studies and early career, I explored the intersection of wearable technology, spatial computing, and human-computer interaction. My work spans research prototyping through published studies, focusing on how emerging technologies can enhance accessibility and create more immersive user experiences.

My Role

I collaborated with cross-functional teams of researchers, engineers, and domain experts to design, prototype, and evaluate novel interaction paradigms for VR and AR platforms.

Research Focus

My extended reality work centers on three key areas: accessibility through multimodal feedback systems, spatial interaction design for immersive environments, and rapid prototyping methodologies for emerging platforms. Each project below represents a different facet of this research journey.

Tools & Platforms: Unity, C#, Meta Quest, Haptics Studio, ARCore, Mixed Reality Toolkit

Milestones & Achievements
  • Published research at ACM ASSETS 2024, one of the premier conferences on accessible computing and human-computer interaction.
  • Designed and developed functional AR prototypes with the Google ARCore Geospatial API for place-based education at a historical site.
  • Conducted user studies with diverse participant groups to validate interaction designs and gather qualitative insights.

The following showcases key projects from my wearables and extended reality portfolio.

1. SoundHapticVR: Spatial Haptic Feedback for VR Accessibility
Research Context

Virtual Reality experiences rely heavily on spatial audio for immersion and navigation. However, users who are deaf or hard of hearing (DHH) miss critical audio cues that indicate direction, distance, and environmental context. SoundHapticVR explores how head-mounted haptic feedback can provide an alternative sensory channel for spatial sound information.
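The specific haptic encoding strategies are described in the paper; purely as an illustration of the underlying idea, the sketch below shows one way a sound source's direction and distance could be mapped to intensities on a ring of head-mounted actuators in Unity. The actuator layout, the cosine falloff, and the IHapticRing interface are assumptions for this example, not the published design.

using UnityEngine;

// Illustrative sketch only: maps a sound source's direction and distance to
// intensities on a ring of head-mounted actuators. The actuator count, the
// cosine falloff, and the IHapticRing interface are assumptions for this
// example, not the encoding published in the SoundHapticVR paper.
public interface IHapticRing
{
    int ActuatorCount { get; }
    void SetIntensity(int actuatorIndex, float intensity01);
}

public class HeadHapticMapper : MonoBehaviour
{
    public Transform head;            // the listener's head (e.g. the HMD camera)
    public Transform soundSource;     // the virtual sound emitter to encode
    public float maxDistance = 10f;   // beyond this, haptic output fades to zero

    private IHapticRing ring;         // wraps the actual haptic hardware SDK

    public void Initialize(IHapticRing hapticRing) => ring = hapticRing;

    void Update()
    {
        if (ring == null) return;

        // Direction to the sound in the head's local (horizontal) frame.
        Vector3 local = head.InverseTransformPoint(soundSource.position);
        Vector3 flat = new Vector3(local.x, 0f, local.z).normalized;

        // Stronger output when the source is close, silent beyond maxDistance.
        float distance = local.magnitude;
        float gain = Mathf.Clamp01(1f - distance / maxDistance);

        for (int i = 0; i < ring.ActuatorCount; i++)
        {
            // Each actuator sits at a fixed azimuth around the head.
            float azimuth = i * Mathf.PI * 2f / ring.ActuatorCount;
            Vector3 actuatorDir = new Vector3(Mathf.Sin(azimuth), 0f, Mathf.Cos(azimuth));

            // Cosine falloff: actuators facing the source vibrate hardest.
            float alignment = Mathf.Max(0f, Vector3.Dot(actuatorDir, flat));
            ring.SetIntensity(i, alignment * gain);
        }
    }
}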

My Contribution
  • Designed the interaction framework mapping spatial audio properties to haptic feedback patterns around the head.
  • Developed functional VR prototypes in Unity, integrated with the bHaptics TactSuit for haptic output.
  • Conducted user studies with DHH participants to evaluate the effectiveness of different haptic encoding strategies.
  • Lead author of the research paper published at ACM ASSETS 2024.
Outcome

The research demonstrated that head-based haptic feedback can effectively convey spatial audio information, with participants successfully identifying sound direction and urgency through haptic patterns alone. The findings contribute to making VR more accessible for the DHH community.

Behind the Scenes: Prototype Development & User Studies
Image gallery: SoundHapticVR component breakdown, component iterations, and user study sessions.

For full research details and findings, see the full paper published at ACM ASSETS 2024.
2. AR Geospatial Prototyping
Project Context

Location-based AR experiences require precise positioning and contextual awareness to create meaningful interactions with the physical world. This exploratory project investigated how geospatial APIs and visual positioning systems (VPS) can enable persistent, world-anchored AR content for educational and cultural heritage applications.
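As a rough sketch of the core mechanic rather than the project's actual code, the snippet below places a world-anchored object at a known latitude/longitude using the ARCore Geospatial API via ARCore Extensions for Unity's AR Foundation; the coordinates, prefab, and component wiring are placeholders.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using Google.XR.ARCoreExtensions;

// Illustrative sketch of placing persistent, world-anchored AR content with the
// ARCore Geospatial API (ARCore Extensions for AR Foundation). Coordinates,
// prefab, and component wiring are placeholders, not the project's actual values.
public class HistoricalMarkerPlacer : MonoBehaviour
{
    public AREarthManager earthManager;
    public ARAnchorManager anchorManager;
    public GameObject markerPrefab;          // e.g. an information panel about the site

    // Placeholder coordinates for a point of interest (latitude, longitude, altitude in meters).
    public double latitude = 52.5163;
    public double longitude = 13.3777;
    public double altitude = 35.0;

    private bool placed;

    void Update()
    {
        // Wait until the visual positioning system (VPS) is tracking the Earth pose.
        if (placed || earthManager.EarthTrackingState != TrackingState.Tracking)
            return;

        // Anchor with an identity east-up-north rotation; content orientation is set on the prefab.
        ARGeospatialAnchor anchor =
            anchorManager.AddAnchor(latitude, longitude, altitude, Quaternion.identity);

        if (anchor != null)
        {
            Instantiate(markerPrefab, anchor.transform);
            placed = true;
        }
    }
}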

My Contribution
  • Prototyped AR experiences using the Google ARCore Geospatial API.
  • Designed interaction patterns for discovering and engaging with location-anchored content.
  • Conducted field testing to evaluate positioning accuracy and user experience across different environments.
  • Created documentation and design guidelines for geospatial AR development.
Outcome

The project produced functional prototypes demonstrating how AR can surface historical and contextual information tied to specific physical locations, along with insights into the technical constraints and design considerations for building reliable geospatial AR experiences.

Behind the Scenes: Development Process & Field Testing
Image gallery: geospatial AR field testing and Unity development.
3. Perceptual Study on Spatial Audio Algorithm @Bose
Image: VR experiment setup for the Bose perceptual study.
Project Context

As part of my Spring 2023 Co-Op at Bose Corporation R&D Labs, I was tasked with understanding the magnitude of perceptual errors in audio externalization algorithms and the importance of personalized HRTFs (Head-Related Transfer Functions) for spatial audio rendering.

Tools: MATLAB, C#, Unity, Python, OptiTrack Motion Capture, Bose QC35, Meta Quest 2

My Contribution

As a Technical Intern Co-Op, I was responsible for proposing an experimental design, building the prototype, and conducting user studies. I also showcased the experiment design setup at Bose's Annual Internal Tech Event "Bose Lab Expo" in 2023.

Experiment & Key Findings

We designed a localization accuracy experiment to evaluate how precisely the center image of a sound is placed in virtual 3D auditory space, comparing generic HRTFs against personalized measurements. The study involved 8 "Critical Listeners" whose personalized HRTFs were measured, with Bose QC35 headphones for audio playback and a Meta Quest 2 providing the visual environment.

Our findings showed negligible elevation error overall—while participants occasionally perceived the sound slightly higher, the offset remained close to center. More importantly, personalized HRTFs consistently improved localization accuracy for the majority of subjects, validating the value of personalization in spatial audio rendering.
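The detailed analysis is covered by the NDA note below; purely to illustrate what azimuth and elevation error mean in this kind of localization task, the sketch below computes the angular offsets between a target sound direction and the direction a participant reports (for example, by pointing a tracked controller). The frame conventions and names are assumptions for this example, not the Bose pipeline.

using UnityEngine;

// Purely illustrative: computes azimuth and elevation offsets between a target
// sound direction and the direction a participant reports (e.g. by pointing a
// tracked controller). This is not the Bose analysis pipeline, which is under NDA.
public static class LocalizationError
{
    public struct AzEl
    {
        public float azimuthDeg;    // horizontal angle, positive to the right of straight ahead
        public float elevationDeg;  // vertical angle, positive above the horizontal plane
    }

    // Convert a direction (in the listener's head frame) to azimuth/elevation.
    public static AzEl ToAzEl(Vector3 headLocalDirection)
    {
        Vector3 d = headLocalDirection.normalized;
        return new AzEl
        {
            azimuthDeg = Mathf.Atan2(d.x, d.z) * Mathf.Rad2Deg,
            elevationDeg = Mathf.Asin(Mathf.Clamp(d.y, -1f, 1f)) * Mathf.Rad2Deg
        };
    }

    // Signed errors for one trial: reported direction minus target direction.
    public static AzEl Error(Vector3 targetDir, Vector3 reportedDir)
    {
        AzEl t = ToAzEl(targetDir);
        AzEl r = ToAzEl(reportedDir);
        return new AzEl
        {
            azimuthDeg = Mathf.DeltaAngle(t.azimuthDeg, r.azimuthDeg),
            elevationDeg = r.elevationDeg - t.elevationDeg
        };
    }
}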

Additional Contribution
  • The VR app and experimental setup were showcased at the Bose Annual Internal Research Exhibition, receiving positive qualitative feedback—especially for the use of a multimodal VR setup to study sound perception.
  • Wrote usage documentation to support both task execution and future development, enabling researchers to continue work without requiring extensive VR knowledge.
Behind the Scenes: Experiment Setup & Results
Image gallery: experiment process and final setup for the Bose perceptual study.

Note: Data analysis and detailed processes are not disclosed due to NDA; the full methodology can be shared upon request.

4. Other Contributions
Visceral Notice for Eye Tracking

I prototyped an experience to enhance users' awareness of when the eye-tracking feature is in use. This included designing concepts for obtaining user consent to collect eye data and for clearly indicating when tracking is active. The prototype was later used in an academic study, which resulted in a full paper published at an international conference.
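As a minimal sketch of the pattern rather than the study prototype itself, the snippet below gates eye-data collection behind explicit consent and keeps an indicator visible whenever tracking is active; the consent UI, the indicator, and the gaze callback are assumptions for this example.

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of a "visceral notice" pattern: eye data is only collected after
// explicit consent, and an on-screen indicator stays visible whenever tracking is
// active. The consent UI, indicator, and tracking hook are assumptions for this
// example, not the prototype used in the published study.
public class EyeTrackingNotice : MonoBehaviour
{
    public Button consentButton;     // user explicitly opts in to eye tracking
    public Button revokeButton;      // user can withdraw consent at any time
    public Image trackingIndicator;  // e.g. a pulsing eye icon shown while tracking

    private bool consentGiven;

    void Start()
    {
        consentButton.onClick.AddListener(() => SetConsent(true));
        revokeButton.onClick.AddListener(() => SetConsent(false));
        SetConsent(false);
    }

    void SetConsent(bool granted)
    {
        consentGiven = granted;
        trackingIndicator.enabled = granted;   // indicator mirrors tracking state
    }

    // Called by the eye-tracking pipeline for each new gaze sample.
    public void OnGazeSample(Vector3 gazeDirection, float timestamp)
    {
        if (!consentGiven) return;             // drop data unless the user opted in
        // ... log or forward the sample only while consent is active ...
    }
}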

VR Eye Tracking Data with Pupil Labs Integration

This was my first project using Unity, where I experimented with building a VR interface to explore the relationship between visual memory and saccadic eye movements. I created a simple Visual Memory Game in VR, integrated Pupil Labs' eye-tracking framework, and captured users' eye-movement data for analysis.
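As a hedged sketch of the data-capture side, assuming a gaze callback similar to the one exposed by Pupil Labs' hmd-eyes Unity plugin (the callback signature and the confidence field here are assumptions), the snippet below logs timestamped gaze samples to a CSV file for offline analysis.

using System.Globalization;
using System.IO;
using UnityEngine;

// Sketch of logging gaze samples to CSV for offline saccade analysis. The
// OnGazeReceived entry point stands in for whatever callback the eye-tracking
// plugin provides (e.g. Pupil Labs' hmd-eyes); its signature and the confidence
// field are assumptions for this example.
public class GazeLogger : MonoBehaviour
{
    private StreamWriter writer;

    void Start()
    {
        string path = Path.Combine(Application.persistentDataPath, "gaze_log.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("timestamp,dir_x,dir_y,dir_z,confidence");
    }

    // Call this from the eye-tracking plugin's gaze event with a head-local
    // gaze direction and the tracker's confidence for the sample.
    public void OnGazeReceived(Vector3 gazeDirection, float confidence)
    {
        if (writer == null) return;
        writer.WriteLine(string.Format(CultureInfo.InvariantCulture,
            "{0:F4},{1:F5},{2:F5},{3:F5},{4:F3}",
            Time.realtimeSinceStartup,
            gazeDirection.x, gazeDirection.y, gazeDirection.z,
            confidence));
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}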

Beyond these research contributions, I have worked on various other XR and wearable prototypes. Feel free to reach out if you'd like to discuss XR research, haptics, and spatial computing.