Freehand Assistant

Voice- and gaze-controlled hand interactions in XR that enable people with arm and hand disabilities to access the full range of interactivity.

blue hands with icons on the wrist

Hand tracking - an emerging input modality.

Hand or finger tracking is a technique that calculates the position of the user’s fingers in real time and renders the user’s hand as a 3D object in the virtual environment.

gif of user trying hand tracking while in a VR headset

Problem Statement

In XR products, the use of hands (controllers or hand tracking) is vital for accessing the full range of interactivity. How might we reproduce this for people with arm & hand disabilities?

Research

In the U.S. alone, 1 out of every 50 people is affected by limb paralysis (that's 5.4 million people), and another 2 million people are amputees.

The primary users fall under one or more of these 4 cases.

icon of left hand amputee person

Permanent disability

People with limb paralysis, cerebral palsy, spine injury, amputees, etc.

icon of person with hand injury

Injury/temporary disability

Injuries such as fractures, sprains, pain, or other temporary medical conditions.

icon of person carrying a box

Situational or task-based necessity

When one or both hands are occupied with other tasks.
Example - doctors during surgical procedures.

icon of person carrying a box

Enterprise companies using hand tracking tech

Employees, managers and clients who will have to use hand tracking devices as new technologies get adopted.

Interviews

I interviewed 8 people: members of the disabled community as well as developers and designers in the XR industry.

Disabled community

I chatted with people at the XR Access 2020 online conference and gathered insights from a panel discussion about mobility and motricity needs, to better understand their pain points with VR/AR technology.

“I can only use one of my hands, so it's difficult if the controller needs both hands.”

"I should be able to specify mobility needs, and carry it with me across devices and platforms.”

“Plug and play nature is what we are looking for in VR.”

Developers & Designers

I gathered feedback from industry professionals who have experience with hand tracking applications and received valuable insights from their projects.

“It is frustrating when interactions feel like they need to be as surgical as a mouse, not organic like a hand.”

"Grabbing objects in VR needs to have clear feedback to compensate for the lack of haptics.”

“Simulated hand physics must be improved to get the right feeling of grip and weight for different objects.”

Participatory Observation

Hand tracking Hackathon organized by The Glimpse Group in NYC.

In a team of 3, we created a slider interaction. This gave me "hands-on" experience with the design process and functionality of such interactions.

gif of user sliding the slider in VR

Hand tracking Hackathon hosted by Futures NI and SideQuest.

In a team of 4, we created a gesture-based touchless UI for navigating a multiple-choice quiz in a museum.

hand gesture based interaction in VR

Key Insights

Feature Categorization

To test the usability of Freehand Assistant, I chose to adapt a training scenario from the HoloLens 2 partner spotlight with PTC and Howden. This scenario was also demoed by Microsoft during the Mobile World Congress 2019 keynote.

I used card sorting to prioritize the most recurring interactions.

They are categorized into 3 key features.

check box icon

Step-by-step Instructions

hand clicking icon

Point & Click Functionality

icon of person standing inside a circle

Teleportation & Movement

Freehand Assistant is designed to be as device-agnostic as possible. After comparing UI frameworks and VR/AR devices, I found this was achievable by relying on input modalities, such as voice and gaze, that are shared across HMDs.
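To make the idea concrete, here is a minimal TypeScript sketch of what such a device-agnostic input layer could look like. The interface and class names are my own illustration (no specific SDK is assumed); each headset would supply its own adapter that exposes a gaze ray and recognized voice commands, while the interaction logic stays the same.

```typescript
// Illustrative sketch only: these interfaces are hypothetical, not an existing SDK.
// Each HMD-specific adapter (standalone VR headset, HoloLens, WebXR browser, ...)
// would implement them, so the interaction logic never touches device APIs directly.

interface GazeSource {
  // Current gaze ray in world space: an origin point and a normalized direction.
  getGazeRay(): { origin: [number, number, number]; direction: [number, number, number] };
}

interface VoiceSource {
  // Registers a handler for a recognized spoken command, e.g. "teleport" or "grab".
  onCommand(command: string, handler: () => void): void;
}

// Freehand Assistant only talks to these two interfaces, so switching headsets
// means swapping adapters, not rewriting interactions.
class FreehandAssistant {
  constructor(private gaze: GazeSource, private voice: VoiceSource) {}

  start(): void {
    this.voice.onCommand("teleport", () => {
      const ray = this.gaze.getGazeRay();
      console.log("teleport along gaze ray", ray);
    });
  }
}
```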

Voice

I asked people doing kitchen tasks to say their actions aloud, as if giving commands to their hands. This experiment probed how we verbalize motor decisions.

Often, a person would look at an object before verbalizing the action, implicitly assuming that the hand already knows what is being "looked at" when it receives the motor command. This was a crucial insight: the user always expects the hand to share the same mind as they do.

Initial sketches

Freehand with a diegetic indicator (like a wristband, shapes, or wings)

User Flow

In my prototype, the user undergoes VR training to repair a gear failure in one of the engines of a manufacturing plant.

User flow diagram for training scene

Prototyping

If possible, it is always better to test the idea in physical reality before jumping into XR.

I attached a laser onto a cap to simulate the gaze pointer and stuck a camera on my face to capture a first-person perspective. This helped me discard assumptions early.

User Testing With Wireframes

On-boarding

For the lo-fi prototype, I created wireframes in Figma to mimic a VR scenario. It started with an on-boarding flow to introduce the voice-and-gaze combo interaction style.

Teleporting

The voice command "teleport" activates a teleportation trajectory from the hand to the floor. This trajectory follows the user's gaze anywhere on the ground plane.
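As a minimal sketch, the teleport target can be found by intersecting the gaze ray with the floor. This assumes a flat ground plane at y = 0 and a gaze ray supplied by the headset; the names are illustrative, not taken from a particular engine.

```typescript
type Vec3 = { x: number; y: number; z: number };

// Intersect the gaze ray with the floor plane (y = 0) to find the teleport target.
function gazeTeleportTarget(gazeOrigin: Vec3, gazeDirection: Vec3): Vec3 | null {
  if (gazeDirection.y >= 0) return null; // looking up or parallel to the floor: no hit
  const t = -gazeOrigin.y / gazeDirection.y; // distance along the ray to reach y = 0
  return {
    x: gazeOrigin.x + t * gazeDirection.x,
    y: 0,
    z: gazeOrigin.z + t * gazeDirection.z,
  };
}

// After the "teleport" voice command, the trajectory endpoint tracks this target
// every frame until the user confirms the destination.
```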

Step-by-step Training

The user goes through a step-by-step training procedure where the task is to fix a gear failure in an engine at a manufacturing plant.
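The flow behind those instructions can be modeled as a simple ordered list of steps that advances on a spoken confirmation. The step texts below are placeholders for illustration, not the actual training script.

```typescript
// Placeholder steps standing in for the real training script.
const steps = [
  "Teleport to the faulty engine",
  "Say 'grab' to pick up the wrench",
  "Remove the damaged gear",
  "Install the replacement gear",
];

let current = 0;
console.log(`Step 1/${steps.length}: ${steps[0]}`);

// Called when the voice recognizer hears a confirmation such as "done" or "next".
function onStepConfirmed(): void {
  if (current < steps.length - 1) {
    current += 1;
    console.log(`Step ${current + 1}/${steps.length}: ${steps[current]}`);
  } else {
    console.log("Training complete");
  }
}
```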

Designing fluid interactions.

Grab objects

Say "pick up" or "grab", and objects fly to your hand (sketched in code below).

Locomotion

Move around the virtual space by aiming a path with your gaze.

Feedback icons

Receive visual feedback for every action to confirm the user's intention.
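To make the grab-and-feedback behavior concrete, here is a small runnable sketch. The scene objects, the 30 cm gaze-pick radius, and the feedback messages are all assumptions for illustration; in the prototype, the gaze hit and feedback icons would come from the XR engine.

```typescript
type Vec3 = { x: number; y: number; z: number };

interface Grabbable {
  name: string;
  position: Vec3;
}

// Toy scene: in practice these objects come from the engine's scene graph.
const scene: Grabbable[] = [
  { name: "wrench", position: { x: 1.2, y: 0.9, z: -0.5 } },
  { name: "gear", position: { x: 0.4, y: 1.0, z: -1.1 } },
];

// Stand-in for gaze picking: the object closest to where the gaze ray hits.
function gazedObject(gazeHit: Vec3): Grabbable | null {
  let best: Grabbable | null = null;
  let bestDist = 0.3; // only consider objects within 30 cm of the gaze hit
  for (const obj of scene) {
    const dx = obj.position.x - gazeHit.x;
    const dy = obj.position.y - gazeHit.y;
    const dz = obj.position.z - gazeHit.z;
    const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
    if (dist < bestDist) {
      best = obj;
      bestDist = dist;
    }
  }
  return best;
}

// Voice command "grab" / "pick up": move the gazed object to the hand, then confirm it.
function onGrabCommand(gazeHit: Vec3, handPosition: Vec3): void {
  const target = gazedObject(gazeHit);
  if (!target) {
    console.log("feedback: nothing to grab here"); // shown as a feedback icon in VR
    return;
  }
  target.position = { ...handPosition };
  console.log(`feedback: grabbed the ${target.name}`); // confirms the user's intent
}

onGrabCommand({ x: 1.15, y: 0.92, z: -0.52 }, { x: 0, y: 1.0, z: -0.3 });
```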

What's Next?

The next phase involves developing a mid-fidelity prototype and conducting further rounds of user testing. Adding features and expanding functionality beyond training scenarios is also on the radar.

More usability testing

To make the experience as robust as possible, more usability testing is needed.

Expanding functionality

Accounting for other scenarios and adding features accordingly will be the next phase.

Natural Language Processing

Machine learning and natural language processing would improve the accuracy of voice-command recognition.
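Until such a model is in place, even a simple synonym-to-intent map captures the idea; the phrases below are illustrative, and a trained classifier could later replace matchIntent without changing the rest of the system.

```typescript
// Illustrative synonym map; an ML/NLP model would eventually replace this.
const intents: Record<string, string[]> = {
  grab: ["grab", "pick up", "take", "hold"],
  teleport: ["teleport", "go there", "move there"],
  release: ["release", "drop", "let go"],
};

// Map a spoken utterance to an intent, or null if nothing matches.
function matchIntent(utterance: string): string | null {
  const text = utterance.toLowerCase().trim();
  for (const [intent, phrases] of Object.entries(intents)) {
    if (phrases.some((phrase) => text.includes(phrase))) return intent;
  }
  return null; // unrecognized: prompt the user to rephrase
}

console.log(matchIntent("please pick up the wrench")); // -> "grab"
```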

Project Learnings

1. Categorize and prioritize.
Designing for XR is multi-dimensional. To reduce complexity and avoid getting overwhelmed by all the possibilities, it is useful to categorize the user's needs and prioritize functionality.

2. Learn from reality; simulate in virtuality.
Experiences are founded on our perception. Therefore, by learning how we sense things in reality, we can better understand how to simulate them in XR mediums.

3. Test in VR as often as possible.
Behavior in VR emerges out of playtesting in VR.

Embracing diverse abilities is an opportunity to discover superpowers.

What if we could use more than two hands?

Designing for accessibility does not have to be limiting, but instead may become a gateway to unlocking capacities beyond cognitive and physical limitations.