Inner Light

An embodied VR meditation on Light that uses brain activity (EEG) in real time to visualize a virtual body (avatar). This is a nine-month project for my Master's thesis, built with Unity (C#) and Emotiv software (for EEG signal processing). I am the sole contributor, responsible for all design, programming, and research.

This hybrid art + science experiment brings together three topics: (1) embodied cognition, (2) EEG neurofeedback, and (3) meditating on Light. It is an artistic project that explores meditation and self-awareness through embodied cognition and brain waves.

Inner Light provides guided audio-visual cues based on a player's brain activity (EEG) to induce a relaxed but focused mental state. It offers a unique conceptualization of the body-mind complex, asking who we are in relation to our body and mind, and who we are beyond them. Through this experience, I invite participants to embody their own mental frequencies and, at the climax, discover the feeling of ascending beyond the body.
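As a rough illustration of that feedback loop, here is a minimal Unity (C#) sketch of how normalized EEG metrics might drive the visuals and audio. The component name, the relaxation/focus blend, and the easing values are illustrative assumptions, not the exact tuning used in the project.

```csharp
using UnityEngine;

// Illustrative sketch: drives the avatar's glow and the ambient audio
// from normalized EEG metrics (0..1). The "relaxed but focused" blend
// and the smoothing values are assumptions for demonstration.
public class InnerLightCues : MonoBehaviour
{
    [Range(0f, 1f)] public float relaxation; // fed from the EEG stream
    [Range(0f, 1f)] public float focus;      // fed from the EEG stream

    public Light avatarGlow;          // light source inside the avatar
    public AudioSource ambientDrone;  // guiding soundscape

    [SerializeField] float maxIntensity = 3f;
    [SerializeField] float smoothing = 2f; // higher = snappier response

    void Update()
    {
        // Reward the combined "relaxed but focused" state rather than
        // either metric alone.
        float target = Mathf.Clamp01(relaxation * focus) * maxIntensity;

        // Ease toward the target so the visuals breathe instead of flicker.
        avatarGlow.intensity = Mathf.Lerp(
            avatarGlow.intensity, target, smoothing * Time.deltaTime);

        // The drone swells as the player settles into the state.
        ambientDrone.volume = Mathf.Lerp(
            ambientDrone.volume, relaxation, smoothing * Time.deltaTime);
    }
}
```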

See the full documentation or the explainer video.

Onboarding for EEG Calibration in VR

I designed a UI to onboard users to the EEG calibration process while they wear the VR headset. This was essential for linking the EEG data session to the start of the meditation. The UI also gives the user direct feedback on their "focus", "stress", and "relaxation" levels.
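A minimal sketch of that in-headset feedback, assuming the Emotiv software delivers metric values already normalized to 0..1; the slider-based bars, the ready-check threshold, and the status messages are hypothetical details, not the shipped layout.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative calibration HUD: three bars mirror the live "focus",
// "stress", and "relaxation" metrics so the user can confirm the headset
// is reading them before the meditation starts. The threshold and copy
// are assumptions for demonstration.
public class CalibrationHud : MonoBehaviour
{
    public Slider focusBar;
    public Slider stressBar;
    public Slider relaxationBar;
    public Text statusLabel;

    [SerializeField] float readyThreshold = 0.3f; // min signal to proceed

    // Called whenever a new metrics sample arrives from the EEG stream
    // (values assumed normalized to 0..1 upstream).
    public void OnMetrics(float focus, float stress, float relaxation)
    {
        focusBar.value = focus;
        stressBar.value = stress;
        relaxationBar.value = relaxation;

        bool ready = focus > readyThreshold || relaxation > readyThreshold;
        statusLabel.text = ready
            ? "Signal looks good - starting meditation..."
            : "Hold still while we calibrate your headset";
    }
}
```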

Setup: Hardware + Software

User Testing

The VR experience was tested with a wide range of participants and iteratively improved through user testing and feedback. I also encountered challenges such as the variability of EEG data across users and sessions, and had to adapt the VR experience to account for that variability without losing fidelity.
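One common way to absorb that variability is per-user baseline normalization: record each metric's range during a short calibration phase, then rescale and smooth live samples against that personal range. The sketch below is a minimal, hypothetical version; the class, the smoothing factor, and the calibration scheme are illustrative assumptions rather than the project's exact technique.

```csharp
using UnityEngine;

// Illustrative per-user normalizer: learns a metric's personal range
// during calibration, then maps raw samples to a stable 0..1 signal
// with exponential smoothing.
public class MetricNormalizer
{
    float min = float.MaxValue;
    float max = float.MinValue;
    float smoothed;
    readonly float alpha; // EMA factor: lower = steadier, higher = snappier

    public MetricNormalizer(float alpha = 0.1f) => this.alpha = alpha;

    // Feed raw samples here during the calibration phase.
    public void Calibrate(float raw)
    {
        min = Mathf.Min(min, raw);
        max = Mathf.Max(max, raw);
    }

    // After calibration, rescale raw samples to the user's own range.
    public float Normalize(float raw)
    {
        float range = Mathf.Max(max - min, 1e-4f); // avoid divide-by-zero
        float scaled = Mathf.Clamp01((raw - min) / range);
        smoothed = Mathf.Lerp(smoothed, scaled, alpha); // exponential smoothing
        return smoothed;
    }
}
```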

GitHub Repo