Flow through moments together - The Menagerie logo

The Menagerie

This project was developed for the Cannes XR Challenge in June 2020 over the course of 3 weeks.

The Menagerie is a social VR app for discovering short-form content together.

TEAM:
Bracey Smith (Experience Designer/Prototyper)
Elena Piech (Business Development)
Saul Pena Gamero (Developer)
Jake Maymar (Creative Lead)
Sounak Ghosh (3D Artist)
Ru Mboya (Prototyper)
Kevin Lee (Designer)

Ideation & User Flow

The idea of content discovery as a social process was nothing new; what made this project unique was how we approached the experience. The core questions we set out to answer in the design process were:

  1. How can we seamlessly transition between experiences?
  2. How can we reduce friction in discovering new content?
  3. How can we emphasize the joy of sharing a moment of discovery with others?

We handled the seamless transition with a technique we called Quantum Portation: when all players in the group placed their hands on the orb at the center of the 5x5 play area, the current environment would fade into the next one.
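A minimal sketch of how this trigger could be wired up in Unity (PortationOrb, PlayerHand, and EnvironmentFader are illustrative names, not the shipped code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: the orb tracks which players currently have a hand on it,
// and once every player in the group is touching, it starts the fade.
public class PortationOrb : MonoBehaviour
{
    [SerializeField] private int groupSize = 4;          // assumed group size
    [SerializeField] private EnvironmentFader fader;     // hypothetical cross-fade component

    private readonly HashSet<int> touchingPlayers = new HashSet<int>();

    private void OnTriggerEnter(Collider other)
    {
        // PlayerHand is a hypothetical marker component on each tracked hand
        if (other.TryGetComponent(out PlayerHand hand))
        {
            touchingPlayers.Add(hand.PlayerId);
            if (touchingPlayers.Count >= groupSize)
                fader.FadeToNextEnvironment();           // fade out, swap environment, fade in
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.TryGetComponent(out PlayerHand hand))
            touchingPlayers.Remove(hand.PlayerId);       // simplification: one hand per player
    }
}
```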

I designed a user flow to map how we could reduce friction and emphasize content discovery.

user flow diagram of menagerie app

Prototyping Short-form Content

For The Menagerie demo, I developed an interactive piece of short-form content called Octodance.

In this piece, each player controls one of the octopus's tentacles with their hands, and the four players must coordinate all the tentacles to sink the ship.

I used a combination of Maya and Tilt Brush to create the 3D assets for the experience, and I built an IK rig for the tentacles using Unity's Animation Rigging package.
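The Animation Rigging constraints themselves (e.g., a ChainIKConstraint per tentacle) are configured in the editor; at runtime, all a script has to do is drive each constraint's IK target from the tracked hand. A rough sketch, with handAnchor and ikTarget as assumed references:

```csharp
using UnityEngine;

// Sketch: snap a tentacle's IK target to the player's tracked hand each
// frame, smoothing the motion so the tentacle stays fluid. The
// ChainIKConstraint referencing ikTarget is set up in the editor.
public class TentacleTargetFollow : MonoBehaviour
{
    [SerializeField] private Transform handAnchor;   // tracked VR hand (assumed reference)
    [SerializeField] private Transform ikTarget;     // target of the tentacle's IK constraint
    [SerializeField] private float followSpeed = 12f;

    private void Update()
    {
        ikTarget.position = Vector3.Lerp(ikTarget.position,
                                         handAnchor.position,
                                         followSpeed * Time.deltaTime);
        ikTarget.rotation = handAnchor.rotation;
    }
}
```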

3D model of aquarium tank with octopus inside and a ship floating at the top

3D model created in Tilt Brush and exported into Unity

in-game footage of players in VR controlling the arms of the Octopus

Octodance multi-user prototype using Photon plugin
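For a sense of the networking setup, here is a hedged sketch using the Photon PUN 2 callback API; the room size matches the four-player design, while the prefab name and flow are illustrative:

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Sketch: connect to Photon, join (or create) a shared room, then
// spawn a networked player rig for each participant.
public class OctodanceLobby : MonoBehaviourPunCallbacks
{
    private void Start() => PhotonNetwork.ConnectUsingSettings();

    public override void OnConnectedToMaster() => PhotonNetwork.JoinRandomRoom();

    public override void OnJoinRandomFailed(short returnCode, string message)
    {
        // No open room yet: create one sized for the four tentacle players
        PhotonNetwork.CreateRoom(null, new RoomOptions { MaxPlayers = 4 });
    }

    public override void OnJoinedRoom()
    {
        // "PlayerRig" is a placeholder prefab name under Resources/
        PhotonNetwork.Instantiate("PlayerRig", Vector3.zero, Quaternion.identity);
    }
}
```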

Interaction Design - Slider

I also helped design and prototype a Slider interaction for the Menagerie app.

What are the current problems with sliders in VR?

  1. Easy to press by accident when moving one's hands
  2. Unsteady hand movement and the lack of haptic feedback make the interaction unsatisfying and frustrating
  3. Not enough feedback when a button is pressed (or hovered over)

How can we make this more intuitive?

  1. By separating the slider interaction into three volumetric sub-interaction zones (initialize, decision, and activate), we can break down the steps of a player’s thought process when interacting with the slider.
  2. When a player's fingertip enters the initialize zone, the button grows in size and the decision zone opens up. This is similar to a hover state: it suggests a possible interaction without triggering it immediately.
  3. In the decision zone, the player can slide the button across the slider, making a decision by doing so. Here the slider button is locked to one axis (the x-axis), so unsteady hand motion does not affect the button’s sliding motion.
  4. Reaching the activate zone triggers the result/purpose of the button (a code sketch of this zone logic follows the diagrams below).

Interaction design diagram of the 3 zones of the slider

2D representation of the zones/states

Motion design for slider prototype in After Effects
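To make the zone logic concrete, here is a hypothetical C# sketch of the state machine; zone bounds, names, and the activation hook are placeholders rather than the actual prototype code:

```csharp
using UnityEngine;

// Sketch: track the fingertip through the initialize/decision/activate
// zones in the slider's local space, where x is the slide axis.
public class ZonedSlider : MonoBehaviour
{
    private enum Zone { None, Initialize, Decision }

    [SerializeField] private Transform fingertip;     // tracked index fingertip (assumed)
    [SerializeField] private Transform button;
    [SerializeField] private float activateX = 0.12f; // local x that counts as "activate"

    private Zone zone = Zone.None;

    private void Update()
    {
        Vector3 local = transform.InverseTransformPoint(fingertip.position);

        switch (zone)
        {
            case Zone.None:
                if (InZone(local)) zone = Zone.Initialize; // hover: button grows, decision zone opens
                break;

            case Zone.Initialize:
            case Zone.Decision:
                if (!InZone(local)) { zone = Zone.None; break; } // fingertip left: cancel
                zone = Zone.Decision;
                // Lock the button to the x-axis so hand jitter in y/z
                // cannot wobble the slide
                button.localPosition = new Vector3(local.x, 0f, 0f);
                if (local.x >= activateX) { Activate(); zone = Zone.None; }
                break;
        }
    }

    // Placeholder bounds for the volumetric zone
    private bool InZone(Vector3 p) => Mathf.Abs(p.y) < 0.03f && Mathf.Abs(p.z) < 0.05f;

    private void Activate() { /* trigger the button's result */ }
}
```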

User Testing

Compensating for z-depth

After prototyping it, we learned that hands tend to drift forward (along the z-axis) because there is no physical surface to stop them. This drift moves the fingertip out of the initialize zone, cancelling the action even though the user did not intend to cancel.

To fix this, we decided to allow infinite leeway in negative z-space (the area behind the button): players can push into the initialize zone as far as they want without cancelling the interaction. However, if the player retreats their hand in positive z-space (toward their own body) and out of the initialize zone, the interaction is cancelled.
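In the sketch above, this amounts to dropping the lower z bound from the zone check, for example:

```csharp
// Hedged revision of the earlier InZone check: no lower bound on z, so
// pushing behind the button (negative local z) never cancels; only
// retreating out the front face (positive local z, toward the player)
// does. Bounds remain placeholders.
private bool InZone(Vector3 p) =>
    Mathf.Abs(p.y) < 0.03f   // still inside vertically
    && p.z < 0.05f;          // cancel only past the front face; infinite leeway behind
```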

gif of user sliding the slider in VR

gif of slider implemented in a menu

A component for a larger design system

This slider interaction was applied to a hand menu that activates when one turns one's palm up toward oneself. The versatility of this interaction suggests it can serve as a component of a larger design system for hand-tracking UIs.
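For illustration, a palm-up toggle along these lines could be as simple as checking the palm normal against the direction to the player's head (transform names and the threshold are assumptions):

```csharp
using UnityEngine;

// Sketch: show the hand menu while the palm faces back toward the
// player's head; hide it otherwise.
public class PalmMenuToggle : MonoBehaviour
{
    [SerializeField] private Transform palm;        // tracked palm (assumed: up = outward normal)
    [SerializeField] private Transform head;        // HMD / main camera transform
    [SerializeField] private GameObject menu;
    [SerializeField] private float threshold = 0.6f;

    private void Update()
    {
        Vector3 toHead = (head.position - palm.position).normalized;
        bool facingUser = Vector3.Dot(palm.up, toHead) > threshold;
        menu.SetActive(facingUser);
    }
}
```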

Custom Shader

We explored Universal Render Pipeline (URP) in Unity to create the rainbow effect for the button and fingerprint. The effect was achieved using a custom shader built in Shader Graph.
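Shader Graph is node-based, so there is no shader code to quote; as a rough CPU-side stand-in for the same idea, a hue cycled over time and converted from HSV to RGB produces the rainbow (the "_BaseColor" property name is a placeholder for whatever the graph exposes):

```csharp
using UnityEngine;

// Sketch: approximate the rainbow effect by cycling the material's hue.
public class RainbowTint : MonoBehaviour
{
    [SerializeField] private Renderer target;
    [SerializeField] private float cycleSpeed = 0.25f; // hue revolutions per second

    private void Update()
    {
        float hue = Mathf.Repeat(Time.time * cycleSpeed, 1f);        // wrap hue into [0,1)
        target.material.SetColor("_BaseColor",                        // placeholder property name
                                 Color.HSVToRGB(hue, 1f, 1f));
    }
}
```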

gif of rainbow shader in Unity

image of custom rainbow shader on a ring, with fingerprint in the middle

Simulating touch through audio

The brain links temporal frequency across hearing and touch: research shows that auditory stimuli can interfere with tactile frequency perception, in a frequency-dependent way. In the future, this could be explored as a way to compensate for the lack of haptic feedback with an appropriate auditory cue.

Reference: Temporal Frequency Channels Are Linked across Audition and Touch

Content Strategy

I also participated in developing a business plan for the platform.

The Menagerie is an app for content creation and discovery. Our strategy involved several phases: first, build a community and attract content creators; then, build tools and promote discovery. Over time, content would be curated algorithmically based on each user's preferences and previous interactions in virtual worlds.

Table and diagram explaining the value to creators and explorers to show a viable content strategy

Creator, filter, explorers, experience, feedback, repeat cycle

Multi-year roadmap diagram, 2020 to 2024

Credits to Kevin Lee for designing the slides of the pitch deck.