The Goki Project
What: VR demo for SteamVR headsets demonstrating a simple gesture-based event listening system
When: September - October 2019
Who: Me – Unity3D, programming, gameplay design, graphics, sound design. Worked with two other team members who contributed art, design ideas, and testing
Why: Use arm-motion-based VR gestures to let players embody the powers and abilities of their childhood imaginations
How: Unity 2019, SteamVR Unity plugin, source control via Git
The primary design philosophy for the Goki Project was to make the player “feel” like they had powers from the Dragon Ball universe. This meant that performing an action in-game had to mirror, as closely as possible, how you might perform it in the backyard while playing pretend as a child. Embracing this design ethos required building a reliable gesture recognition system.
Implementation Highlights
Simple and pseudo-advanced VR gesture input
Simple gesture detection was built by comparing the player’s hand movement and rotation against predefined trajectories; a successful match triggers an action event.
Gestures were made more complex without requiring external assembly processing by gating recognition behind different activation methods (button presses, a starting pose, etc.), each of which puts the system into a particular “listening” state. A minimal sketch of this approach is shown below.
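The sketch below illustrates how such a listener might be structured, assuming head-relative position trajectories and a scripted activation toggle. The class and member names (GestureListener, GestureTemplate, BeginListening, and the tolerance value) are illustrative assumptions rather than the project’s actual API, and rotation matching is omitted for brevity.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of a gesture "listening" component; not the project's actual code.
public class GestureListener : MonoBehaviour
{
    [System.Serializable]
    public class GestureTemplate
    {
        public string name;
        public Vector3[] localPositions;    // predefined trajectory, sampled relative to the head
        public float positionTolerance = 0.15f;
    }

    public Transform head;                  // HMD transform
    public Transform hand;                  // tracked controller transform
    public List<GestureTemplate> templates = new List<GestureTemplate>();
    public event System.Action<string> OnGestureRecognized;

    private readonly List<Vector3> samples = new List<Vector3>();
    private bool listening;

    // An activation method (button press, starting pose, ...) calls this to
    // enter the "listening" state; only then are hand samples recorded.
    public void BeginListening()
    {
        samples.Clear();
        listening = true;
    }

    // Called when the activation ends; compares the recorded trajectory
    // against each template and raises an event on the first match.
    public void EndListening()
    {
        listening = false;
        foreach (var template in templates)
        {
            if (Matches(template))
            {
                OnGestureRecognized?.Invoke(template.name);
                break;
            }
        }
    }

    private void Update()
    {
        if (!listening) return;
        // Record the hand position in head-local space so the gesture is
        // independent of where the player stands or faces.
        samples.Add(head.InverseTransformPoint(hand.position));
    }

    private bool Matches(GestureTemplate template)
    {
        if (samples.Count < template.localPositions.Length) return false;

        // Resample the recorded trajectory to the template's length and
        // check every point against the position tolerance.
        for (int i = 0; i < template.localPositions.Length; i++)
        {
            float t = template.localPositions.Length > 1
                ? (float)i / (template.localPositions.Length - 1)
                : 0f;
            int sampleIndex = Mathf.RoundToInt(t * (samples.Count - 1));
            if (Vector3.Distance(samples[sampleIndex], template.localPositions[i]) > template.positionTolerance)
                return false;
        }
        return true;
    }
}
```

In this framing, each activation method simply decides when BeginListening and EndListening run, so new “listening” states can be added without touching the matching logic itself.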
Flight supporting locomotion
The SteamVR input system and the gesture system were combined to enable smooth transitions between walking and flying.
A direction-controllable flight system was built from the relative transforms of the player’s hands and head, as sketched below.
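A minimal sketch of this idea follows, assuming the flight direction is the average of the head-to-hand vectors and that arm extension scales speed. The class, field names, and speed model are illustrative assumptions, not the project’s actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch of hand-relative flight; moves the player rig's root directly.
public class FlightLocomotion : MonoBehaviour
{
    public Transform head;        // HMD transform
    public Transform leftHand;    // tracked controller transforms
    public Transform rightHand;
    public Transform playerRig;   // root object moved while flying
    public float maxSpeed = 8f;

    // Toggled externally, e.g. when the takeoff/landing gesture is recognized,
    // to transition between walking and flying.
    public bool IsFlying { get; set; }

    private void Update()
    {
        if (!IsFlying) return;

        // Average the vectors from the head to each hand to get a steering
        // direction, so the player flies where they "point" with both arms.
        Vector3 toLeft = leftHand.position - head.position;
        Vector3 toRight = rightHand.position - head.position;
        Vector3 steer = (toLeft + toRight) * 0.5f;

        // Arm extension (relative to an assumed 0.6 m reach) scales speed.
        float extension = Mathf.Clamp01(steer.magnitude / 0.6f);
        Vector3 velocity = steer.normalized * (extension * maxSpeed);

        playerRig.position += velocity * Time.deltaTime;
    }
}
```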
Challenges
C# data processing speeds
While the demo was playable and delivered the intended experience, recording and processing gesture patterns entirely in C# within Unity was computationally intensive. Offloading this processing to an external C++ library would be a valuable addition to the approach; a rough sketch of what that interop could look like is shown below.
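As a rough illustration only: one way to hand the trajectory matching off to native code would be a P/Invoke binding to a hypothetical C++ plugin. The plugin name (gesturelib), entry point, and signature below are assumptions, not an existing library.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Illustrative binding to a hypothetical native gesture-matching plugin.
public static class NativeGestureMatcher
{
    // Assumed entry point: returns a match score for a recorded trajectory
    // against a template, both passed as flattened xyz float arrays.
    [DllImport("gesturelib")]
    private static extern float MatchTrajectory(float[] samples, int sampleCount,
                                                float[] template, int templateCount);

    public static float Match(Vector3[] recorded, Vector3[] template)
    {
        return MatchTrajectory(Flatten(recorded), recorded.Length,
                               Flatten(template), template.Length);
    }

    // Flatten Vector3 points into an interleaved x, y, z float array.
    private static float[] Flatten(Vector3[] points)
    {
        var data = new float[points.Length * 3];
        for (int i = 0; i < points.Length; i++)
        {
            data[i * 3]     = points[i].x;
            data[i * 3 + 1] = points[i].y;
            data[i * 3 + 2] = points[i].z;
        }
        return data;
    }
}
```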