VR Game Prototype
First round of design and development for an Oculus Quest game
I was ready to level up my XR design and dev skills, and I thought that building a VR game would be a great way to learn a lot of new tricks. And, wow—it was not exactly an easy task. But I loved the challenge and was thrilled to finally have a large chunk of my time devoted to designing a game, writing my own scripts in Unity, and creating 3D models in Blender. And I ended up with a pretty solid gameplay prototype to build on in the next round of development of this game concept.
The Project
The Vehicle
The Tools
Big-Little Details
The Outcome

About The Project

I explored an array of VR concepts and design theories via this project, but the decision to tailor the game for the Oculus Quest platform was probably the most consequential. The headset has so many good attributes—untethered, inside-out tracking, computer vision, and more—that I felt it was important to get in-depth experience with developing for this very popular device.

But there is a drawback: it's still a mobile device and has performance optimization needs that will crush most of your game art and VFX ambitions. Even simple transparent materials can drag down your frame rate, and you can probably forget about using any post processing.

I still have nightmares about whether or not my next build will run at 72FPS, but I guess I'm thankful that I at least know how to isolate performance issues and find solutions.

As for the game concept, it's a sci-fi archaeology puzzler that places you in a hovering, moon-surface-roving vehicle for exploring a system of planetary satellites (AKA moons).
For player experience design, these are the goals I aimed for:

Ridiculously Simple Interaction Design
I want players to use controllers as natural extensions of their hands rather than as devices with specific button combos to press. So no "press A to get this UI, squeeze the grip and not the trigger to select this object, use the joystick to teleport," etc. My aim is for players to use a simple "squeeze your hand to access this object" or "touch that plane to activate" for most of the experience.
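One practical detail behind "squeeze your hand to access this object" is debouncing the analog grip value. Here's a minimal sketch of the idea in Python (the project's actual scripts are Unity C#, and the class name and thresholds below are illustrative, not from the project): a press threshold paired with a lower release threshold keeps the grab from flickering when the player's squeeze hovers near a single cutoff.

```python
class SqueezeGrab:
    """Tracks an analog grip value (0..1) and reports grab state.

    Hysteresis (separate press/release thresholds) prevents the grab
    from flickering when the squeeze sits right at one threshold.
    """

    def __init__(self, press_at=0.7, release_at=0.4):
        assert release_at < press_at
        self.press_at = press_at
        self.release_at = release_at
        self.grabbing = False

    def update(self, grip_value):
        """Call once per frame with the controller's grip axis value."""
        if not self.grabbing and grip_value >= self.press_at:
            self.grabbing = True
        elif self.grabbing and grip_value <= self.release_at:
            self.grabbing = False
        return self.grabbing
```

The gap between the two thresholds is the whole trick: a squeeze at 0.5 keeps whatever state it already had, so a slightly trembling hand doesn't drop and re-grab the object every frame.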

Comfortably Explore A Large World
Placing the player in the vehicle serves the narrative and also helps deliver key elements of the gameplay experience. You can fly the vehicle to distant locations while still having the freedom to physically walk around the small cabin if you want to move around a bit. There is no awkward teleport system to learn (a common point where VR users get pulled out of their immersion), and I even added basic ergonomic designs for sitting and standing during gameplay.

Chill Vibes
I'm still not settled on the specifics of gameplay constraints, but I'm leaning towards a game where you can explore at your own pace and get rewarded as you discover the secrets of each world. No lives to lose, scores to achieve, or clocks to beat: just the calm satisfaction of making interstellar archaeological discoveries.

The Vehicle: A Hovering, Roving, Energy Reaper

One of the gameplay systems I wanted to design and develop first was the vehicle, a spacecraft that hovers and ranges over terrain. That mechanic is one of the most important elements of the experience, and it needs to be as fun as possible without inducing motion sickness (which is not easy when the main element of your VR game is a moving vehicle).
So I designed and coded three essential components for vehicle locomotion:

Hand-Directed Propulsion
The player picks up a "navigation key," then points in a direction, and squeezes their hand to propel the vehicle to a location (very advanced flight technology).

The vehicle never touches the ground; it softly hovers about one meter above the terrain, which makes the gameplay experience feel as smooth as butter.
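The "softly hovers about one meter above terrain" behavior can be modeled as a critically damped spring on the vehicle's height. Here's a minimal Python sketch of that math (the project itself uses Unity C#, where terrain height would come from a downward raycast; the function name and constants here are illustrative, not from the project):

```python
import math

def hover_step(height, velocity, ground_height, dt,
               target_clearance=1.0, stiffness=20.0):
    """One physics step of a critically damped hover spring.

    height/velocity: the vehicle's vertical state.
    ground_height: terrain height sampled below the hull
    (a downward raycast in-engine).

    Critical damping (damping = 2 * sqrt(stiffness)) settles to the
    target clearance without bouncing, which is what makes the
    hover feel soft rather than springy.
    """
    damping = 2.0 * math.sqrt(stiffness)
    error = (ground_height + target_clearance) - height
    accel = stiffness * error - damping * velocity
    # Semi-implicit Euler: update velocity first, then position.
    velocity += accel * dt
    height += velocity * dt
    return height, velocity
```

Drop the vehicle from any height and it glides down to one meter above the ground and stays there; raise the terrain under it and it eases back up with no overshoot.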

Cabin Rotation
This was probably the most challenging element of vehicle locomotion to code: enabling the player to use the navigation key to smoothly rotate the vehicle around its vertical axis—without inducing motion sickness. There's a surprising amount of linear algebra involved in that design.
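To give a flavor of that math, here's one way comfort-conscious rotation can be sketched. This is illustrative Python, not the project's Unity C# code, and the names and constants are assumptions: rotation happens around the vertical axis only, uses framerate-independent exponential smoothing, and clamps angular speed (slow, capped yaw is a common VR comfort measure).

```python
import math

def step_yaw(current_yaw, target_yaw, dt,
             smoothing=5.0, max_speed=math.radians(45)):
    """Ease the cabin's yaw toward target_yaw, vertical axis only.

    Angles are in radians. The atan2 wrap keeps the error in
    (-pi, pi] so the cabin always turns the short way around.
    """
    error = math.atan2(math.sin(target_yaw - current_yaw),
                       math.cos(target_yaw - current_yaw))
    # Exponential approach: close a framerate-independent fraction
    # of the remaining error each frame.
    step = error * (1.0 - math.exp(-smoothing * dt))
    # Clamp to a comfortable maximum angular speed.
    limit = max_speed * dt
    step = max(-limit, min(limit, step))
    return current_yaw + step
```

The clamp is the motion-sickness guard: no matter how far the navigation key is twisted, the world never spins past the capped rate, and the exponential tail means the rotation eases to a stop instead of halting abruptly.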

The Tools

The vehicle is equipped with a few tools that are specifically designed for handling the mineral compounds and computing systems found in this game's world. The player operates these tools via a holographic UI.

The tools in this prototype are a quantum decoder that can interact with alien computer systems; a magnetic crane that is used to handle highly reactive fuel cells; and a seismic conductor that basically explodes structures made of a specific material.
Players use the quantum decoder to interact with the world's mechanical systems
The magnetic crane lifts and carries objects that are crafted from a specific material
The seismic conductor fulfills the desire to blow things up

Some Of The Big-Little UI Details

The game's holographic augmented reality UI doesn't necessarily require using the laws of physics for controls. But some use of physics does make the UI feel more natural and responsive to your actions, so I built notched levers/switches, sliders, and buttons that feel like they are actually sliding and snapping into place.
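The "snapping into place" feel can be sketched as a one-dimensional spring that pulls a released lever toward its nearest notch. This is an illustrative Python sketch of the math, not the project's Unity C# code, and the names and constants are assumptions:

```python
def settle_lever(value, velocity, notches, dt,
                 pull=30.0, damping=8.0):
    """One step of a 1-D lever settling into its nearest notch.

    value: lever position in [0, 1]; notches: detent positions.
    Once the hand lets go, a spring pulls toward the nearest notch
    while damping bleeds off velocity, so the lever 'clicks' into a
    detent instead of stopping wherever the hand released it.
    """
    nearest = min(notches, key=lambda n: abs(n - value))
    accel = pull * (nearest - value) - damping * velocity
    velocity += accel * dt
    value += velocity * dt
    return value, velocity
```

With the damping set slightly under critical, the lever overshoots the notch by a hair and settles back, which reads as a satisfying mechanical snap rather than a dead stop.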

And I also coded some transitions to enhance the experience of these diegetic UIs—fill animations, eased color transitions, etc.
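The eased transitions boil down to remapping a linear 0-to-1 progress value through an easing curve before blending. A minimal Python sketch of the classic smoothstep version (illustrative only; the project's transitions are coded in Unity C#):

```python
def smoothstep(t):
    """Cubic ease: zero slope at both ends, so the transition
    starts and finishes softly instead of snapping."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def ease_color(start, end, t):
    """Blend two RGB colors (0..1 floats per channel) with a
    smoothstep ease applied to the progress value t."""
    s = smoothstep(t)
    return tuple(a + (b - a) * s for a, b in zip(start, end))
```

The same remapping works for fill animations, opacity fades, and scale pops: compute linear progress from elapsed time, push it through the curve, then interpolate.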

The Outcome: A Thorough Education In Many Aspects Of Developing A Quest App With Unity

There is no easy way to neatly summarize everything I learned and the skills I expanded over the course of this project. This project review is only the very tip of a large iceberg of iterations and options for designing and building this game concept—along with an unhealthy number of YouTube videos watched, questions asked of my XR and Unity developer mentor (shout out to Ozzie and Circuit Stream!), and Oculus and Unity blog posts read. Anyone who has spent any amount of time learning this stuff knows the effort involved.

It was definitely a critical point along my path as an XR specialist where I became very comfortable using the technical tools and modes of thinking (creative, design, and development methods) needed for building XR apps.

I'm taking a break from this project while I work on a few small AR experiments and demos for PC VR (where I can experiment with some much fancier art and VFX).
Here's a list of a few topics I remember covering over the course of this project:

Mobile VR Development (and tons of info about performance optimization)
Creating VFX with Unity's Shader Graph
Creating VFX with Unity's Particle Systems
Using Unity's UI Components in VR
Nondestructive 3D Modeling with Blender
Scripting My Animations Where Possible Instead Of Using The Timeline
Quite A Bit About Unity's Current Physics System
Unity's Universal Render Pipeline
Using Shader Graph To Make Very Nice But Really Complicated Clouds