VR Design And Development
Experimenting with Oculus Quest hand tracking
Oculus' new hand tracking system has the alluring benefit of letting you put down the controllers and use your bare hands to navigate VR experiences. While this very natural human-computer interface does have some challenges, it's a lot of fun designing interactions for the new feature.
Grasp Interaction
Diegetic Controls
Diegetic UIs
VFX Graph

Adding A Grasp Interaction

The trackable hand assets in the Unity SDK have an option for adding physics colliders, which means you can have some fun nudging objects around a scene in VR. But after flexing my virtual fingers and knocking objects around, I had an urge to pick things up and toss them. So I decided to add grab/grasp functionality to the SDK's hands.

For this small test, I put together a simple system of fingertip and palm sensors, which makes it possible to implement the same kind of grabbing design patterns used with hardware controllers (a rough sketch of the idea follows the clips below).
The out-of-the-box hands with physics but no grabbing function (can't pick things up)
Hands with a very basic grabbing feature (now I can pick things up)
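
To give a feel for the sensor approach, here's a stripped-down sketch rather than the exact code from my project: each fingertip and the palm gets a small trigger collider with a GraspSensor component, the hand root gets a kinematic Rigidbody, and graspable props need Rigidbodies of their own. The component names and the two-fingertip threshold are just placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch only: tracks which Rigidbodies are currently touching a fingertip or palm sensor.
public class GraspSensor : MonoBehaviour
{
    public readonly HashSet<Rigidbody> Touching = new HashSet<Rigidbody>();

    void OnTriggerEnter(Collider other)
    {
        if (other.attachedRigidbody != null) Touching.Add(other.attachedRigidbody);
    }

    void OnTriggerExit(Collider other)
    {
        if (other.attachedRigidbody != null) Touching.Remove(other.attachedRigidbody);
    }
}

// Lives on the hand root: grabs a prop when the palm and enough fingertips
// agree they are touching the same Rigidbody, and lets go when they don't.
public class HandGrasp : MonoBehaviour
{
    public GraspSensor palm;            // sensor on the palm
    public GraspSensor[] fingertips;    // sensors on each fingertip
    public int requiredFingertips = 2;  // placeholder threshold

    Rigidbody held;
    FixedJoint joint;

    void FixedUpdate()
    {
        if (held == null) TryGrasp();
        else if (!StillGrasping(held)) Release();
    }

    void TryGrasp()
    {
        foreach (Rigidbody rb in palm.Touching)
        {
            if (CountFingertips(rb) < requiredFingertips) continue;

            held = rb;
            // A FixedJoint keeps the prop physical while it's held; parenting
            // the prop to the hand transform is a simpler, non-physical option.
            joint = gameObject.AddComponent<FixedJoint>();
            joint.connectedBody = rb;
            return;
        }
    }

    bool StillGrasping(Rigidbody rb) =>
        palm.Touching.Contains(rb) && CountFingertips(rb) >= requiredFingertips;

    int CountFingertips(Rigidbody rb)
    {
        int count = 0;
        foreach (GraspSensor tip in fingertips)
            if (tip.Touching.Contains(rb)) count++;
        return count;
    }

    void Release()
    {
        if (joint != null) Destroy(joint);
        joint = null;
        held = null;
    }
}
```

The nice part of requiring the palm plus a couple of fingertips, rather than a single collider, is that resting a hand against an object doesn't accidentally grab it; you actually have to wrap your fingers around it.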

Using Diegetic Elements To Enable Player Controls

Hardware controllers have a lot of good features that aren't possible with bare hands in VR. Haptics is the most obvious and the hardest to replicate. Physical buttons for player and game controls are another useful hardware feature that goes missing, but they're much easier to replace with virtual controls.

For a VR app, your controller hardware is usually how you move your virtual body around large scenes: locomotion controlled via joysticks, touch pads, or buttons. So I put together a "wrist band" method for digitally recreating a hardware button for locomotion (which, for this project, is via jetpack); a sketch of the idea follows below.
Another option for replacing hardware controls is gestures. I started this project thinking I'd create a gesture system, but I decided to narrow the scope to hand tracking interactions that replicate how we typically interact with machines, which is most frequently via touch.
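
To sketch the wrist band idea (simplified, rather than my actual implementation): a small trigger collider parented to one hand's wrist acts as the button, and tapping it with the other hand's fingertip toggles jetpack thrust on the player rig. The tag name, thrust value, and toggle behaviour are placeholders.

```csharp
using UnityEngine;

// Sketch only: a virtual wrist band button that drives jetpack-style locomotion.
// Assumes this sits on a small trigger collider parented to the wrist, and that
// the player rig has a Rigidbody to push around.
public class WristJetpackButton : MonoBehaviour
{
    public Rigidbody playerRig;               // the rig that gets pushed around
    public float thrust = 12f;                // placeholder thrust value
    public string fingertipTag = "Fingertip"; // tag on the opposite hand's index tip

    bool thrusting;

    void OnTriggerEnter(Collider other)
    {
        // Tapping the wrist band with the other hand's fingertip toggles the jetpack.
        if (other.CompareTag(fingertipTag))
            thrusting = !thrusting;
    }

    void FixedUpdate()
    {
        if (thrusting)
            playerRig.AddForce(Vector3.up * thrust, ForceMode.Acceleration);
    }
}
```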

Diegetic UIs

The UIs I created for this demo aren't specific to hand tracking: I just wanted to have some fun using a few interaction tricks I learned this year. I built a notched slider for swiping through a hologram UI, a physics button built around a configurable joint, and a pseudo-biometric scanner; a sketch of the joint-based button follows the clips below.
Notched, swipeable hologram slider
Pushing buttons
A pseudo-biometric scanner to lift the Teleport Pod
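
Of the three, the configurable joint button is the easiest to show in code. Here's a rough, simplified sketch rather than the demo's actual script: the button cap is a Rigidbody child of the button base, a ConfigurableJoint locks every axis except local Y and springs the cap back up, and a press event fires once the cap has travelled far enough. The press depth, spring values, and event wiring are placeholders.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch only: a press-able physics button built around a ConfigurableJoint.
// Assumes the cap is a child of the button base and that gravity on the cap
// is disabled (or weak enough for the spring drive to hold it up).
[RequireComponent(typeof(Rigidbody), typeof(ConfigurableJoint))]
public class PhysicsButton : MonoBehaviour
{
    public float pressDepth = 0.01f;   // metres of travel that count as a press (placeholder)
    public UnityEvent onPressed;       // wired to the teleport pod, lights, etc.

    Vector3 restPosition;
    bool pressed;

    void Start()
    {
        restPosition = transform.localPosition;

        // Lock every axis except local Y, limit the Y travel, and add a spring
        // drive so the cap pops back up after a push. The joint's connectedBody
        // can point at a kinematic Rigidbody on the button base.
        var joint = GetComponent<ConfigurableJoint>();
        joint.xMotion = ConfigurableJointMotion.Locked;
        joint.zMotion = ConfigurableJointMotion.Locked;
        joint.yMotion = ConfigurableJointMotion.Limited;
        joint.angularXMotion = ConfigurableJointMotion.Locked;
        joint.angularYMotion = ConfigurableJointMotion.Locked;
        joint.angularZMotion = ConfigurableJointMotion.Locked;
        joint.linearLimit = new SoftJointLimit { limit = pressDepth };
        joint.yDrive = new JointDrive { positionSpring = 600f, positionDamper = 20f, maximumForce = 100f };
        joint.targetPosition = Vector3.zero; // drive back toward the rest pose
    }

    void FixedUpdate()
    {
        float travel = restPosition.y - transform.localPosition.y;

        if (!pressed && travel >= pressDepth * 0.9f)
        {
            pressed = true;
            onPressed.Invoke();
        }
        else if (pressed && travel <= pressDepth * 0.5f)
        {
            pressed = false; // re-arm once the spring pushes the cap back up
        }
    }
}
```

Because the cap is just physics, the same fingertip colliders that nudge props around the scene can press it, with no extra hand tracking logic required.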

Tinkering With Unity's VFX Graph

I absolutely love particle systems, and I've been wanting to work with VFX Graph for over a year. So I used this project as an excuse to watch a bunch of tutorial videos, learn my way around the system, and implement a few VFX Graph assets in a project.

As usual, I took the more difficult route: VFX Graph isn't fully functional for a Universal Render Pipeline project that doesn't use the Vulkan graphics API. So I spent a few hours wondering why my VFX assets weren't rendering correctly in the headset, but I eventually found a few fixes.
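
If you hit the same problem, the first thing to check is that Vulkan is the active graphics API for the Android (Quest) build target. You can flip it by hand under Project Settings > Player > Other Settings > Graphics APIs, or with a small Editor utility along these lines (a rough sketch; the menu path is just a placeholder):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.Rendering;

// Sketch only: forces Vulkan as the graphics API for Android builds.
public static class VulkanSetup
{
    [MenuItem("Tools/Use Vulkan For Android")]
    public static void UseVulkan()
    {
        // Stop Unity from auto-picking the API, then make Vulkan the only entry.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new[] { GraphicsDeviceType.Vulkan });
    }
}
#endif
```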

I didn't go too crazy with the VFX (things need to stay reasonable for mobile VR), but I was happy with the results. And I do think VFX Graph, since it simulates particles on the GPU, is much more performant than the built-in particle system when you need hundreds of particles for mobile VR.
My favorite: the VFX sequence after pressing the teleport button
Light rays emitted through the hologram UI
A very simple propulsion VFX emitted from the wrist band