My first magical moment in VR came when I put on the Vive and jumped into Tilt Brush. The ability to draw, erase, and manipulate shapes in 3D was sensational, but it was the responsiveness, dexterity, and control I felt with the six degrees of freedom (6DoF) controllers that made the experience magical. There’s something intuitive and convincing about seeing the precise movements of your hands in a virtual environment.
It’s challenging to replicate that level of control in augmented reality (AR), though. AR lets us interact with virtual elements without leaving our physical environment, and we rarely want to hold controllers while completing real-world tasks.
AR manufacturers have introduced a variety of solutions for navigating their interfaces, but those solutions lack the intuitiveness and satisfying precision that VR experiences offer.
The Microsoft HoloLens allows you to gaze at virtual objects by attaching a reticle to the center of your vision. You then use a hand gesture (air tap) or a voice command to make the selection. Although these mechanics work well once you’re used to them, there is a learning curve.
Some devices rely on gaze-based activation. If the user hovers their reticle over a selectable element for a moment, the command triggers. There isn’t much of a learning curve here, but the simplicity of the mechanic comes with a lack of flexibility and control.
Other devices, such as the Magic Leap, rely on a single-handed controller. Controllers are very responsive, and most people who have played video games or operated a TV remote find the interaction familiar. There is also the extra benefit of haptic feedback: controllers can vibrate in various ways to simulate an object’s physical presence.
The drawback to the controller is that it occupies your hand and hinders your ability to use it for physical tasks while operating the AR device. It’s acceptable for entertainment, but not ideal for productivity. We need it to be smaller and less obtrusive...
This all brings me to a project I’m very excited about!
We recently consulted for a company creating a 6DoF AR controller with haptic feedback… in a ring. Yes, all that punch in a minimal device that wraps inconspicuously around your finger.
You wear the ring on your pointer finger, and the ring has a surface you can tap and scroll on using your thumb.
The ring provides the haptics and command input capability, while a wristband worn on the same wrist allows for the 6DoF tracking. The device is still in its prototype phase, but it will only get smaller, and it presents an opportunity for new and interesting ways to interact more naturally with virtual elements.
So far, we’ve been experimenting with different interactions for various menus. One of them is a radial menu that hugs the user’s pointer finger. The user can surface this menu by sliding their thumb upward or downward on the ring. Once surfaced, the menu options act as a carousel, and the user can select an option by tapping their thumb on the ring.
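The swipe-to-rotate, tap-to-select behavior above can be sketched as a small state machine. This is a minimal illustration, not the device’s actual SDK; the event names and menu API are assumptions.

```python
# Hypothetical sketch: mapping ring thumb gestures to the radial menu.
# The event names (on_swipe, on_tap) and menu API are assumptions.

class RadialMenu:
    def __init__(self, options):
        self.options = options
        self.visible = False
        self.index = 0  # currently highlighted option

    def on_swipe(self, direction):
        """Thumb slide on the ring: surface the menu, or rotate the carousel."""
        if not self.visible:
            self.visible = True  # the first swipe surfaces the menu
            return
        step = 1 if direction == "up" else -1
        self.index = (self.index + step) % len(self.options)  # carousel wraps around

    def on_tap(self):
        """Thumb tap selects the highlighted option and dismisses the menu."""
        if not self.visible:
            return None
        self.visible = False
        return self.options[self.index]

menu = RadialMenu(["move", "scale", "paint"])
menu.on_swipe("up")   # surfaces the menu, highlighting "move"
menu.on_swipe("up")   # rotates the carousel to "scale"
print(menu.on_tap())  # -> "scale"
```

The carousel wrap-around (the modulo step) is what makes a short swipe feel continuous rather than hitting an end stop.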
When moving objects at a distance, users can target objects by using their fingers like a laser pointer. They can then tap and hold on their ring to grab an object, move it, then place it by releasing their thumb from the ring.
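The grab-move-place flow can be expressed as three handlers tied to the thumb press, the tracked hand pose, and the release. Again, this is a hedged sketch; the class and handler names are my own, and the real device would supply the ray-targeted object and the 6DoF pose from its tracking system.

```python
# Hypothetical sketch of the tap-and-hold grab interaction.
# The handler names and pose representation are illustrative assumptions.

class GrabInteraction:
    def __init__(self):
        self.held = None  # the object currently being carried, if any

    def on_press(self, targeted_object):
        """Thumb pressed on the ring: grab whatever the finger-ray targets."""
        self.held = targeted_object

    def on_move(self, hand_pose):
        """While held, the object follows the wristband's 6DoF pose."""
        if self.held is not None:
            self.held["position"] = hand_pose

    def on_release(self):
        """Thumb lifted off the ring: place the object where it sits."""
        placed, self.held = self.held, None
        return placed

grab = GrabInteraction()
cube = {"position": (0, 0, 0)}
grab.on_press(cube)        # ray-targeted object comes from the tracking system
grab.on_move((1, 2, 3))    # cube follows the hand
placed = grab.on_release() # cube is placed at (1, 2, 3)
```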
The ring also boasts solid haptic capability and can simulate a variety of sensations that give virtual objects the presence they deserve. We wanted to highlight this feature by surfacing a menu that anchors itself in space and allows the user to poke at each color to select it.
When your finger enters a color sphere, it visually animates like a bubbling orb while the haptic ring simulates a bubbling sensation.
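The poke-to-select behavior boils down to a distance check between the fingertip and each color sphere, with the haptic pattern firing on entry. A minimal sketch, assuming a hypothetical haptics object and pattern name:

```python
# Hypothetical sketch: pairing the color-sphere hover state with haptics.
# The haptics API and the "bubble" pattern name are assumptions.

def finger_inside(finger_pos, sphere_center, radius=0.03):
    """True when the fingertip is inside a color sphere (positions in meters)."""
    dist = sum((f - c) ** 2 for f, c in zip(finger_pos, sphere_center)) ** 0.5
    return dist < radius

def update_color_picker(finger_pos, spheres, haptics):
    """Buzz whichever sphere the finger is poking; return its color, if any."""
    for color, center in spheres.items():
        if finger_inside(finger_pos, center):
            haptics.play("bubble")  # assumed name for the bubbling sensation
            return color
    haptics.stop()  # no sphere is being poked
    return None
```

Running this check every tracking frame keeps the visual bubbling and the haptic bubbling in sync, which is what sells the illusion that the sphere is physically there.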
Here is my quick comparison of AR device controller usability. Note that the haptic ring is just a controller, while the other devices in the comparison are full AR headsets; I’m only comparing the UI controller types.
AR Device Controller Usability
As the comparison shows, I see large potential for this ring to set an industry standard for AR controllers.
More on this to come.