
Getting Our Hands Dirty with VR Hand Tracking


Engrossed in virtual reality, you’re surrounded by fantastic digital objects, each begging you to reach out and touch them. But until recently, most interaction in mainstream VR headsets was limited to a controller, and for some experiences the controller presents a disconnect between what people feel in their hands and what they see on screen.

Last month, Oculus released its Hand Tracking SDK for the Oculus Quest, allowing people to use their hands to navigate menus and applications that support the new SDK. While the update isn’t meant to replace controllers outright, it enhances users’ sense of presence within the virtual space by further blurring the barriers between real and virtual, presenting new creative opportunities for brands that are eager to offer assistive content in the emergent medium. “Tangibility in digital has always been equated to a click of a mouse or key, but now it’s becoming even more of a physical thing, more like a real experience,” says Geert Eichhorn, Innovation Director at MediaMonks.

This illusion of reality is intriguing for Seth van het Kaar, Unity Monk at MediaMonks. “One thing VR has shown through experience and research is that our eyes override our other senses,” he says. “So, if I appear to be putting my hand in a bucket of cold water in VR, I’ll get the placebo effect of it feeling cold. Through creativity, you can use that to your advantage.”


Exploring the creative opportunities presented by the SDK, van het Kaar served as developer on a team of Monks experimenting with hand tracking to develop a working prototype that could take best advantage of the new interface. Here’s what the team learned in the process.

Find Opportunities to Get Hands-On

“Similar to how the development of voice as an interface has prompted brands to emulate human conversation as naturally as possible, we need to make these experiences feel as intuitive as possible, as you’re using your real hands,” says Eichhorn. As part of MediaMonks Labs, our research and innovation team, he’s focused not on using the latest tech for the sake of it, but rather finding the real-world application and value that it has for end-users.

Trying to identify what type of experience would best benefit from this new input, the team wondered: what activities are highly dexterous and require careful use of one’s hands? Shaving made sense: “It’s something that’s difficult for young adults and teens who are just learning to use these devices,” says Eichhorn. “And a lot of people still get things wrong, like going against the grain.” It’s also an intriguing use case in that shaving requires an element of precision, putting the usability of hand tracking to the test.


Inspired by clay, the Monk head grows noodles of hair that you can shave and trim.

By practicing grooming in VR using one’s own hands, users would be able to try out different tools and techniques without worrying about messing up their own hair. So, the team took our bald monk mascot and blessed him with a head of hair, inviting Oculus Quest users to give him a shave and a trim in an experience inspired by the Play-Doh “Crazy Cuts” line of toys.

Start with Something Familiar

Interacting with one’s hands is incredibly intuitive; it’s one of the earliest ways that we engage with the world as infants. But that doesn’t mean any hand-tracking experience is inherently easier to use or design; experimenting with any new mode of interaction requires one to break free of any preconceived notions about design. In the case of hand tracking, how does one organize a series of options within an experience without the use of physical buttons (and in this case, no haptic feedback)?

To rise above the challenge, the team used common hand gestures as a starting point, such as those used in rock/paper/scissors, to serve as an intuitive metaphor for interaction. “The Oculus can track the difference between fingertips, so if I mimic scissors with them, that’s a funny interaction,” says van het Kaar. “In the app, you can select the scissors and now you’re like Edward Scissorhands,” the fictional film character whose scissor hands bring him wild success as a hairstylist.
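A gesture trigger like the scissors boils down to checking which fingers the tracking system reports as extended. Here’s a minimal sketch of that idea; the finger names, input format and classification rules are our own illustrative assumptions, not the actual Oculus SDK API:

```python
# Hypothetical sketch: mapping per-finger "extended" flags, as a
# hand-tracking system might report them, to simple gesture labels.

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def classify_gesture(extended: dict) -> str:
    """Return a gesture label for a set of extended-finger flags."""
    up = {f for f in FINGERS if extended.get(f, False)}
    if not up:
        return "rock"          # fist: no fingers extended
    if up == {"index", "middle"}:
        return "scissors"      # the Edward Scissorhands trigger
    if up == set(FINGERS):
        return "paper"         # open hand
    return "unknown"

print(classify_gesture({"index": True, "middle": True}))  # prints "scissors"
```

In a real application the extended-finger flags would come from the SDK’s hand skeleton each frame, and the classifier would likely need debouncing so a gesture only fires after being held for a few frames.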


Move Beyond Limitations and Creative Constraint

In its experiments with the SDK, the team settled on a couple of learnings that could apply to subsequent hand-activated Oculus Quest experiences. First, there’s the challenge felt in any VR environment: locomotion, or the relationship and (de)synchronization between one’s bodily movements and those of one’s virtual avatar.

Without haptic feedback, what should happen when the user’s hand comes in contact with a virtual object: should the hand pass through it, or should the object block the movement much like it would in reality? While the latter option might make sense on paper, users could still move their physical hand while the virtual one stays stationary, which could result in confusion. The team addressed the challenge by letting users push virtual objects freely, such as the monk model that they shave, with the objects snapping back into place once released (a fun interaction in its own right).


The way that Hand Tracking SDK detects hands also presented a challenge: it seeks out the shape of a hand against a background, so it loses tracking once two of them overlap. “You can’t place a menu on the palm of your hand and tap an option on it, or interact with a virtual object on your wrist, for example,” says van het Kaar. To work around this challenge, a menu floats beside the user’s hand. While this doesn’t allow for haptic feedback by selecting options against one’s own body, this setup mitigates the risk of losing the tracking by having hands overlap.
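The floating-menu workaround amounts to anchoring the menu at a fixed offset beside the tracked hand and hiding it whenever tracking drops, as happens when hands overlap. A hedged sketch of that logic; the vector math, tracking flag and function shape are assumptions for illustration, not the SDK’s real API:

```python
# Hypothetical sketch: a menu that floats at an offset beside the hand
# and hides (rather than jumping around) when hand tracking is lost.

def menu_pose(hand_pos, tracked, offset=(0.15, 0.0, 0.0), last=None):
    """Return (visible, position) for the floating menu this frame."""
    if not tracked:
        # hands overlapping: tracking dropped, so hide the menu and
        # remember the last known position for when tracking resumes
        return (False, last)
    pos = tuple(h + o for h, o in zip(hand_pos, offset))
    return (True, pos)
```

Keeping the menu offset to the side means the user never has to bring one hand across the other to select an option, which is exactly the overlap case that breaks tracking.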

Taking the time to experiment and apply these learnings allows us to develop increasingly realistic experiences in extended reality. From playing with hand tracking in VR to demonstrating how occlusion transforms experiences in AR, our team of makers is devoted to continually experimenting with new technologies, finding their most relevant use cases and establishing best practices for brands and our partners. As barriers continue to break down between the physical and virtual, it will be exciting to see what kinds of wholly new digital experiences emerge.

