Introducing Hand Tracking
The Hand Tracking and Gesture Recognition feature in the latest Snapdragon Spaces SDK release (0.6.1) offers a natural method of user interaction to power advanced headworn AR experiences.
AUGUST 3, 2022
Hand Tracking and Gesture Recognition can introduce natural movement to human-computer interaction, greatly enriching the augmented reality experience for end users. Thanks to existing cameras on AR glasses, hand and finger positions are detected, recognized, and tracked in 3D space where they can then interact with AR content. The ability to use hands to interact directly with digital objects creates an intuitive experience and allows for deeper user engagement, removing the need for less intuitive controllers.
An early version of this feature is available via the Snapdragon Spaces Unity SDK. Developers can now incorporate Hand Tracking and gestures directly into their AR applications.
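To give a flavor of what hand-driven interaction looks like in code, the sketch below polls hand-joint poses each frame and detects a simple pinch when the thumb and index fingertips come close together. It is an illustrative sketch using Unity's generic XR hand-tracking package (`UnityEngine.XR.Hands`) rather than Snapdragon Spaces-specific components; the actual component names and setup for the Spaces SDK may differ, so consult the SDK documentation for the canonical integration.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative pinch detector built on Unity's XR Hands API.
// Assumes a hand-tracking provider (such as the Snapdragon Spaces
// runtime) is supplying joint data through an XRHandSubsystem.
public class PinchDetector : MonoBehaviour
{
    XRHandSubsystem handSubsystem;

    void Update()
    {
        // Lazily grab the first running hand subsystem, if any.
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            handSubsystem = subsystems[0];
        }

        var hand = handSubsystem.rightHand;
        if (!hand.isTracked) return;

        // Read the thumb-tip and index-tip poses for this frame.
        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumbTip) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose indexTip))
        {
            // Simple heuristic: fingertips within ~2 cm count as a pinch.
            if (Vector3.Distance(thumbTip.position, indexTip.position) < 0.02f)
                Debug.Log("Pinch detected");
        }
    }
}
```

In a real application you would typically debounce this signal and map it to a selection or grab action rather than logging it, but the core pattern of reading per-joint poses each frame is the same.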
Hand Tracking: where and when to use it?
As we move from the heads-down to the heads-up paradigm for consuming digital content, it’s important to consider user comfort, utility, and usability as cornerstones when designing augmented reality apps. Hand Tracking provides natural interactions that appeal to many users, making it a flexible component of truly immersive experiences.
Why is Hand Tracking so well received by users? The answer is simple: seeing your own hands reduces cognitive load, helping you orient yourself in space. This, in turn, shortens the learning curve and allows you to interact with both digital and physical surroundings in a way that feels natural.
“We envision a future where AR glasses are ubiquitous and people carry them around easily, without the need for motion controllers. For that to happen, the control mechanics need to feel natural and pleasant to use” – says Javier Davalos, founder at OverlayMR and lead at Figmin XR.
“The Figmin XR team has been working on this challenge for a long time, and we are happy to report that we have succeeded in our quest. For us, implementing hand tracking was a huge challenge: not only did we have to design the control mechanics from scratch, but we also had to drive all the functionality from the tracking points that the system provides. Eventually, the effort was totally worth it.”
“Snapdragon Spaces developers will have a much easier time since much of this will be provided by the SDK. Controlling Figmin XR with just your hands feels magical and completely natural, so much so that at this point we prefer using it over motion controllers.”
Hand Tracking and Gesture Recognition are applicable across a multitude of use cases, from industrial settings to gaming. Some examples include:
- Simplifying interfaces and workflow navigation in industrial and enterprise settings
- Enhancing AR tabletop gaming through natural interactions with nearby virtual objects
- Boosting virtual collaboration when interacting with 3D models and prototypes in engineering, design, AEC and sales
- Driving engagement and conversion rates in e-commerce and showrooms (think digital try-ons)
- Lowering the interaction barrier while improving the service quality in healthcare and telehealth
Now that you are more familiar with the benefits of Snapdragon Spaces Hand Tracking, where can you learn more to start implementing the feature for your AR app? Head over to our documentation to find:
- Hand Tracking Best Practices, which will help you design immersive and user-friendly experiences using hand gestures and interactions with virtual objects and interfaces.
- A Guide to Interaction Gestures, which dives deeper into the most common gestures used in XR development.
Ready to implement Hand Tracking using Unity engine?
From gaming to enterprise and beyond, Hand Tracking and Gesture Recognition are powering some of the most engaging AR experiences. As a truly versatile feature, Hand Tracking allows AR developers to easily create new and memorable user experiences.
Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.