Snapdragon Spaces Blog

Introducing QR Code Tracking for Snapdragon Spaces

A much-requested feature is now live: QR Code tracking unlocks new opportunities for developers, expanding the possibilities of connected devices enhanced by Extended Reality experiences.

Jan 15, 2024

What is QR Code Tracking?

QR codes are everywhere, from restaurant menus to factory floors. When we see one, we instinctively know what to do: point our device camera, scan the code and get instant access to information.

Scanning QR codes has also been a popular way to access augmented reality, virtual reality and mixed reality experiences, whether it’s to attend a virtual concert, join a Wi-Fi network or get step-by-step instructions to fix your coffee machine.

Beyond easy access, QR codes can also serve as triggers for XR experiences, complementing (or, if required, completely replacing) image or object targets while providing fast, reliable recognition and tracking to initialize AR, VR and MR experiences.

For developers, the simple generation of QR codes skips the hurdle of creating custom image targets and corresponding calls to action, while simultaneously encoding all the data needed to perform the XR-powered task. The universal QR code detection function further simplifies the development and maintenance of XR experiences, without the need for cloud services or frequent app updates.

In its first iteration, the feature supports QR Code versions up to 10 (ISO/IEC 18004), enabling recognition, detection and tracking from an approximate distance of 1 meter or more (for targets scaled to a sufficient minimum size) on Snapdragon Spaces supported devices. Developers can benefit from scanning and decoding functions (similar to QR Code reader apps on phones), as well as real-time tracking in 6DOF, allowing for hands-free use cases and even multi-user collaborative applications.
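To make that flow concrete, here is a minimal, hypothetical Unity (C#) sketch of reacting to a detected QR code. The QrCodeTrackingManager type and its OnQrCodeDetected event are illustrative assumptions, not the actual Snapdragon Spaces API; the Unity types (MonoBehaviour, Pose) are standard.

    using System;
    using UnityEngine;

    // Hypothetical stand-in for the detection component a Spaces app might use;
    // it raises an event with the decoded payload and a 6DOF pose.
    public class QrCodeTrackingManager : MonoBehaviour
    {
        public event Action<string, Pose> OnQrCodeDetected;

        // In a real app the Snapdragon Spaces runtime (not shown) would drive this.
        internal void Raise(string payload, Pose pose) =>
            OnQrCodeDetected?.Invoke(payload, pose);
    }

    public class QrCodeListener : MonoBehaviour
    {
        [SerializeField] private QrCodeTrackingManager trackingManager;
        [SerializeField] private GameObject augmentationPrefab;

        private void OnEnable()  { trackingManager.OnQrCodeDetected += HandleDetected; }
        private void OnDisable() { trackingManager.OnQrCodeDetected -= HandleDetected; }

        private void HandleDetected(string payload, Pose pose)
        {
            // The payload behaves like a phone QR reader result; the pose is
            // updated in real time while the code remains tracked.
            Debug.Log($"QR payload: {payload}");
            Instantiate(augmentationPrefab, pose.position, pose.rotation);
        }
    }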

“The recent addition of live QR code tracking from Snapdragon Spaces allows us at Sphere to provide users with a more seamless and precise user experience when it comes to spatial content and co-located collaboration in XR. It is also a great example of Qualcomm’s commitment to helping developers achieve their goals”, says Colin Yao, CTO at Sphere.

Why use QR Code Tracking in your XR experiences?

For the enterprise, Snapdragon Spaces now unlocks the possibility of scalable XR experiences, connecting machinery, factory floors and professionals. Using headworn devices, frontline workers can use QR Code Tracking to distinguish millions of uniquely identifiable instances while performing hands-free training, maintenance and remote assistance tasks.

This feature can equally support customer-focused use cases such as augmented product packaging, XR product display with 3D models, interactive educational materials, XR-powered showrooms and more.

QR Code Tracking can be combined with other features available in the Snapdragon Spaces SDK, such as Local Anchors, Positional Tracking, Plane Detection and more. For example, a QR Code can be used both as a target and as an initial Spatial Anchor (when fixed in position), helping both the user and the engine establish spatial orientation in the real world.

Using QR Codes also allows companies to encode a variety of data formats, such as product serial numbers, machinery information, videos, URLs and more, which can trigger augmentations that are continuously rendered in 3D. The sketch below illustrates the idea.
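As a hedged illustration of that pattern, the following C# sketch routes a decoded payload to different behaviors. The "SN:" serial-number prefix and the router itself are invented for this example; only the idea of keying augmentations off the encoded data comes from the feature description.

    using System;
    using UnityEngine;

    // Hypothetical payload router: decides what augmentation to show
    // based on the decoded QR data.
    public static class QrPayloadRouter
    {
        public static void Route(string payload, Pose pose)
        {
            if (payload.StartsWith("SN:"))
            {
                // The detected pose could also seed a Local Anchor here, so the
                // overlay stays registered to the machine when fixed in position.
                string serial = payload.Substring(3);
                Debug.Log($"Show overlay for machine serial {serial} at {pose.position}");
            }
            else if (Uri.TryCreate(payload, UriKind.Absolute, out Uri uri)
                     && (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps))
            {
                Debug.Log($"Open linked content (e.g., a video or manual): {uri}");
            }
            else
            {
                Debug.Log($"Unrecognized payload: {payload}");
            }
        }
    }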

“With Snapdragon Spaces now supporting QR Code Tracking, this is an important feature for XR applications as it continues to be an easy way to position objects, trigger specific actions or align users in an XR environment. It’s a feature we support with our XR streaming technology, and it will make life easier for XR users”, adds Philipp Landgraf, Senior Director XR Streaming at Hololight.

Next steps

SDK 0.19 is now available for download at the Snapdragon Spaces Developer Portal. Check out our SDK changelog for Unity and Unreal to learn what’s new and get started with our QR Code Tracking sample.

We’re excited to see how Snapdragon Spaces developers put this new feature to use and look forward to your feedback on Discord!

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.


China XR Innovation Challenge 2023: Pushing the Boundaries of XR Development

The China XR Innovation Challenge has become a highly anticipated event in the XR community, showcasing the talent and innovation of developers and OEM partners. The 2023 edition of the contest brought together a record-breaking number of participants, highlighting the rapid growth and advancement of XR technology in China. In this blog post, we recap the highlights of the event and celebrate the winning teams that pushed the boundaries of XR development.

November 27, 2023

Spanning several months (from April to September 2023), the China XR Innovation Challenge 2023 attracted a staggering 505 registrations. This marked a significant increase over previous years, solidifying the contest’s position as the largest XR developer event in China. The event brought together developers, OEM partners, and XR enthusiasts, fostering collaboration and knowledge sharing among industry leaders, and it showcased the latest advancements in XR technology across a diverse range of contest tracks. Read on to learn about the key highlights of the event.

Contest Tracks:

Event organizers offered nine designated tracks catering to various aspects of XR development, including XR Consumer, XR Enterprise, PICO MR Innovation, PICO Game, YVR Innovation, QIYU MR Innovation, Skyworth XR Innovation, and Snapdragon Spaces. This diverse range of tracks allowed participants to explore different areas of XR development and showcase their creativity and skills.

New for this year was a dedicated Snapdragon Spaces track that saw more than 40 entries. Selected developers were given development kits to work with, resulting in 18 Spaces apps being submitted for consideration by the judges. Contestants created their apps on a variety of development kits powered by Qualcomm XR processors.
Technical Webinars:

To support the contestants and provide valuable insights, the event organizers conducted a series of eight technical webinars. These covered topics such as VR/AR registration, 3D engine utilization, VR game development, and VR sports fitness. The webinars attracted significant attention, totaling over 21.2K views and highlighting the thirst for knowledge and expertise in the XR community.

Awards Ceremony:

The contest culminated in a grand final awards ceremony held in Qingdao. The ceremony brought together 132 attendees, including contestants, judges, industry experts, and XR enthusiasts. Together, we celebrated the achievements of the participants and recognized the most impressive and inspiring projects in the XR field.

Winning Teams:

The contest team recognized 41 projects as the most impressive and inspiring in their respective tracks. These winning teams showcased their innovation, creativity, and technical expertise. Here are a few examples of the selected finalist projects:

  • Project 1: “Finger Saber” introduced a new gesture recognition experience in VR gaming, allowing players to control a lightsaber using hand gestures. The innovative control mechanism added a new level of immersion and interactivity to the gaming experience.

  • Project 2: “End of the Road” was a cross-platform multiplayer cooperative game set in a post-apocalyptic world. Players worked together as outlaws to destroy evil forces and protect valuable resources. The game showcased the power of collaboration and teamwork in an immersive XR environment.

  • Project 3: “EnterAR” was an AR tower defense shooting game that combined cartoon-style graphics with intuitive gameplay. Players defended a crystal from alien creatures using a bow and arrow, leveraging the power of AR technology to create an engaging and immersive experience.

Have a look at the video below for an overview of the winning projects.
It’s fair to conclude that China XR Innovation Challenge 2023 was a resounding success, showcasing the immense talent and innovation in the XR community. The event provided a platform for developers, OEM partners, and XR enthusiasts to come together, exchange ideas, and push the boundaries of XR development. The winning teams demonstrated the incredible potential of XR technology and its applications across various industries. As the XR landscape continues to evolve, events like this play a crucial role in driving innovation and shaping the future of XR in China.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.


Qualcomm Joins the Mixed Reality Toolkit (MRTK) Steering Committee

Mixed Reality Toolkit (MRTK) was originally conceived as a cross-platform open-source project by Microsoft to provide common XR functionality to developers. You may have heard the announcement that MRTK has been moved to its own independent organization on GitHub headed by a steering committee. We’re excited to announce that Qualcomm Technologies, Inc., along with other ecosystem players, is joining this committee.

October 4, 2023

As leaders in the XR space with solutions like our Snapdragon Spaces™ XR Developer Platform, we look forward to shaping the future of MRTK to keep it alive and thriving while expanding its ecosystem. Snapdragon Spaces helps developers build immersive XR experiences for lightweight headworn displays powered by Android smartphones. Key features of Snapdragon Spaces include Positional Tracking, Local Anchors, Plane Detection, and more. This year we announced extended support across the XR spectrum, along with the Dual Render Fusion feature designed to help developers transform their 2D mobile applications into spatial 3D experiences.

We have a vested interest in MRTK as it can be integrated into projects built with our Snapdragon Spaces SDK for Unity. MRTK is largely a UI and interaction toolkit, while Snapdragon Spaces provides device enablement and support. Architecturally, Snapdragon Spaces sits at the same layer as Unity’s XRI Toolkit and provides access to device functionality (e.g., subsystems) that MRTK can build upon.
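To illustrate that layering, here is a small Unity (C#) probe that lists the plane-detection subsystems registered at runtime. On device, a provider plugin such as Snapdragon Spaces is what registers and drives these subsystems, and toolkits like MRTK consume them; the snippet itself uses only standard Unity and AR Subsystems APIs.

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARSubsystems;

    public class SubsystemProbe : MonoBehaviour
    {
        private void Start()
        {
            // Ask Unity for every plane subsystem a provider has registered.
            var planeSubsystems = new List<XRPlaneSubsystem>();
            SubsystemManager.GetSubsystems(planeSubsystems);

            foreach (var subsystem in planeSubsystems)
            {
                Debug.Log($"Plane provider: {subsystem.subsystemDescriptor.id}, " +
                          $"running: {subsystem.running}");
            }
        }
    }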



There is a strong community around MRTK, spanning hobbyists to enterprise, which enables rapid development, consistent UI, and cross-platform deployment.
As part of this, we’re also excited to announce that some of the original architects and project team members from the MRTK project have joined the Snapdragon Spaces team. Their knowledge and expertise in the XR domain are sure to help drive advancements in both MRTK and Snapdragon Spaces.



Pictured: Kurtis Eveleigh, Nick Klingensmith, Caleb Landguth, Isa Cantarero, Ramesh Chandrasekhar, Dave Kline, Brian Vogelsang, Steve Lukas

Qualcomm Technologies, Inc. is excited to be a part of MRTK’s steering committee and to welcome the new team members. The steering committee released MRTK3 to general availability (GA) on September 6, 2023. Going forward, Qualcomm Technologies, Inc. will continue to contribute code and functionality so that all stakeholders may benefit. We aim to ensure that the architecture maintains its cross-platform vision and promise, while providing best-in-class support for Snapdragon processors and Snapdragon Spaces.


For more information, check our documentation and our MRTK3 Setup Guide which shows how to integrate MRTK3 into a Unity project.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.


AR’s inflection point: Dual Render Fusion feature is now available for mobile developers

Last week at AWE 2023, we introduced Dual Render Fusion, a new feature of the Snapdragon Spaces™ XR Developer Platform designed to help developers transform their 2D mobile applications into spatial 3D experiences with little prior knowledge required.

June 6, 2023

What is Dual Render Fusion?

Snapdragon Spaces Dual Render Fusion enables smartphone screens to become a primary input for AR applications, while AR glasses act as a secondary augmented display. The dual display capability allows developers and users to run new or existing apps in 2D on the smartphone screen while showing additional content in 3D in augmented reality. In practical terms, a smartphone acts as a controller for AR experiences, letting users select what they want to see in AR using familiar mobile UI and gestures. Imagine you are using your go-to maps app for sightseeing. With Dual Render Fusion, you can use the phone as usual to browse the map and at the same time, see a 3D reconstruction of historical places in AR.

Why use Dual Render Fusion in your AR experiences?

The feature makes it easier for developers to extend their 2D mobile apps into 3D spatial experiences without creating a new spatial UI. It’s also the first time in the XR industry that AR developers have a tool that combines multi-modal input with simultaneous rendering to the smartphone and AR glasses. With Dual Render Fusion, Unity developers with little to no AR knowledge can easily add an AR layer to their existing app using just a few extra lines of code. The feature gives more control over app behavior in 3D space, significantly lowering the entry barrier to AR. But that’s not all: while using the feature, you can utilize all the inputs enabled by the Snapdragon Spaces SDK, including Hand Tracking, Spatial Mapping and Meshing, Plane Detection, and Image Tracking, or go all in on the convenience of the mobile touch screen for all input. A minimal sketch of this interaction model follows.
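As a hedged sketch (not the package’s actual API), the following Unity C# behaviour shows the interaction model: a standard 2D UI button on the phone screen places a 3D model in the scene rendered to the glasses. The two-display routing itself would be handled by Dual Render Fusion; the component and field names here are invented for illustration.

    using UnityEngine;
    using UnityEngine.UI;

    public class PhoneUiSpawner : MonoBehaviour
    {
        [SerializeField] private Button spawnButton;      // lives on the phone's 2D canvas
        [SerializeField] private Transform glassesCamera; // the AR (glasses) camera rig
        [SerializeField] private GameObject landmarkPrefab;

        private void Awake()
        {
            spawnButton.onClick.AddListener(() =>
            {
                // Place the 3D model one meter in front of the user's view.
                Vector3 position = glassesCamera.position + glassesCamera.forward;
                Instantiate(landmarkPrefab, position, Quaternion.identity);
            });
        }
    }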

Why is it important?

It takes a lot of learning for developers to break into mobile AR, and even more so to rethink what they already know and apply spatial design principles to headworn AR. The same applies to end users, who need to get familiar with new spatial UX/UI and input. Enabling the majority of developers to create applications that are accessible to smartphone users will unlock the great potential of smartphone-based AR.

The Snapdragon Spaces team has been working hard to reimagine smartphone AR’s status quo and take a leap forward to fuse the phone with AR glasses. The Dual Render Fusion feature allows just that: blending the simplicity and familiarity of the smartphone touch screen for input while leveraging the best of augmented reality. Dual Render Fusion unlocks the full potential of smartphone-powered AR, allowing us to onboard and activate an untapped market of millions of mobile developers. Hear our Director of XR Product Management Steve Lukas explain the vision behind the groundbreaking feature:


Download now

Dual Render Fusion (experimental) is now available in beta as an optional add-on package for Snapdragon Spaces SDK for Unity version 0.13.0 and above. Download it today and browse our Documentation to learn more. Don’t forget to share your feedback and achievements with our XR community on Discord.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.


Introducing Hand Tracking

The Hand Tracking and Gesture Recognition feature, available in the latest Snapdragon Spaces SDK 0.6.1 release, offers a natural method of user interaction to power advanced headworn AR experiences.

August 3, 2022

Hand Tracking and Gesture Recognition can introduce natural movement to human-computer interaction, greatly enriching the augmented reality experience for end users. Thanks to existing cameras on AR glasses, hand and finger positions are detected, recognized, and tracked in 3D space where they can then interact with AR content. The ability to use hands to interact directly with digital objects creates an intuitive experience and allows for deeper user engagement, removing the need for less intuitive controllers.

An early version of this feature is available via the Snapdragon Spaces Unity SDK. Developers can now incorporate Hand Tracking and gestures directly into their AR applications.
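As a purely hypothetical sketch of what that can look like in Unity C# (the GestureKind enum and the callback are illustrative stand-ins, not the actual Snapdragon Spaces API), here is a component that snaps an object to the tracked hand while a grab gesture is recognized:

    using UnityEngine;

    // Illustrative gesture classification; the real SDK defines its own set.
    public enum GestureKind { Pinch, Grab, OpenPalm }

    public class GrabHandler : MonoBehaviour
    {
        [SerializeField] private GameObject heldObject;

        // Imagine the runtime invoking this whenever a gesture is recognized,
        // passing the tracked palm pose in world space.
        public void OnGesture(GestureKind kind, Pose palmPose)
        {
            if (kind == GestureKind.Grab)
            {
                // Keep the object attached to the tracked hand.
                heldObject.transform.SetPositionAndRotation(palmPose.position,
                                                            palmPose.rotation);
            }
        }
    }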

Hand Tracking: where and when to use it?

As we move from a heads-down to a heads-up paradigm for using digital content, it’s important to make user comfort, utility, and usability cornerstones of augmented reality app design. Hand Tracking provides natural interactions that appeal to many users, making it a flexible component of truly immersive experiences.

Why is Hand Tracking so well received by users? The answer is simple: seeing your own hands reduces cognitive load, helping you locate your position in space. This, in turn, shortens the learning curve and lets you interact with both digital and physical surroundings in a way that feels natural.

“We envision a future where AR glasses are ubiquitous and people carry them around easily, without the need for motion controllers. For that to happen, the control mechanics need to feel natural and pleasant to use”, says Javier Davalos, founder at OverlayMR and lead at Figmin XR.

“The Figmin XR team has been working on this challenge for a long time, and we are happy to report that we have succeeded in our quest. For us, implementing hand tracking was a huge challenge, not only did we have to design the control mechanics from scratch, but also drive all the functionality from the tracking points that the system provides. Eventually, the effort was totally worth it.”

“Snapdragon Spaces developers will have a much easier time since much of this will be provided by the SDK. Controlling Figmin XR with just your hands feels magical and completely natural, so much so that at this point we prefer using it over motion controllers.”

Use Cases

Hand Tracking and Gesture Recognition are applicable across a multitude of use cases, from industrial settings to gaming. Some examples include:

  • Simplifying interfaces and workflow navigation in industrial settings and for enterprises
  • Enhancing AR tabletop gaming through natural interactions with nearby virtual objects
  • Boosting virtual collaboration when interacting with 3D models and prototypes in engineering, design, AEC and sales
  • Driving engagement and conversion rates in E-commerce and showrooms (think digital try-ons)
  • Lowering the interaction barrier while improving the service quality in healthcare and telehealth

Best Practices

Now that you are more familiar with the benefits of Snapdragon Spaces Hand Tracking, where can you learn more to start implementing the feature in your AR app? Head over to our documentation to get started.

Ready to implement Hand Tracking using Unity engine?

Refer to our Hand Tracking Integration Guide and use Basic and Extended samples to fully leverage Hand Tracking components inside the Snapdragon Spaces plugin.

From gaming to enterprise and beyond, Hand Tracking and Gesture Recognition are powering some of the most engaging AR experiences. Being a truly versatile feature, Hand Tracking allows AR developers to easily create new and memorable user experiences.

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.