Showcasing XR enterprise solutions at AES

Bringing together the most prominent players in the enterprise space, the Augmented Enterprise Summit (AES) took place in October in San Diego. The Snapdragon Spaces team used the opportunity to connect with the audience and showcase the advancements of our ecosystem developers.

November 11, 2022

The event’s theme this year highlighted how selected enterprise players use immersive technologies to enable remote operations, improve training, increase safety, reduce costs, and enhance the customer experience. To kick off the first day of the event, Brian Vogelsang (Sr. Director, XR Product Management, Qualcomm Technologies, Inc.) gave a keynote presentation where he explained how Snapdragon technology brings best-in-class connectivity across industries, enabling enterprise solutions through a decade of innovation in XR.
“Enterprise AR is starting to accelerate from the innovations that we are helping to build to create lightweight, lower-power smart glasses with better form factors. We are seeing enterprise VR also starting to take off and are really excited about the possibilities and use cases,” highlighted Vogelsang.

“With Snapdragon Spaces, we are allowing developers to take the applications developed in VR or MR and bring them over to optical see-through glasses. If you are an ISV or IT company investing in augmented reality experiences, you can start working on AR and bring it to a VR device, and vice versa. This is a unique thing that has not been done in the industry, and we are excited to enable it.”

Proving those points, the Snapdragon Spaces booth showcased four partner demos representing the entire spectrum of XR enterprise solutions – from remote rendering and AR instruction to learning & development, digital collaboration, workflow guidance, and remote assistance. Each company has been successfully using Snapdragon Spaces to streamline the development of their applications and bring value to users.

Here’s how our partners make it easier to create enterprise solutions that meet business needs:

  • holo|one presented Sphere, a mixed reality collaboration and productivity platform for enterprises that takes away the need for expensive customization. Instead, the platform offers seamless integration of turnkey AR into business processes.

  • Holo-Light is on a mission to build a powerful streaming platform for enterprises to leverage immersive experiences, 3D content and scale AR/VR use cases – all while using Lenovo ThinkReality A3 smart glasses.

  • Arvizio featured their AR Instructor – a no-code remote assistance solution that uses the “see-what-I-see” principle to overlay visual instructions directly into the worker’s field of view. The platform can display various forms of content – from video clips and images to 3D models and even real-time 3D annotations.

  • VictoryXR presented their solution for immersive education and training, equipping users with an X-ray-like vision to study human anatomy in a virtual environment.

The Augmented Enterprise Summit provided a unique opportunity for developers, leaders, and innovators across verticals to discuss the latest trends and developments in the field of extended reality. We are looking forward to integrating the insights gained into the next iterations of Snapdragon Spaces.

Snapdragon Spaces Developer Team

Snapdragon and Snapdragon Spaces are products of Qualcomm Technologies, Inc. and/or its subsidiaries.


XR and 5G mmWave Hackathon Challenge in Tampere, Finland

For three days, the Finnish city of Tampere became a hub for university and start-up developers taking part in the first-ever Snapdragon Spaces hackathon. Participants had the chance to use state-of-the-art smart glasses powered by Snapdragon to design new use cases that leverage 5G mmWave connectivity and extended reality (XR).

OCTOBER 24, 2022

Organized by Qualcomm Europe, the City of Tampere, Elisa Networks, Nokia, CGI, and Ultrahack, the hackathon presented a unique opportunity for local developers to compete and make the best of the rich 5G mmWave data infrastructure of the future “Europe’s fastest city,” including the iconic Nokia Arena. With ten independent teams of developers with various profiles, the event set out multiple goals: first, to enable developers to unlock the capacity of a fast mmWave network; second, to let participants apply XR as a new UI medium and use the open data sets the City of Tampere has provided – all while using the Snapdragon Spaces™ SDK and a DevKit consisting of Lenovo ThinkReality A3 smart glasses and a Motorola edge+ smartphone. With just two days and no prior experience working with the Snapdragon Spaces SDK, all the teams were able to develop and demonstrate the potential of XR and 5G mmWave capabilities through innovative project ideas.

The hackathon started on Friday with onboarding, an opening session, and focused mentoring. On the second day, participants were briefed on the pitch and expected deliverables; afterward, sessions interchanged with mentoring, checkpoints, and a visit to the Nokia Arena 5G area for mmWave testing. Sunday, the event’s final day, was spent at the arena, where participants had a chance to do final testing, get their mentors’ input, and prepare for the pitch and presentation round. The jury consisted of Steve Lukas, Director of Product Management at Qualcomm Technologies, Inc.; Pekka Sundstorm, Head of Strategy Execution Europe at Nokia; Teppo Rantanen, Executive Director at the City of Tampere; and Taavi Mattila, Business Lead at Elisa.

To succeed at the final pitch and demo, participating teams were asked to present concepts that would use the Tampere city data set over fast connectivity including 5G mmWave, integrate XR, stand out from existing solutions, and bring additional value to the target audience. Among other criteria, the teams were also expected to think through ease of deployment and to utilize the Snapdragon Spaces SDK and Hardware Developer Kit. Two out of three winning teams leveraged user-generated content in their use cases, while all of them utilized open datasets made available by the City of Tampere; Tampere’s 3D “twin-city” model was particularly popular. Teams thought through how they would use 5G mmWave both for data retrieval and for uploading user-generated and other content at scale in a multi-user, city-wide deployment.

The concept of Tampere xRT revolved around a location-based city art installation and platform; the team delivered an impressive combination of pitch, idea, and implementation. Whispers of Tampere presented the idea of tucking audio snippets into the city park, allowing people to interact with the city and each other: audio messages can be placed at any city location and then discovered and retrieved by others. The third winner, AR Digital Spaces, presented dynamic AR showrooms built around an immersive, ad-based concept; the jury was impressed with the effective combination of various technologies in one app, which promised to make ads more engaging and purposeful. Another two participating teams received honorary mentions: ARenify, for their promise to transform sports experiences, and Public Commuter, for a concept aimed at making the city transport network more accessible and inclusive.

All the winning concepts involved integration with XR, cloud-hosted UGC, and fast connectivity including 5G mmWave for at-scale, city-wide use, as well as audio recording features on Android, Azure Kinect, and web-based portals for managing content. The first of many Snapdragon Spaces hackathons to come, the event provided an immense learning experience for all parties in co-designing for the upcoming Metaverse era. The City of Tampere is already exploring the possibility of leveraging some of the ideas and teams as part of its many smart city activities. At the same time, project participants plan to apply for European funding to help expand the 5G mmWave network outside the arena and explore bringing hackathon ideas to the citizens of Tampere.

Snapdragon Spaces Developer team in collaboration with Qualcomm Europe

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.


Expanding Snapdragon Spaces platform support to include MR and VR

SEPTEMBER 27, 2022

When we launched Snapdragon Spaces at AWE in June 2022, we set out to create a platform and ecosystem for headworn augmented reality that empowers developers to create immersive AR experiences built on open standards. Inspired by the excitement and innovation we’ve seen from you all building with Snapdragon Spaces worldwide, we are proud to be working with this innovative community shaping the future of the medium. While it’s still very early in the journey to a world where spatial computing is interwoven in our daily lives, the pace is accelerating each day, and Qualcomm is more committed than ever to making this our collective reality.

Snapdragon Spaces is evolving quickly: with eight releases thus far in 2022, we are bringing new features every six weeks, such as Hand Tracking, persistent Local Anchors, and Hit Testing. We really value feedback from the developer community and want to hear your ideas – so please continue to share your thoughts and upvote features on the roadmap. We have also launched a Discord server for the community to connect and share. Feel free to stop by – we would love to learn more about you and your work in XR.

While we started our work on Snapdragon Spaces in augmented reality, our team knew that XR was evolving rapidly, and that virtual and augmented reality devices would begin to converge as video passthrough, or mixed reality (MR), features came to VR headsets. Mixed reality blends augmented and virtual reality by adding stereo color cameras to a virtual reality headset and passing the video from those cameras through to the VR displays. This is a difficult problem to solve, and we have spent multiple years optimizing the camera and video image processing pipelines in Snapdragon-based XR platforms to enable the highest-quality, lowest-latency mixed reality experiences.

For developers, mixed reality means that they will be able to create experiences on VR headsets that are much more like the AR experiences on optical see-through devices such as the Lenovo ThinkReality A3 AR glasses in the Snapdragon Spaces Dev Kit. In fact, we believe that developers should be able to have portability between the two. We are excited to share that Snapdragon Spaces will be adding support for mixed and virtual reality in early 2023. Our vision is to allow you as a developer to build mixed reality experiences on Snapdragon Spaces for VR/MR headsets and bring the same application to AR glasses or vice versa.

This will enable developers building Snapdragon Spaces experiences on augmented reality glasses to expand the number of devices they can reach with their app to include VR headsets with mixed reality. It will also allow VR developers who start experimenting with mixed reality to run the same application on augmented reality glasses. We believe this is an industry first and we are excited to be enabling this capability for Snapdragon Spaces developers soon.

Look out for more details about the VR/MR headsets we will be supporting with Snapdragon Spaces in the coming months, as well as information about how and when you can get started. To stay informed (if you are not already subscribed), please join our mailing list and Discord server. Thank you for your enthusiasm for XR, support for Snapdragon Spaces, and commitment to supporting open platforms and ecosystems in spatial computing.

Brian Vogelsang
Sr. Director, XR Product Management, Qualcomm Technologies, Inc.

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.


Introducing Hand Tracking

The Hand Tracking and Gesture Recognition feature in the latest Snapdragon Spaces SDK 0.6.1 release offers a natural method of user interaction to power advanced headworn AR experiences.

AUGUST 3, 2022

Hand Tracking and Gesture Recognition can introduce natural movement to human-computer interaction, greatly enriching the augmented reality experience for end users. Using the existing cameras on AR glasses, hand and finger positions are detected, recognized, and tracked in 3D space, where they can then interact with AR content. The ability to use your hands to interact directly with digital objects creates an intuitive experience and allows for deeper user engagement, removing the need for less intuitive controllers.

An early version of this feature is available via the Snapdragon Spaces Unity SDK. Developers can now incorporate Hand Tracking and gestures directly into their AR applications.

Hand Tracking: where and when to use it?

As we move from the heads-down to the heads-up paradigm for consuming digital content, it’s important to consider user comfort, utility, and usability as cornerstones when designing augmented reality apps. Hand Tracking provides natural interactions that appeal to many users, making it a flexible component of truly immersive experiences.

Why is Hand Tracking so well received by users? The answer is simple: seeing your own hands reduces cognitive load, helping you locate your position in space. This, in turn, leads to a shallower learning curve and lets you interact with both digital and physical surroundings in a way that feels natural.

“We envision a future where AR glasses are ubiquitous and people carry them around easily, without the need for motion controllers. For that to happen, the control mechanics need to feel natural and pleasant to use” – says Javier Davalos, founder at OverlayMR and lead at Figmin XR.

“The Figmin XR team has been working on this challenge for a long time, and we are happy to report that we have succeeded in our quest. For us, implementing hand tracking was a huge challenge, not only did we have to design the control mechanics from scratch, but also drive all the functionality from the tracking points that the system provides. Eventually, the effort was totally worth it.”

“Snapdragon Spaces developers will have a much easier time since much of this will be provided by the SDK. Controlling Figmin XR with just your hands feels magical and completely natural, so much so that at this point we prefer using it over motion controllers.”
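To illustrate what “driving functionality from tracking points” can look like in practice, here is a minimal, engine-agnostic sketch – not Snapdragon Spaces API code – that recognizes a pinch gesture from the distance between two tracked fingertip positions. The joint coordinates and the 2 cm threshold are illustrative assumptions:

```python
import math

# Illustrative threshold: fingertips within ~2 cm count as a pinch.
PINCH_THRESHOLD_M = 0.02

def distance(a, b):
    """Euclidean distance between two (x, y, z) points in meters."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Return True when thumb and index fingertips are close enough
    to be interpreted as a pinch gesture."""
    return distance(thumb_tip, index_tip) < threshold

# Example frame: fingertips 1 cm apart -> pinch detected
print(is_pinching((0.00, 0.00, 0.30), (0.01, 0.00, 0.30)))  # True
```

A real implementation would read per-frame joint poses from the hand-tracking runtime, smooth them over several frames to avoid flicker, and add hysteresis (separate engage/release thresholds) so the gesture doesn’t toggle at the boundary.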

Use Cases

Hand Tracking and Gesture Recognition are applicable in a multitude of use cases, from industrial settings to gaming. Some examples include:

  • Simplifying interfaces and workflow navigation in industrial settings and for enterprises
  • Enhancing AR tabletop gaming through natural interactions with nearby virtual objects
  • Boosting virtual collaboration when interacting with 3D models and prototypes in engineering, design, AEC and sales
  • Driving engagement and conversion rates in E-commerce and showrooms (think digital try-ons)
  • Lowering the interaction barrier while improving the service quality in healthcare and telehealth

Best Practices

Now that you are more familiar with the benefits of Snapdragon Spaces Hand Tracking, where can you learn more to start implementing the feature in your AR app? Head over to our documentation.

Ready to implement Hand Tracking using Unity engine?

Refer to our Hand Tracking Integration Guide and use the Basic and Extended samples to fully leverage the Hand Tracking components inside the Snapdragon Spaces plugin.

From gaming to enterprise and beyond, Hand Tracking and Gesture Recognition are powering some of the most engaging AR experiences. Being a truly versatile feature, Hand Tracking allows AR developers to easily create new and memorable user experiences.

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.