Snapdragon Spaces Blog

Meet some of the 80+ Companies in the Snapdragon Spaces Ecosystem

At AWE 2023 we announced that our Snapdragon Spaces Pathfinder Program has grown to over 80 member companies. This program helps qualifying developers succeed through early access to platform technology, project funding, co-marketing and promotion, and hardware development kits.

September 18, 2022

But it’s not just program members who make up the ecosystem behind our Snapdragon Spaces™ XR Developer Platform. Qualcomm Technologies is collaborating with an ecosystem of leading global operators and smartphone OEMs to bring headworn XR experiences to market across devices and regions. Together with 3D engine developers and content creators building immersive XR content, this ecosystem of companies is bringing the next generation of XR headsets and experiences to market.

Several of these companies have publicly announced how they’re collaborating with us and using Snapdragon Spaces to build apps for headworn XR. With projects spanning such a wide range of verticals and use cases, we decided to share links to these announcements, organized by vertical, in the hope that they’ll inspire your next XR project.

Enterprise: Collaboration

  • Arthur Digital: Immersive collaborative products, including virtual whiteboards, screen sharing, and real-time spatial communication for distributed teams. Use cases include learning and development, and workshops.

  • Arvizio: Their AR Instructor product offers live see-what-I-see video and interactive mark-ups in AR, enabling instructions to be overlaid in the user’s field of view.

  • Holo-Light: The company’s ISAR SDK offloads real-time rendering of compute-intensive images in AR/VR apps to powerful cloud infrastructure or local servers.

  • Lenovo: Lenovo is working with several XR developers to make cutting-edge applications powered by Snapdragon Spaces technology available on the ThinkReality VRX. Use cases today include immersive training as well as collaboration in 3D environments.

  • Pretia: The company is porting its MetaAssist mobile app to AR glasses.

  • ScopeAR: Frontline workers use the company’s WorkLink AR app for employee training, product and equipment assembly, maintenance and repair, field and customer support, and more.

  • Sphere: The company creates AR environments where remote teams feel like they are working in the same room, through highly realistic avatars, real-time spatial-audio language translation, and integrations with conventional conferencing tools.

  • Taqtile: Their Manifest application enables deskless workers to access AR-enabled instructions and remote experts to perform tasks more efficiently and safely. It supports an expanding number of heads-up displays (HUDs), so enterprise customers can select hardware platforms based on their requirements.

Training

  • DataMesh: Its digital twin content creation and collaboration platform makes digital twins more accessible to frontline workers and creators while addressing workflow challenges in training, planning, and operations.

  • DigiLens: Waveguide display technologies and headworn smartglasses for enterprise and industrial use cases, powered by the Snapdragon XR2 5G Platform.

  • Pixo VR: Simplifies access to and management of XR content. The platform can host any XR content, works on all devices, and offers a vast library of off-the-shelf VR training content.

  • Roundtable Learning: Uses VR headsets for immersive enterprise training solutions.

  • Uptale: Immersive learning ranging from standard operating procedures and security rules to quality control, managerial situations, and onboarding.

Productivity

  • Nomtek: Their StickiesXR project is a Figma prototype that transforms traditional 2D workflows into XR experiences.

Education

  • AWE 2023 award winner PhiBonacci: Provides hands-on training that is safe, efficient, and impactful for learning and working in medical, engineering, and Industry 4.0 fields.

WebXR

  • Wonderland: A development platform for VR, AR, and 3D on the web, now supporting Snapdragon Spaces to expand cooperation with platform and hardware providers.

  • Wolvic: Offers a multi-device, open-source web browser for XR.

Gaming and Entertainment

  • Mirroscape: MR technology and AR glasses to enable tabletop games in XR. Maps and miniatures are anchored to players’ tables just like real objects, providing them with unique and immersive views from virtually anywhere.

  • Skonec Entertainment: Develops XR content that enables efficient and sophisticated spatial experiences, particularly in XR game development.

Health, Wellness, and Fitness

  • Kittch: Provides interactive cooking instructions and hands-free actions (e.g., setting timers) using AR glasses, hand tracking, and eye tracking, freeing users’ hands to perform the necessary cooking steps.

  • XR Health: Immersive VR, AR, or MR healthcare experiences, including training and collaboration in 3D, on devices powered by Snapdragon.

Others

  • ArborXR: Provides XR management software to Qualcomm Technologies’ XR customers and OEMs. Organizations can manage device and app deployment and lock down the user experience with a kiosk mode and a customizable launcher.

  • Echo3D: Cloud-based platform that streamlines the distribution of XR software and services.

  • LAMINA1: Blockchain optimized for the open metaverse, including novel approaches that leverage blockchain, NFTs, and smart contracts to bridge virtual and real-world environments (e.g., next-generation ticketing and loyalty programs).

  • OPPO MR Glass: The OPPO MR Glass Developer Edition will become the official Snapdragon Spaces hardware developer kit in China.

  • TCL Ray Neo: The company will upgrade its RayNeo AR wearables with spatial awareness, object recognition and tracking, gesture recognition, and more, creating new possibilities in smart home automation, indoor navigation, and gaming.

Get Started

Initially released in June 2022 and regularly updated (see our changelog), Snapdragon Spaces provides developers with the versatility to build AR, MR, and VR apps across enterprise and consumer verticals. Developers, operators, and OEMs are free to monetize globally through existing Android-based app store infrastructure on smartphones. With proven technology and a cross-device open ecosystem, Snapdragon Spaces gives developers an XR toolkit to unlock pathways to consumer adoption and monetization.

You can get started on your headworn XR app today with the following resources:


Snapdragon Spaces Developer Team

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.

Layering spatial XR experiences onto mobile apps with Dual Render Fusion

We are still buzzing with excitement around our announcement of Dual Render Fusion at AWE 2023 – a new feature of the Snapdragon Spaces™ XR Developer Platform. A number of developers are already building apps with Dual Render Fusion, and we can’t wait to see what you will create. But first, let us show you how it works.

June 28, 2023

Think about transforming your 2D mobile application into a 3D spatial experience. Dual Render Fusion allows a Snapdragon Spaces-based app to simultaneously render to the smartphone and to a connected headworn display like the ThinkReality A3 included in the Snapdragon Spaces dev kit. In this setup, the smartphone can be used as a physical controller and a primary display (e.g., to render user interfaces and conventional 3D graphics), while the connected headworn XR display provides a secondary spatial XR view in real time.

From a technical standpoint, the app now has two cameras (viewports) rendering from the same scene graph in a single Activity. The image below shows how developers can enable two cameras in a Unity project with Dual Render Fusion by selecting each camera's Target Eye in the Inspector:
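For orientation, here is a minimal C# sketch of the same idea: one camera kept on the 2D phone display and one camera driven by the XR runtime, both rendering the same scene. This is only illustrative; the actual setup is done with the Dual Render Fusion components and Inspector settings described in the documentation, and the class and field names below are our own.

```csharp
using UnityEngine;

// Illustrative sketch only: two cameras in one scene graph, one for the
// smartphone screen and one for the headworn XR display.
public class DualViewportSetup : MonoBehaviour
{
    [SerializeField] private Camera phoneCamera;    // renders UI + 3D to the phone screen
    [SerializeField] private Camera headwornCamera; // renders the spatial XR view

    private void Awake()
    {
        // Phone camera: "Target Eye = None" keeps it rendering to the 2D display.
        phoneCamera.stereoTargetEye = StereoTargetEyeMask.None;
        phoneCamera.targetDisplay = 0;

        // Headworn camera: "Target Eye = Both" lets the XR runtime drive it,
        // so the glasses receive the stereo spatial view.
        headwornCamera.stereoTargetEye = StereoTargetEyeMask.Both;
    }
}
```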



Once enabled, you can preview both cameras. The image below shows Dual Render Fusion running in Unity on an emulator, and how perception features can be layered onto the 2D smartphone experience.



The left viewport shows the 3D scene view in Unity. The middle shows a simulation of the primary (on-smartphone) display with the 3D scene and user interface controls. The right viewport simulates the rendering for the secondary (headworn XR) display, in this case with simulated Hand Tracking enabled. The beauty of this setup is that the cube can be manipulated either by the smartphone touchscreen controls or by perception-based hand tracking, without any networking or synchronization code required.
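As a rough illustration of why no synchronization code is needed, a single script can drive the cube from either input path, since both viewports render the same objects. This is a hedged sketch: the UI Slider and the hand-drag hook are stand-ins for whatever controls and hand-tracking events your own app exposes, not Snapdragon Spaces APIs.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Both the phone viewport and the headworn viewport render this same cube,
// so changing it here is immediately visible in both views.
public class CubeController : MonoBehaviour
{
    [SerializeField] private Transform cube;
    [SerializeField] private Slider rotationSlider; // on-phone touchscreen control

    private void Start()
    {
        // Touchscreen path: a Unity UI slider shown on the phone display.
        rotationSlider.onValueChanged.AddListener(value =>
            cube.localRotation = Quaternion.Euler(0f, value * 360f, 0f));
    }

    // Hand-tracking path: call this from whatever pinch/grab event your
    // hand-tracking integration raises (the method name is illustrative).
    public void OnHandDrag(Vector3 deltaWorldSpace)
    {
        cube.position += deltaWorldSpace;
    }
}
```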

Layering on spatial experiences for your users

Today, users are comfortable with their current smartphone experiences. But just as the first smartphones drove a paradigm shift in human-machine interaction (HMI), we’re now at an inflection point where XR is driving a shift towards spatial experiences that enhance users’ daily lives.

That’s why we believe Dual Render Fusion is so important. You can use it to introduce your users to spatial concepts while taking full advantage of existing smartphone interactions. Apps can now be experienced in new immersive ways while retaining familiar control mechanisms. Best of all, your users don’t have to give up their smartphones. Instead, they can reap the benefits of the spatial XR experiences that you’ve layered on.

A great example of this is Virtual Places by mixed.world, demoed at our AWE 2023 booth. Using Snapdragon Spaces and Dual Render Fusion, Virtual Places enhances conventional 2D map navigation with a 3D spatial view. Users can navigate maps with their smartphone while gaining visual previews of how an area actually looks. The multi-touch interface allows a familiar pinch-and-zoom interaction to manipulate the map view on the phone, while a 3D visualization provides an immersive experience in the glasses.


Table Trenches by DB Creations, also demoed at our AWE 2023 booth, provided an example of a game that integrated Dual Render Fusion with relative ease.

DB Creations co-founder Blake Gross had this to say about the process: “…with Fusion, we are able to create experiences that anyone, without AR experience, can pick up and use by utilizing known mobile interaction paradigms. With Table Trenches, this was especially useful, because we were able to take the UI out of the world and into a familiar touch panel. Additionally, Fusion enables the smartphone screen to be a dynamic interface, so we can change how the phone behaves. We do this in Table Trenches when switching between surface selection and gameplay. Fusion was easy to integrate into our app since it uses the familiar Unity Canvas. Since our game was already built to utilize the Unity Canvas, it was as simple as setting up a new camera, and reconfiguring the layout to best fit the Snapdragon Spaces developer kit. We noticed at AWE how easy it was for new users to pick up and play our game without us needing to give any manual explanation of what to do.”

Developing With Dual Render Fusion

With the number of global smartphone users estimated at around 6.6 billion, there is a large market for mobile developers to tap into and offer new experiences. Snapdragon Spaces and Dual Render Fusion facilitate rapid porting of existing Unity mobile apps and demos to XR. You can create new, or extend existing, 2D mobile applications built with Unity into 3D XR experiences with few or no code changes required to get started. The general process goes like this:

  • Create a new 3D or 3D URP project in Unity.
  • Import the Snapdragon Spaces SDK and Dual Render Fusion packages.
  • Configure the settings for OpenXR and Snapdragon Spaces SDK integration.
  • Use the Project Validator to easily update your project and scene(s) for Dual Render Fusion with just a few clicks.
  • Build your app.

You can read more about the process in our Dual Render Fusion Scene Setup guide for Unity.

Download now

Ready to start developing with Snapdragon Spaces and Dual Render Fusion? Get started with these three steps:


Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

AR’s inflection point: Dual Render Fusion feature is now available for mobile developers

Last week at AWE 2023, we introduced Dual Render Fusion – a new feature of the Snapdragon Spaces™ XR Developer Platform designed to help developers transform their 2D mobile applications into spatial 3D experiences with little prior knowledge required.

June 6, 2023

What is Dual Render Fusion?

Snapdragon Spaces Dual Render Fusion enables smartphone screens to become a primary input for AR applications, while AR glasses act as a secondary augmented display. The dual display capability allows developers and users to run new or existing apps in 2D on the smartphone screen while showing additional content in 3D in augmented reality. In practical terms, a smartphone acts as a controller for AR experiences, letting users select what they want to see in AR using familiar mobile UI and gestures. Imagine you are using your go-to maps app for sightseeing. With Dual Render Fusion, you can use the phone as usual to browse the map and at the same time, see a 3D reconstruction of historical places in AR.

Why use Dual Render Fusion in your AR experiences?

The feature makes it easier for developers to extend their 2D mobile apps into 3D spatial experiences without creating a new spatial UI. It’s also the first time in the XR industry that AR developers have a tool to combine multi-modal input with simultaneous rendering to smartphones and AR glasses. With Dual Render Fusion, Unity developers with little to no AR knowledge can easily add an AR layer to their existing app using just a few extra lines of code. The feature gives more control over app behavior in 3D space, significantly lowering the entry barrier to AR. But that’s not all – while using the feature, you have the option to utilize all the inputs enabled by the Snapdragon Spaces SDK, including Hand Tracking, Spatial Mapping and Meshing, Plane Detection, and Image Tracking, or go all in on the convenience of the mobile touchscreen for all input.

Why is it important?

It takes a lot of learning for developers to break into mobile AR, and even more to rethink what they already know in order to apply spatial design principles to headworn AR. The same applies to end users, who need to get familiar with new spatial UX/UI and input. Enabling the majority of developers to create applications that are accessible to smartphone users will unlock the great potential of smartphone-based AR. The Snapdragon Spaces team has been working hard to reimagine smartphone AR’s status quo and take a leap forward to fuse the phone with AR glasses. The Dual Render Fusion feature allows just that – blending the simplicity and familiarity of the smartphone touchscreen for input while leveraging the best of augmented reality. Dual Render Fusion unlocks the full potential of smartphone-powered AR, allowing us to onboard and activate an untapped market of millions of mobile developers. Hear our Director of XR Product Management Steve Lukas explain the vision behind the groundbreaking feature:


Download now

Dual Render Fusion (experimental) is now available in beta as an optional add-on package for Snapdragon Spaces SDK for Unity version 0.13.0 and above. Download it today and browse our Documentation to learn more. Don’t forget to share your feedback and achievements with our XR community on Discord.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

Introducing Spatial Mapping and Meshing

Our 0.11.1 SDK release introduced the Spatial Mapping and Meshing feature to the Snapdragon Spaces™ developer community. Read on to learn more about our engineering team’s approach, use cases, and important factors to consider when getting started.

April 7, 2023

What is Spatial Mapping and Meshing?

Spatial Mapping and Meshing is a feature of the Snapdragon Spaces SDK that provides an approximate 3D model of the user’s environment. This feature is critical in helping smart glasses understand and reconstruct the geometry of the environment. Spatial Mapping and Meshing offers both a detailed 3D representation of the environment surrounding the user and a simplified 2D plane representation. Meshing is needed to calculate occlusion masks or to run physics interactions between virtual objects and the real world. Snapdragon Spaces developers can access every element (vertex) of the mesh and request updates to it. Planes are used whenever a rough understanding of the environment is enough for the application – for example, when deciding where to place an object on top of a table or against a wall. Only the most important planes are returned by the API, since there is a trade-off between how many planes can be extracted and how fast they can be updated with new information from the environment.

In addition to meshes and planes, developers can use a collision check to see whether a virtual ray intersects the real world. This is useful for warning the user when they approach a physical boundary – either for safety or for triggering an action from the app.
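As a hedged sketch of that collision-check idea in Unity, the snippet below casts a ray from the user's head against the reconstructed environment. It assumes the meshing feature exposes its meshes with MeshColliders on a dedicated layer (for example via AR Foundation's ARMeshManager); see the documentation for the exact, supported component setup.

```csharp
using UnityEngine;

// Casts a virtual ray forward from the user's head and reports whether it
// hits the reconstructed environment mesh, e.g. to warn about a nearby wall.
public class BoundaryWarning : MonoBehaviour
{
    [SerializeField] private Transform head;            // XR camera / head pose
    [SerializeField] private LayerMask environmentMask; // layer of the spatial mesh
    [SerializeField] private float warningDistance = 1.0f;

    private void Update()
    {
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit,
                            warningDistance, environmentMask))
        {
            // A physical surface is close; warn the user or trigger app logic.
            Debug.Log($"Physical surface {hit.distance:F2} m ahead; show a warning.");
        }
    }
}
```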

Why use Spatial Mapping in your AR experiences?

A big part of how users perceive your app lies in understanding the environment and adapting your application to it. If you strive for higher realism in your digital experiences, Spatial Mapping and Meshing is the right choice. Why? As humans, we understand the depth of a space using visual cues spread around us in the real world. Occlusion is one of the most apparent depth cues – it happens when parts of objects, or entire objects, are hidden from view behind other objects. Light also provides plenty of other natural depth cues, like shadows and glare on objects. Spatial Mapping and Meshing allows your augmented reality app to emulate this aspect of human vision and take in depth cues from the real world.

“The Sphere team has been incredibly excited for the Snapdragon Spaces release with Spatial Mapping and Meshing,” says Colin Yao, CTO at Sphere. “Sphere is an immersive collaboration solution that offers connected workforce use cases, training optimization, remote expert assistance, and holographic build planning in one turnkey application. As you can imagine, the spatial mesh is a fundamental component of the many ways we use location-based content. By allowing the XR system to understand and interpret the physical environment in which it operates, we can realistically anchor virtual objects and overlays in the real world,” adds Yao. “Lenovo’s ThinkReality A3 smart glasses are amongst the hardware we support, and Sphere’s users that operate on this device will especially benefit from Spatial Mapping and Meshing becoming available.”

How does the Snapdragon Spaces engineering team approach Spatial Mapping and Meshing?

Our engineering team’s approach to Spatial Mapping and Meshing is based on two parallel components – frame depth perception and 3D depth fusion. Read on to learn more about each.

Frame Depth Perception

We leverage the power of machine learning by training neural networks on a large and diverse set of training data, and our models benefit from running efficiently on powerful Snapdragon® neural processors. To scale the diversity and representativeness of our training datasets, we do not limit the training data to supervised samples (e.g., sets of images with measured depth for every pixel). Instead, we leverage unlabeled samples and benefit from self-supervised training schemes. Our machine learning models are extensively optimized to be highly accurate and computationally efficient compared to sensor-based depth observations – all thanks to hardware-aware model design and implementation.

3D Depth Fusion

The 3D reconstruction system provides a 3D structure of a scene with a volumetric representation. The volumetric representation divides the scene into a grid of cells (or cubes) of equal size. The cubes in the volumetric representation (we’ll call them samples in this article) store the signed distance from the sample center to the closest surface in the scene. This type of 3D structure representation is called a signed distance function (SDF). Free space is represented with positive values that increase with the distance from the nearest surface. Occupied space is represented as samples with similar but negative values. The actual physical surfaces correspond to the zero-crossings of the sampled distance values.

The 3D reconstruction system generates the volumetric representation by fusing and integrating depth images into the volumetric reconstruction. For each depth image, the system also requires its pose (camera location and viewing orientation) at the acquisition timestamp, expressed in a global reference coordinate system. The 3D reconstruction system also extracts a 3D mesh representation of the surfaces in the volumetric reconstruction. This is done using a marching cubes algorithm that looks for the zero iso-surface of the signed distance values in the grid samples. 3D reconstruction in its entirety is a rather complex and resource-intensive operation, but enabling a mesh of the environment in your XR app brings benefits and feature possibilities that often make it worth the computational cost.
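The per-voxel update at the heart of this kind of volumetric fusion can be sketched as a weighted running average of truncated signed distances. The snippet below is the textbook TSDF formulation shown purely for illustration, not the production implementation in the SDK.

```csharp
using UnityEngine;

// Illustrative truncated signed-distance (TSDF) fusion for a single voxel.
// Positive values = free space, negative = behind a surface,
// zero-crossing = the reconstructed surface.
public struct Voxel
{
    public float SignedDistance; // current fused SDF value
    public float Weight;         // confidence accumulated over observations

    // Integrate one depth observation into the voxel.
    //   measuredDepth : depth reported by the camera along this voxel's ray
    //   voxelDepth    : distance from the camera to the voxel center along that ray
    //   truncation    : band (in meters) around the surface where SDF values are kept
    public void Integrate(float measuredDepth, float voxelDepth, float truncation)
    {
        // Signed distance of this voxel to the observed surface, truncated.
        float sdf = Mathf.Clamp(measuredDepth - voxelDepth, -truncation, truncation);

        // Skip voxels far behind the observed surface (occluded, no information).
        if (sdf <= -truncation) return;

        // Weighted running average across observations.
        const float observationWeight = 1f;
        SignedDistance = (SignedDistance * Weight + sdf * observationWeight)
                         / (Weight + observationWeight);
        Weight += observationWeight;
    }
}
```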

Enabling realism with Spatial Mapping and Meshing

Virtual Object Occlusion

Creating augmented reality experiences without occlusion has been common practice for a long time. An approximate mesh of the environment can instead be used to make virtual objects appear occluded by real ones. By rendering the environment mesh into the depth buffer together with all the other virtual objects in your scene, you can create occlusion effects with little effort. To achieve this effect, you need to create a transparent shader that still writes into the depth buffer.

Virtual lighting and shadows in the real world

Similarly, the mesh of the real environment can be used to apply virtual lighting effects to the real world. When creating a virtual light source without a mesh of the environment, only virtual objects will be lit by this virtual light, which can cause visual discrepancies. When there is a model of the real world that is also lit, the discrepancy is far less visible. Shadows behave in much the same way: without a model of the environment, virtual objects will not cast shadows onto the real world. This can cause confusion about the objects’ depth and even give users the impression that objects are hovering. To achieve this effect, you need to create a transparent shader that receives lighting and shadows like an opaque shader.

Real lighting and shadows in the virtual world

When combining virtual and real-world content, it’s often very easy to spot where a virtual object starts and reality ends. If your goal is to have the most realistic augmentations possible, you need to combine real and virtual lighting and shadows. You can do this by having light estimation emulate the real-world lighting conditions with a directional light in your scene. Combined with meshing, light estimation allows you to cast shadows from real-world objects onto virtual objects.
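If your target platform exposes light estimation through AR Foundation, applying it to a scene directional light can be sketched as below. This is a hedged example: which estimation properties are populated depends on the device and SDK version, and the component name is our own.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Copies estimated real-world lighting onto a directional light so virtual
// objects (and shadows cast onto the environment mesh) match the room.
public class EstimatedLightApplier : MonoBehaviour
{
    [SerializeField] private ARCameraManager cameraManager;
    [SerializeField] private Light directionalLight;

    private void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    private void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    private void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var estimate = args.lightEstimation;

        if (estimate.mainLightDirection.HasValue)
            directionalLight.transform.rotation =
                Quaternion.LookRotation(estimate.mainLightDirection.Value);

        if (estimate.mainLightColor.HasValue)
            directionalLight.color = estimate.mainLightColor.Value;

        if (estimate.averageBrightness.HasValue)
            directionalLight.intensity = estimate.averageBrightness.Value;
    }
}
```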

Limitations

Depending on your hardware, there will be certain limitations to the precision of the detected mesh. For example, the Lenovo ThinkReality A3 uses a monocular, inference-based approach, which can yield certain imprecisions. Transparent objects (such as glass) and glossy surfaces will lead to fragmentation or holes in the mesh. The further away a real-world object is from the glasses, the less precise its generated mesh will be – quite logical if you think about it, since the RGB sensor loses information with distance. The mesh is available up to a distance of 5 meters from the user. Other methods of depth measurement (such as LIDAR) require specialized hardware, so there is a cost-benefit trade-off: while LIDAR could provide higher precision for some use cases, RGB cameras are already available in most devices and are more affordable.

Next steps

Refer to the Snapdragon Spaces documentation to learn more about the Spatial Mapping and Meshing feature, and use the Unity sample and Unreal sample to leverage it in your projects.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

Snapdragon Spaces at MIT Reality Hack

Joining the world’s biggest XR hackathon as one of the key sponsors is a big endeavor that comes with big responsibility. The team brought a dedicated Snapdragon Spaces™ hack track, hosted two workshops and networking events, and showcased demos – all to help hackathon participants stretch the limits of their imagination.

February 27, 2023

With five days full of tech workshops, talks, discussions, and collaborations, the MIT hackathon brought together thought leaders, brand mentors, students, XR creators, and technology lovers who flew in from all over the world to participate. The overall event had 450 attendees, with 350+ hackers from 26 countries placed into 70+ teams; the Snapdragon Spaces track had 10 teams. All skill levels were represented, and participants had the opportunity to explore the latest technology from Qualcomm Technologies and Snapdragon Spaces.

The Snapdragon Spaces team joined the event’s other key sponsors at the opening-day Inspiration Expo. Using the opportunity to connect with participants, we answered questions about the XR developer platform, encouraged hackers to join the hack track, and showcased our “AR Expo” demo. The demo uses Snapdragon Spaces SDK features, letting hackers experience a variety of AR use cases – from gaming to media streaming and kitchen-assistance apps. Later on, hackers had the opportunity to join Steve Lukas, Director of XR Product Management at Qualcomm Technologies, Inc., and Rogue Fong, Senior Engineer at Qualcomm Technologies, Inc., at a workshop dedicated to developing experiences for lightweight AR glasses. This augmented reality-focused session educated participants on the features and capabilities of the Snapdragon Spaces XR Developer Platform.

Closer collaboration with hackers followed at the Snapdragon Spaces hack track, where the team hosted 50 participants in 10 teams in a dedicated hacking area. The range of project ideas was truly impressive, spanning games, social good projects, and education. While some projects leaned into the functionality of the Lenovo ThinkReality A3 and Motorola edge+ hardware dev kit, others integrated additional hardware, such as 3D displays and neurofeedback devices.

Our track rewarded the most innovative, compelling, and impactful experience built with Snapdragon Spaces, with the Stone Soup team taking first prize for using AR technology to help the unhoused conceptualize, customize, and co-create their dream homes. The winning team executed its idea with a high level of polish, utilizing Snapdragon Spaces’ Plane Detection and Hit Testing perception technology and integrating ESRI map data into the project.

The runner-up, Skully, presented a multipurpose education technology solution that brings up learning modules and 3D models when target images are detected. The solution offers students a hands-on approach to learning and allows the use of Hand Tracking and Gaze Tracking to interact with lesson material. Skully is also highly inclusive, offering learning modules in more traditional formats (such as video) for those unfamiliar with interacting in AR.

Worthy of an honorable mention is Benvision, winner of the “Working together for inclusion and equality” category, for enabling the visually impaired to experience the world through a combination of a Snapdragon Spaces 6DoF headset and a bespoke real-time machine learning algorithm that turns landscapes into soundscapes. The Up in the Air team, which won the “Spatial Audio” category, introduced an app that turns an everyday working space into a pleasing fantasy-style environment, using Hand Tracking for navigation and Spatial Anchors to anchor the workspace in the user’s environment.


MIT Reality Hack 2023 left a long-lasting impression on all the participants. The opportunity to engage with key partners, industry leaders, and a broad XR developer community brought a new perspective on the use and future development of spatial computing. Our team left inspired by the hackers’ unforgettable collaboration, ideation, and creativity, and will be working on incorporating their valuable product feedback into the next SDK releases.

Snapdragon Spaces Developer team in collaboration with Qualcomm Developer Network

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.

Showcasing XR enterprise solutions at AES

Bringing together the most prominent players in the enterprise space, AES (Augmented Enterprise Summit) took place in October in San Diego. The Snapdragon Spaces team used the opportunity to connect with the audience and show the advancements of our ecosystem developers.

November 11, 2022

The event’s theme this year highlighted how select enterprise players use immersive technologies to enable remote operations, improve training, increase safety, reduce costs, and enhance the customer experience. To kick off the first day of the event, Brian Vogelsang (Sr. Director, XR Product Management, Qualcomm Technologies, Inc.) gave a keynote presentation explaining how Snapdragon technology brings best-in-class connectivity across industries, enabling enterprise solutions built on a decade of innovation in XR.

“Enterprise AR is starting to accelerate from the innovations that we are helping to build to create lightweight and lower power smart glasses and better form factor. We are seeing enterprise VR also starting to take off and are really excited about the possibilities and use cases,” highlighted Vogelsang.

“With Snapdragon Spaces, we are allowing developers to take the applications developed in VR or MR and bring them over to optical see-through glasses. If you are an ISV or IT company investing in augmented reality experiences, you can start working on AR and bring it to a VR device and vice versa. This is a unique thing that has not been done in the industry, and we are excited to enable it.”

Proving those points, the Snapdragon Spaces booth showcased four demos from our partners, representing the full spectrum of XR enterprise solutions – from remote rendering to AR instruction, learning and development, digital collaboration, workflow guidance, and remote assistance. Each company has been successfully using Snapdragon Spaces to streamline the development of their applications and bring value to their users.

Here’s how our partners make it easier to create enterprise solutions that meet business needs:

  • holo|one presented Sphere, a mixed reality collaboration and productivity platform for enterprises that takes away the need for expensive customization. Instead, the platform offers seamless integration of turnkey AR into business processes.

  • Holo-Light is on a mission to build a powerful streaming platform for enterprises to leverage immersive experiences, 3D content and scale AR/VR use cases – all while using Lenovo ThinkReality A3 smart glasses.

  • Arvizio featured their AR Instructor – a no-code remote assistance solution that uses the “see-what-I-see” principle to overlay visual instructions directly into the worker’s field of view. The platform allows displaying various forms of content – from video clips to images, 3D models, and even real-time 3D annotations.

  • VictoryXR presented their solution for immersive education and training, equipping users with an X-ray-like vision to study human anatomy in a virtual environment.

The Augmented Enterprise Summit provided a unique opportunity for developers, leaders, and innovators across verticals to discuss the latest trends and developments in the field of extended reality. We are looking forward to integrating the insights gained into the next iterations of Snapdragon Spaces.

Snapdragon Spaces Developer Team

Snapdragon and Snapdragon Spaces are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

XR and 5G mmWave hackathon challenge in Tampere, Finland

For three days, the Finnish city of Tampere became a hub for university and start-up developers taking part in the first-ever Snapdragon Spaces hackathon. Participants had the chance to use state-of-the-art smart glasses powered by Snapdragon to design new use cases and leverage 5G mmWave connectivity and extended reality (XR).

October 24, 2022

Organized by Qualcomm Europe, the City of Tampere, Elisa Networks, Nokia, CGI, and Ultrahack, the hackathon presented a unique opportunity for local developers to compete and make the most of the rich 5G mmWave data infrastructure of the future “Europe’s fastest city,” including the iconic Nokia Arena. With ten independent teams of developers with various profiles, the event set out multiple goals: first, to enable developers to unlock the capabilities of the fast mmWave network; and second, to let participants apply XR as a new UI medium and use the open data sets the city of Tampere has provided – all while using the Snapdragon Spaces™ SDK and DevKit, consisting of Lenovo ThinkReality A3 smart glasses and a Motorola edge+ smartphone. With just two days and no prior experience with the Snapdragon Spaces SDK, all the teams were able to develop and demonstrate the potential of XR and 5G mmWave through innovative project ideas.

The hackathon started on Friday with onboarding, an opening session, and focused mentoring. On the second day, participants were briefed on the pitch and expected deliverables; afterward, sessions alternated with mentoring, checkpoints, and visits to the Nokia Arena’s 5G area for mmWave testing. Sunday, the event’s final day, was spent at the arena, where participants had a chance to do final testing, get the mentors’ input, and prepare for the pitch and presentation round. The jury consisted of Steve Lukas, Director of XR Product Management at Qualcomm Technologies, Inc., Pekka Sundstorm, Nokia Head of Strategy Execution Europe, Teppo Rantanen, Executive Director at the City of Tampere, and Taavi Mattila, Business Lead at Elisa.

To succeed at the final pitch and demo, participating teams were asked to present concepts that would use the Tampere city data set over fast connectivity (including 5G mmWave), integrate XR, stand out from existing solutions, and bring additional value to the target audience. Among other criteria, teams were also expected to think through ease of deployment and to utilize the Snapdragon Spaces SDK and Hardware Developer Kit. Two of the three winning teams leveraged user-generated content in their use cases, while all of them utilized the open datasets made available by the city of Tampere; Tampere’s 3D “twin city” model was particularly popular. Teams thought through how they would utilize 5G mmWave both for data retrieval and for uploading user-generated and other content at scale in a multi-user, city-wide deployment.

The concept of Tampere xRT revolved around a city-wide, location-based art installation and platform; the team delivered an impressive combination of pitch, idea, and implementation. Whispers of Tampere presented the idea of hiding audio snippets around the city park, allowing people to interact with the city and each other: audio messages can be placed at any city location and then discovered and retrieved by others. The third winner, AR Digital Spaces, presented an idea for dynamic AR showrooms with an immersive, ad-based concept; the jury was impressed by the effective combination of various technologies in one app that promised to make ads more engaging and purposeful. Another two participating teams received honorable mentions: ARenify, for its promise to transform sports experiences, and Public Commuter, for a concept aimed at making the city transport network more accessible and inclusive.

All the winning concepts involved integration with XR, cloud-hosted UGC, and fast connectivity (including 5G mmWave) for at-scale, city-wide use, as well as audio recording features on Android, Azure Kinect, and web-based portals for managing content. As the first of many Snapdragon Spaces hackathons to come, the event provided an immense learning experience for all parties in co-designing for the upcoming metaverse era. The City of Tampere is already exploring the possibility of leveraging some of the ideas and teams as part of its many smart city activities. At the same time, the project participants plan to apply for European funding to help expand the 5G mmWave network beyond the arena and explore bringing the hackathon ideas to the citizens of Tampere.

Snapdragon Spaces Developer team in collaboration with Qualcomm Europe

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.

T-Mobile accelerator x Snapdragon Spaces program launch

Startups, developers and their mentors from T-Mobile and Qualcomm Technologies gathered for a day full of insights, exchange, and knowledge transfer.

October 1, 2022

Following the announcement earlier this year, the T-Mobile Accelerator kicked off to enable select developers and startups to create 5G headworn AR experiences and leverage the Snapdragon Spaces™ XR Developer Platform. With the support of experts from T-Mobile, Deutsche Telekom, and Qualcomm Technologies, six accelerator participants spent weeks building immersive consumer AR experiences for smart glasses across gaming, entertainment, education, travel, and other industries.

The event started with a welcome coffee and opening speeches from Scott Jacka (T-Mobile Accelerator), Kerry Baker (Tech Experience), and Kathy Braegger (Qualcomm Technologies, Inc.). Participants were then invited to join deep-dive sessions where Steve Lukas (Qualcomm Technologies, Inc.) guided them through the platform and answered questions.

After the lunch break, the deep dives continued with Grace Hsu (Mixed Reality, Microsoft) and Finn Sinclair providing an overview of MRTK3 and hand tracking.

Entrepreneurs and developers then had a chance to meet their mentors face to face and introduce their companies and AR concepts to the tech leads. Among the participants:

We are excited to welcome new developers into the Snapdragon Spaces ecosystem and work together with our partners at T-Mobile US and Deutsche Telekom to support and facilitate the creation of next-generation solutions.

Snapdragon Spaces Developer Team

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.

Expanding Snapdragon Spaces platform support to include MR and VR

September 27, 2022

When we launched Snapdragon Spaces at AWE in June 2022, we set out to create a platform and ecosystem for headworn augmented reality that empowers developers to create immersive AR experiences built on open standards. Inspired by the excitement and innovation we’ve seen from you all building with Snapdragon Spaces worldwide, we are proud to be working with this innovative community shaping the future of the medium. While it’s still very early in the journey to a world where spatial computing is interwoven in our daily lives, the pace is accelerating each day, and Qualcomm is more committed than ever to making this our collective reality.

Snapdragon Spaces is evolving quickly: with eight releases so far in 2022, we are bringing new features every six weeks, like Hand Tracking, persistent Local Anchors, and Hit Testing. We really value the feedback from the developer community and want to hear your ideas – so please continue to share your thoughts and upvote features on the roadmap. We have also launched a Discord server for the community to connect and share. Feel free to stop by — we would love to learn more about you and your work in XR.

While we started our work with Snapdragon Spaces in augmented reality, our team knew that XR was evolving rapidly: virtual and augmented reality devices are beginning to converge as video passthrough, or mixed reality (MR), features come to VR headsets. Mixed reality enables experiences that blend augmented and virtual reality by adding stereo color cameras to a virtual reality headset and passing the video from those cameras through to the VR displays. This is a difficult problem to solve, and we have spent multiple years optimizing the camera and video image processing pipelines in Snapdragon-based XR platforms to enable the highest-quality and lowest-latency mixed reality experiences.

For developers, mixed reality means that they will be able to create experiences on VR headsets that are much more like the AR experiences on optical see-through devices such as the Lenovo ThinkReality A3 AR glasses in the Snapdragon Spaces Dev Kit. In fact, we believe that developers should be able to have portability between the two. We are excited to share that Snapdragon Spaces will be adding support for mixed and virtual reality in early 2023. Our vision is to allow you as a developer to build mixed reality experiences on Snapdragon Spaces for VR/MR headsets and bring the same application to AR glasses or vice versa.

This will enable developers building Snapdragon Spaces experiences on augmented reality glasses to expand the number of devices they can reach with their app to include VR headsets with mixed reality. It will also allow VR developers who start experimenting with mixed reality to run the same application on augmented reality glasses. We believe this is an industry first and we are excited to be enabling this capability for Snapdragon Spaces developers soon.

Look out for more details about the VR/MR headsets we will be supporting with Snapdragon Spaces in the coming months, as well as information about how and when you can get started. To stay informed (if you are not already subscribed), please join our mailing list and Discord server. Thank you for your enthusiasm for XR, support for Snapdragon Spaces, and commitment to supporting open platforms and ecosystems in spatial computing.

Cheers,
Brian Vogelsang
Sr. Director, XR Product Management, Qualcomm Technologies, Inc.

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.

Introducing Hand Tracking

The Hand Tracking and Gesture Recognition feature, available in the latest Snapdragon Spaces SDK 0.6.1 release, offers a natural method of user interaction to power advanced headworn AR experiences.

August 3, 2022

Hand Tracking and Gesture Recognition can introduce natural movement to human-computer interaction, greatly enriching the augmented reality experience for end users. Thanks to existing cameras on AR glasses, hand and finger positions are detected, recognized, and tracked in 3D space where they can then interact with AR content. The ability to use hands to interact directly with digital objects creates an intuitive experience and allows for deeper user engagement, removing the need for less intuitive controllers.

An early version of this feature is available via the Snapdragon Spaces Unity SDK. Developers can now incorporate Hand Tracking and gestures directly into their AR applications.
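As a rough illustration of what consuming hand data can look like in Unity, here is a sketch that reads index-finger joints through Unity's generic XR input APIs. This is hedged: it assumes the active runtime exposes hand data via these APIs; the Snapdragon Spaces plugin ships its own Hand Tracking components, and the Integration Guide referenced below covers the supported path.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Reads the index-finger bones of the right hand each frame via Unity's
// generic XR input API, when the runtime exposes hand data this way.
public class HandJointReader : MonoBehaviour
{
    private readonly List<Bone> indexBones = new List<Bone>();

    private void Update()
    {
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!rightHand.isValid) return;

        if (rightHand.TryGetFeatureValue(CommonUsages.handData, out Hand hand) &&
            hand.TryGetFingerBones(HandFinger.Index, indexBones))
        {
            foreach (Bone bone in indexBones)
            {
                if (bone.TryGetPosition(out Vector3 position))
                {
                    // Use the joint position to drive interactions, e.g. poke a button.
                    Debug.Log($"Index finger bone at {position}");
                }
            }
        }
    }
}
```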

Hand Tracking: where and when to use it?

As we move from a heads-down to a heads-up paradigm for using digital content, it’s important to consider user comfort, utility, and usability as cornerstones when designing augmented reality apps. Hand Tracking provides natural interactions that appeal to many users, making it a flexible component of truly immersive experiences.

Why is Hand Tracking so well received by users? The answer is simple: seeing your own hands reduces cognitive load, helping you locate your position in space. This, in turn, leads to a shallower learning curve and allows you to interact with both digital and physical surroundings in a way that feels natural.

“We envision a future where AR glasses are ubiquitous and people carry them around easily, without the need for motion controllers. For that to happen, the control mechanics need to feel natural and pleasant to use” – says Javier Davalos, founder at OverlayMR and lead at Figmin XR.

“The Figmin XR team has been working on this challenge for a long time, and we are happy to report that we have succeeded in our quest. For us, implementing hand tracking was a huge challenge, not only did we have to design the control mechanics from scratch, but also drive all the functionality from the tracking points that the system provides. Eventually, the effort was totally worth it.”

“Snapdragon Spaces developers will have a much easier time since much of this will be provided by the SDK. Controlling Figmin XR with just your hands feels magical and completely natural, so much so that at this point we prefer using it over motion controllers.”

Use Cases

Hand Tracking and Gesture Recognition are applicable in a multitude of use cases, from industrial settings to gaming. Some examples include:

  • Simplifying interfaces and workflow navigation in industrial settings and for enterprises
  • Enhancing AR tabletop gaming through natural interactions with nearby virtual objects
  • Boosting virtual collaboration when interacting with 3D models and prototypes in engineering, design, AEC and sales
  • Driving engagement and conversion rates in E-commerce and showrooms (think digital try-ons)
  • Lowering the interaction barrier while improving the service quality in healthcare and telehealth

Best Practices

Now that you are more familiar with the benefits of Snapdragon Spaces Hand Tracking, where can you learn more to start implementing the feature in your AR app? Head over to our documentation to find:

Ready to implement Hand Tracking using the Unity engine?

Refer to our Hand Tracking Integration Guide and use the Basic and Extended samples to fully leverage the Hand Tracking components inside the Snapdragon Spaces plugin.

From gaming to enterprise and beyond, Hand Tracking and Gesture Recognition are powering some of the most engaging AR experiences. Being a truly versatile feature, Hand Tracking allows AR developers to easily create new and memorable user experiences.

Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.