Mixed reality for architecture and real estate by MSM.digital and mixed.world
MSM.digital and mixed.world combine design, branding and technology to take customer engagement to new levels. They collaborate on large-scale projects, including experiential marketing, AR/VR, training and content creation in architecture and real estate.
When the design and branding expertise at MSM.digital meets the multi-user experience at mixed.world, customer engagement moves to a new level. Together, the companies have developed Virtual Places, a mixed-reality experience that displays a 3D map on a table to users in different locations. In each location, people wearing AR glasses share the experience of touring, discussing and interacting with the map in real time.
Super-powers for architects and real estate agents
A high-profile application of Virtual Places is the companies’ joint Wilhelm Project, which uses the Snapdragon Spaces™ XR Developer Platform, the Microsoft Bing Maps SDK and highly detailed maps of downtown Berlin, Germany. Virtual Places makes it possible to see and tour buildings in 3D – months before they have been built.
“We received the computer-aided design (CAD) files of the building plans from the architects,” says Fabian Quosdorf, Managing Director of mixed.world. “Then, together with MSM.digital, we created a real-time, 3D representation and integrated it into the map at the exact GPS location and position the building will occupy. With all the context and surroundings, Virtual Places offers a much better impression of how the building will eventually look. Plus, it gives super-powers to architects and real estate agents because prospective occupants and investors can not only see the exterior but also tour the interior interactively.”
In the Wilhelm Project, users can inspect specific aspects of the building, story by story, in a transparent, holographic view. The virtual tour includes a penthouse apartment on the seventh floor, with the furnishings, environment and details that make it interesting and special. “Virtual Places adds a new layer of interactivity and even creativity to the traditional sales process,” says Quosdorf. “Imagine multiple projects besides Wilhelm, elsewhere in the world, all integrated into a single experience. Architects and real estate agents could use it for multi-user presentations of their virtual portfolio.”
Creating a real-world tour of a building using Snapdragon Spaces
mixed.world uses the Dual Render Fusion feature as the foundation of all the applications they build on Snapdragon Spaces. That allows users wearing connected AR glasses to see the spatial map in real time in full 3D. At the same time, users navigate the world they see in the glasses with typical gestures – like multi-touch, pinch-and-zoom and rotation with two fingers – on a tethered device powered by Snapdragon® technology.
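The touch gestures described above reduce to simple 2D math. As a rough illustration in plain Python (the `pinch_update` helper is hypothetical and not part of the Snapdragon Spaces SDK), a two-finger gesture can drive the map’s scale and rotation like this:

```python
import math

# Illustrative sketch of pinch-and-zoom and two-finger rotation: the map's
# scale follows the ratio of finger distances, and its heading follows the
# change in the angle between the two fingers.

def pinch_update(p1_old, p2_old, p1_new, p2_new):
    """Return (scale_factor, rotation_degrees) for a two-finger gesture."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = math.degrees(angle(p1_new, p2_new) - angle(p1_old, p2_old))
    return scale, rotation

# Fingers spread from 100 px apart to 200 px apart: the map doubles in size.
print(pinch_update((0, 0), (100, 0), (0, 0), (200, 0)))  # → (2.0, 0.0)
```

In a real app these deltas would be applied each frame to the transform of the 3D map rendered in the glasses.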
To improve the multi-user experience, mixed.world uses Local Spatial Anchors so that everyone sees the same content anchored at the same position and location. The company also built a cloud-based version of those anchors, allowing devices in different locations to anchor to the same virtual table at the same position. With Camera Frame Access, the company has enabled image-based cube tracking, which uses a paper cube that customers can later set on a table and track with the glasses, extending customer engagement.
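Conceptually, cloud-hosted anchors work because every device resolves the same anchor ID to the same pose and then places content relative to that anchor. The sketch below illustrates the idea in plain Python; the `CloudAnchorStore` class and `anchor_to_world` helper are invented for illustration and are not the Snapdragon Spaces API:

```python
import math
from dataclasses import dataclass

# Hypothetical illustration: a "host" device publishes an anchor pose to a
# shared store; remote devices resolve the anchor and express content
# positions relative to it, so everyone sees the same layout.

@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw_deg: float  # rotation about the vertical axis, in degrees

class CloudAnchorStore:
    """Stand-in for a cloud anchor service: maps anchor IDs to poses."""
    def __init__(self):
        self._anchors = {}

    def host(self, anchor_id: str, pose: Pose) -> None:
        self._anchors[anchor_id] = pose

    def resolve(self, anchor_id: str) -> Pose:
        return self._anchors[anchor_id]

def anchor_to_world(anchor: Pose, local: tuple) -> tuple:
    """Transform a point from anchor-local coordinates into the device's
    world frame (rotate by the anchor's yaw, then translate)."""
    rad = math.radians(anchor.yaw_deg)
    lx, ly, lz = local
    wx = anchor.x + lx * math.cos(rad) - lz * math.sin(rad)
    wz = anchor.z + lx * math.sin(rad) + lz * math.cos(rad)
    return (wx, anchor.y + ly, wz)

# Device A hosts an anchor at the virtual table's position.
store = CloudAnchorStore()
store.host("virtual-table", Pose(x=1.0, y=0.8, z=2.0, yaw_deg=90.0))

# Device B (possibly in another city) resolves it and places a building
# model half a meter "forward" of the anchor, in the anchor's own frame.
anchor = store.resolve("virtual-table")
print(anchor_to_world(anchor, (0.0, 0.0, 0.5)))  # same point on every device
```

The real service also has to handle persistence and relocalization against the physical environment; this sketch only shows the coordinate-frame idea.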
“For user interactions, we use the phone screen as a Companion Controller, with full Dual Render Fusion. When users want to move between floors, we give them the option of switching from using the phone as a controller to Hand Tracking,” says Quosdorf.
Snapdragon Spaces Dual Render Fusion allows mixed.world to render on the phone any content that users would normally see in a mobile app. And, they can stay entirely in the Unity environment, working with designers on engaging user experiences directly for the phone screen.
Contributing to the developer community
MSM.digital and mixed.world found new possibilities in Snapdragon Spaces, took advantage of them and then contributed the resulting features, Ultra-Sync and View-Port, to the developer community. “Ultra-Sync is the shared user experience at the heart of Virtual Places,” says Camillo Stark, Deputy Managing Director of MSM.digital. “It allows developers to incorporate multi-user sharing locally and remotely, with excellent performance. This is its debut in a Snapdragon demo.”
Another innovative feature, View-Port, gives developers the ability to view a shared virtual experience from an external camera and film it live. “Developers often have to work in the alter-ego perspective,” says Stark. “Headset cameras don’t always have great resolution, so the virtual content doesn’t look good when you’re inside it. Or, if you move your head too fast for the camera to follow, the content appears distorted and you lose a lot of the story. View-Port is the outside point of view you need to see and understand the content. It’s a supplemental feature you can easily download onto a smartphone, then film the experience and capture it for events and showcases.” The two companies have made Ultra-Sync and View-Port available for any Snapdragon Spaces developer to adopt.
MR experience highlights:
- Options for user interaction. Users can switch between Dual Render Fusion with the tethered device and Hand Tracking with gestures.
- Giving back to the developer community. MSM.digital and mixed.world have contributed innovative features that deepen the shared, virtual experience for users.
- Enabling virtual, multi-user tours. Virtual Places facilitates business integration and shared user engagement beyond geographical barriers.
“The Wilhelm Project experience built on Snapdragon Spaces offers much more than traditional marketing. We sent AR glasses to potential customers in other countries so that a salesperson in Germany could give them a remote real estate tour. This use of technology builds excitement and makes the sales process portable.”
– Camillo Stark, Deputy Managing Director, MSM.digital
“The work with MSM.digital and their partners was outstanding, and the opportunities that such an immersive solution provides are truly untapped.”
– Matteo Twerenbold, Managing Director, Adler Group
AI-based fitness AR routine with Litesport
Using Snapdragon Spaces™ XR Developer Platform capabilities, Litesport created an AI-based fitness application demo for Snapdragon Summit. The demo combined augmented reality, biometric feedback, and AI pose detection algorithms powered by ASENSEI. The cross-device AR experience takes advantage of a smartphone, a smartwatch and AR glasses ecosystem, all powered by Snapdragon® chipsets.
Litesport develops fitness applications that use virtual reality, mixed reality and augmented reality to enhance the way people exercise and stay active. Their Litesport application combines virtual reality with real workout modalities like Strength Training, Bootcamp and Boxing. The company’s immersive, interactive and personalized approach to fitness consistently merges the physical and virtual worlds.
Building a cross-device fitness app
Litesport saw the opportunity to create an artificial intelligence (AI) based fitness routine that would capitalize on the advantages of a smartphone, a smartwatch and AR glasses, leveraging the full ecosystem of devices powered by Snapdragon.
As members of the Snapdragon Spaces Pathfinder program, they worked with Qualcomm Technologies, Inc. to configure a cross-device solution powered by the Snapdragon family of mobile processors:
- The smartphone is a reference device built on the Snapdragon 8 Gen 3 Mobile Platform. The phone runs software from ASENSEI that processes AI-based body pose data and provides real-time feedback and corrective actions.
- The smartwatch, powered by the Snapdragon W5+ Gen 1 Wearable Platform, gathers the user’s biometric data in real time and transmits it to the smartphone for consolidation.
- The wireless AR glasses run the Snapdragon AR2 Gen 1 Platform, adding the visual dimension of a personal trainer guiding the session.
- Hand Tracking for natural interaction. The experience uses the Hand Tracking feature to let users navigate the UI and menus.
- Plane Detection and Positional Tracking for an experience grounded in the real environment. Plane Detection realistically anchors the virtual trainer to the floor, while Positional Tracking fixes the trainer and UI in place relative to the world around the user.
- Dual Render Fusion made it possible to use multiple displays and run two different UI flows on the phone screen and the glasses simultaneously.
The result is an immersive, guided AR workout, where the user does not just passively watch content but gets real-time feedback on their technique — just like in a real-life training session. The AI algorithm behind the experience leverages the smartphone camera (pointed at the user) to recognize the person’s movements and show them back to the user during the session, enabling real-time contextual trainer feedback. As a result, users can ensure their form is correct and maximize their workout. Thanks to the connected smartwatch, the application also has access to biometric data, providing a comprehensive view of workout performance as well as personalized reminders and workout suggestions.
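To illustrate the kind of feedback loop described above, the sketch below computes a joint angle from 2D pose keypoints and maps it to a coaching cue. This is a generic Python illustration; the function names and angle thresholds are assumptions, not ASENSEI’s actual algorithm:

```python
import math

# Hedged sketch of pose-based form feedback: given 2D keypoints from a
# pose-detection model, compute the knee angle and turn it into a cue.
# The 140/70 degree thresholds are illustrative assumptions.

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cosang = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cosang))

def squat_cue(hip, knee, ankle):
    """Map the knee angle to a simple coaching cue."""
    angle = joint_angle(hip, knee, ankle)
    if angle > 140:
        return "Go deeper in your squat"
    if angle < 70:
        return "Don't go quite so low"
    return "Good depth - hold your form"

# Keypoints in normalized image coordinates (x, y), y increasing downward.
# Hip, knee and ankle in a straight vertical line = standing tall.
print(squat_cue(hip=(0.50, 0.40), knee=(0.50, 0.60), ankle=(0.50, 0.80)))
```

A production system would run this per frame on full-body keypoints and smooth the angles over time before issuing a cue.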
Developing on Snapdragon Spaces
To build as much interactivity as possible into their AR application, Litesport worked with the Snapdragon Spaces XR Developer Platform. The platform not only offers a stack of perception features that help developers create XR experiences, but also provides a single SDK with the functions and APIs needed for AR glasses. Litesport designed and built their AR application on the wired Lenovo ThinkReality A3. Then, because Snapdragon Spaces unifies development on AR glasses, they were easily able to conduct their demo at the Snapdragon Summit using wireless reference-design glasses that support Snapdragon Spaces and Snapdragon AR2 technology. Having written only a small amount of new code, Litesport was pleased to find that the reference-design glasses supported their app just as they had designed it for the A3.
“The fact that Snapdragon enables experiences built across devices, is huge”, says Jeffrey Morin, CEO and co-founder of Litesport. “It’s going to allow us to deploy on any new device that will come out in the future. With processing happening on the phone, devices can become increasingly lightweight, which is crucial for connected fitness.”
“By getting accurate form tracking and correction through the Asensei software on the phone, we were able to create a ‘choose your own adventure’ style experience, where the trainer gave different coaching tips based on what you did live,” adds Morin.
“For example, encouraging you to go deeper in your squat, or to be mindful of leaning too far in one direction while doing lunges. This creates an AR fitness experience that is more immersive, personalized and helpful to the user, like live 1:1 personal coaching, rather than just watching a single, standard, pre-canned workout. The AR glasses provide an unparalleled, real and clear representation of your personal space, and so the experience of seeing a live trainer in front of you is more magical, immersive and impressive, not to mention the form factor is much more comfortable and practical for working out.”
To promote cross-device experiences like Litesport’s, Snapdragon Spaces has begun working on a new initiative called Snapdragon Seamless™, which combines the respective technical advantages of smartphones, AR glasses and wearables.
AR experience highlights:
“In less than six months, we were able to build a functional demo that delighted the audience at Snapdragon Summit. The experience is designed to showcase the next-gen chip of a device that has yet to be released. We worked with a new partner, ASENSEI, and were able to put all the components together on a platform we had never used before, thanks to the support we got from the Snapdragon Spaces team.”
– Jeffrey Morin, CEO and co-founder of Litesport
Workplace soft skills XR training with Talespin
Talespin created a spatial computing platform for talent development and skills alignment for the future of work. The company’s no-code content authoring tool, CoPilot Designer, enables anyone to create VR training content that simulates conversational role play with virtual humans to build soft skills. Talespin accelerates learning, makes learners more confident and increases knowledge retention.
Talespin’s proprietary platform offers an enterprise solution for creating immersive learning content, distributing it and measuring its effectiveness. With a view to the future of work, Talespin enables a personalized, immersive approach to learning so people can explore career paths and learn critical workplace skills that are essential to both businesses and workers. The company’s products include a content authoring tool powered by generative AI, a vast library of ready-made soft skill VR training content for enterprises, a consumer XR learning app, and a skills analytics dashboard for measuring training progress in real time.
Immersive learning around soft skills
Talespin has studied the way that companies train people to work, and how that has changed. Remote and hybrid workforces are here to stay, but there are many training needs that have traditionally been addressed face-to-face, whether in the office or by traveling. As the workplace depends less on in-person transactions, companies turn to spatial computing to handle training events and modes remotely.
Talespin was founded with a focus on AI and automation. Its principals began to see that automation can perform task-based work, but it can’t communicate ideas, offer constructive criticism or be a team player. As companies automate more, the need for certain workforce skills, such as building rapport, resolving conflict and negotiating compromise, becomes more acute. Talespin thought that one way to train people at the scale needed today was through immersive, interactive training with a foundation in virtual technology.
Understanding different viewpoints
“Talespin wraps a powerful learning paradigm around the skills most in need,” says Kyle Jackson, CEO of Talespin. “Building virtual reality into training requires us to put the power of creation into the hands of the business, but nobody has been doing that. VR has been in the hands of skilled practitioners who aren’t familiar with the challenges of the business. So our platform provides VR and content tools to businesses so they can manage their own training needs.”
Naturally, it’s not easy to train people on emotional intelligence and soft skills virtually in an immersive setting. “We aim to address this,” says Jackson, “with a series of learning events centered around role playing that leads to a shift in mindset.
For example, if you want to train an employee in communication and collaboration, you need to put yourself in the other person’s shoes, which is hard to accomplish. But the essence of soft skills is that you’re trying to close the divide between viewpoints. People understand things differently, and our tools help companies educate their workers on how to help close the gap between different viewpoints.”
Talespin’s approach appeals to enterprise customers who understand that the difference in viewpoints requires better skills in finding common ground. But once employees get a handle on those soft skills, they can communicate through problems more effectively.
No-code content creation tool for VR training
Talespin built its platform around the requirements of standard tools used in the learning industry. It added a “world-building mindset,” in which learning designers answer questions like “How do I display this?”, “What am I simulating?”, “What’s the space?” and “How do I represent that?” Those concepts have existed in film, television and gaming for a long time, but not in broadly distributed, enterprise-led functions. Jackson sees most of the upskilling in that mindset, not in the ability to use a software program.
“Our tools are simple,” says Jackson. “We’ve aimed them at learning designers and people who are familiar with their business, but not with 3D, development, coding, animation or extended reality. We’ve pushed those tools into the background and automated them. Because we provide the foundational systems, learning designers can simply describe the problem they’re trying to solve. They know the context of the business, the world it exists in and their ideal outcome. And our platform helps them build a training module. It will cast the virtual humans, write their dialogue and offer decision choices. It will build that around the world they know, so they can review it and publish it.”
Using Snapdragon Spaces for VR
Talespin had followed Qualcomm Technologies’ progress in publishing reference designs of AR and VR hardware. When the Snapdragon Spaces™ XR Developer Platform launched, Talespin investigated it as a way of standardizing and simplifying the development of content and platforms in spatial computing. Simultaneously, Qualcomm Technologies surveyed the landscape and identified Talespin as a valuable player. Building their experience with the Snapdragon Spaces SDK, the company experimented with the Image Tracking, Local Spatial Anchors and Spatial Meshing features.
As more hardware providers emerge, enterprises ask for more types of devices. That means more hardware for platform builders and developers to support. To Talespin, the value of Snapdragon Spaces is that it helps them address that trend by easily supporting hardware like the Lenovo ThinkReality VRX.
VR experience highlights:
- Enabling virtual training in soft skills. Helping company employees effectively resolve conflict and improve communication and collaboration.
- Developing VR training content. Talespin’s platform makes it easy for anyone to develop interactive, VR training for critical workforce skills.
- Focusing on content development instead of technology. Talespin focuses on greatly reducing the skills and resources needed to create XR content, making it possible for anyone to rapidly create XR thanks to no-code and AI-powered tools.
“Our engineering team found that there’s a strong, collaborative community around Snapdragon Spaces. They were able to quickly evaluate the cost-benefit trade-offs and decide which features to support.”
– Kyle Jackson, CEO, Talespin Reality Labs
Increasing the Efficiency of Enterprise Training with Scope AR
Scope AR is dedicated to making anyone an instant expert. WorkLink, its augmented reality knowledge platform for work instructions and remote assistance, addresses critical business needs by significantly reducing training and ramp time, error rates and rework. To make enterprise technology training more efficient and effective, Scope AR sees the solution in augmented reality and its potential for better retention through better interactivity.
(c) Scope AR
Scope AR has created WorkLink, an end-to-end content authoring platform for increasing the efficiency of enterprise-level technology training. WorkLink allows users to easily drag and drop 3D models of objects like machines into a web browser, then annotate them with instructions and complex animations. The content is published to end user devices, instantly sending virtual, context-specific guidance to hundreds or thousands of workers in manufacturing, maintenance, and field service.
Delivering information into the workforce more efficiently
Mobile devices and processing power have made traditional modes of training – rote memorization, classroom sessions, study guides – less appealing and less effective than AR-driven, on-the-job training. With a pair of AR glasses, a trainee can interactively learn through visualization whatever would appear in a printed manual, and experience it in real time. Examples include how to open a machine, where to hold a tool, what to avoid, how to troubleshoot error codes and how to test after reassembly.
With WorkLink, Scope AR addresses two main approaches to enterprise training. The first is on-device training, in which trainees wearing AR glasses work on a piece of real-world equipment on a surface or workbench right in front of them. WorkLink overlays 3D models of virtual parts on the equipment so that trainees follow, step by step, how to perform a given task (such as disassembly, removal or repair). The second approach is virtual training, in which trainees use AR glasses to see and interact with a digital model of the equipment, displayed in virtual space. In addition to the advantages for service and maintenance use cases, the virtual approach is valuable in sales training. Companies can emphasize and conduct in-depth training for sales teams on a digital representation of expensive, fragile or inaccessible equipment. In all approaches, the trainees see animations, text, arrows, images and other use-case relevant indicators to guide them. The training content appears in the AR glasses as if the equipment was annotated in mid-air.
Unifying device adoption with Snapdragon Spaces
“Devices are the biggest barrier to entry in AR,” says Scott Montgomerie, founder and CEO of Scope AR. “We’ve been at this for more than ten years, and we’re always looking for good hardware and a good user experience. Snapdragon Spaces allows us to write to a single SDK that unifies all the APIs and provides everything we need for a robust pair of AR glasses. For us that’s really exciting.”

The user experience on the device is where most of the work is visible. Montgomerie says that’s just the tip of the iceberg. Below the water are elements like data management, content authoring, revision management, scalability, encryption and user management. Snapdragon Spaces makes it easier to meet these critical enterprise-grade standards.

“Once we were convinced to go with Snapdragon Spaces, the effort required to port the application was pretty light,” says Montgomerie. “We had to upgrade Unity for compliance with OpenXR and AR Foundation. But compared to the amount of work we’d previously put in to support and maintain new hardware, it was very pleasant. And the fact that we already supported standards like Mixed Reality Toolkit meant it was fairly easy to port.”
How enterprise AR is evolving
WorkLink incorporates most of the features in the Snapdragon Spaces Extended Reality SDK, with particular emphasis on these for enterprise technology training:
- Plane Detection. When users work with a virtual model of a piece of equipment, they want it resting on a surface, not floating in mid-air; this feature helps anchor the content to a physical surface.
- Dual Render Fusion. WorkLink takes advantage of the processing power in a tethered mobile device, which makes for deeper interaction and a better training experience in the virtual space.
- Hand Tracking. Helps users interact with the instructions or virtual machine parts using natural gestures.
- Positional Tracking. WorkLink tracks the user’s position so it can render training content in the scene relative to user location, head position and orientation.
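The geometry behind the Plane Detection use case above can be sketched simply: given a detected plane, virtual content is projected onto it so that it rests on the surface. The following Python snippet is an illustrative sketch, not the Snapdragon Spaces or WorkLink API:

```python
# Hypothetical illustration of snapping content to a detected plane: given
# a point on the plane plus its unit normal, orthogonally project a piece
# of virtual content onto the plane so it rests on the surface instead of
# floating in mid-air.

def project_onto_plane(point, plane_point, plane_normal):
    """Orthogonally project `point` onto the plane defined by
    `plane_point` and unit `plane_normal` (all 3-tuples)."""
    d = sum((point[i] - plane_point[i]) * plane_normal[i] for i in range(3))
    return tuple(point[i] - d * plane_normal[i] for i in range(3))

# A workbench surface detected at height y = 0.9 m (normal pointing up).
bench_point, bench_normal = (0.0, 0.9, 0.0), (0.0, 1.0, 0.0)

# Content the user released 30 cm above the bench snaps down onto it;
# only the height changes, the horizontal position is preserved.
print(project_onto_plane((0.4, 1.2, -0.7), bench_point, bench_normal))
```

The same projection works for walls or ramps, since the plane is described by a point and a normal rather than by an axis-aligned height.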
Scope AR believes that Snapdragon Spaces gives hardware manufacturers the chance to focus on what they’re good at: building innovative, differentiated hardware and new choices for AR glasses. And, Qualcomm’s platform will allow manufacturers to do that without imposing more engineering overhead on the AR development community. Montgomerie sees enterprise AR evolving to span three main use cases: field service, manufacturing and learning. Each has its own ideal device parameters like size, weight, ruggedness, battery life and computing power.
“The mobile devices you’re tethered to keep getting more powerful and more differentiated,” says Montgomerie. “We find that customers value the ability to buy the glasses once and continuously upgrade the phone. That’s a big opportunity for us to write to the Snapdragon Spaces SDK and have many different options that cater to different use cases and hardware.”
AR experience highlights:
- Enterprise training, field service and manufacturing. Scope AR delivers knowledge to enterprises through augmented reality.
- On-device and virtual training. The WorkLink content development platform is designed to display instructions through indicators like animations, text, arrows and images on real-world machines and in virtual spaces.
- New, more engaging model of enterprise training. Augmented reality supports Scope AR’s vision of better, faster training to hundreds or thousands of users, with better engagement and knowledge retention.
“Snapdragon Spaces offers a uniform API. When we build on it, a potentially large range of AR devices will run out of the box. The platform removes a huge burden of adoption and ongoing maintenance from our engineering teams. In addition, Snapdragon Spaces includes many pre-made widgets and features that we’d otherwise have had to code ourselves.”
– Scott Montgomerie, Founder and CEO of Scope AR
Game development across VR and AR with Survios
Survios produces and publishes engaging, multi-platform games across popular, licensed franchises and its own original properties. The company is expanding its game development from VR into AR, bringing more features and deeper engagement to its interactive experiences.
(c) Puzzle Bobble Tech Demo
Originally a hardware company, Survios pivoted to software and VR, then developed Raw Data, which became hugely popular on Steam. They redoubled their efforts in VR with innovations like sprint vectors and fluid locomotion to improve users’ control over movement. The company has worked with a wide range of fan-favorite intellectual property, including franchises such as The Walking Dead, Westworld, Creed and Rocky.
Expanding into the new frontier of AR
In looking at technology trends and user preferences, Survios began to explore incorporating AR into their game development, with all the questions that would entail. “First of all, in AR, there’s no lead technology, hardware-wise,” says Mike Domaguing, Senior Vice President of Partnerships and Publishing at Survios. “There are hopes and dreams, but until developers have hardware in their hands, realizing those hopes and dreams is hard. Then, there are gameplay considerations with AR. Players want to be inside the game physically, but because of hardware limitations, we have to give them the ‘looking-down’ perspective on the game. Finally, there’s the question of where AR is going. Will it be PC- or mobile-oriented? That’s as much a business decision as a technical decision.” In short, Survios felt that the future of AR, especially in gaming, is very bright – although not without challenges.
Taking Snapdragon Spaces into AR game development
The company’s collaboration with Qualcomm Technologies became a strong source of encouragement for their AR initiative. Survios began working with the Snapdragon Spaces™ Extended Reality SDK and decided to use it for Puzzle Bobble 3D: Vacation Odyssey. They have built a compelling game experience around the tech stack of Snapdragon Spaces, OpenXR and Unreal Engine. “In developing Puzzle Bobble for AR, we used several features of Snapdragon Spaces,” says Alex Silkin, Chief Technology Officer and co-founder of Survios. “That includes Plane Detection and Hit Testing for locating a suitable surface to anchor the play area. Then we use Local Spatial Anchors to attach the play area to the real-world surface and maintain spatial consistency.”
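The Hit Testing step Silkin describes amounts to casting a ray from the headset and intersecting it with a detected surface. The sketch below shows that geometry for a horizontal plane in plain Python; the `hit_test` helper is hypothetical, not the Snapdragon Spaces API:

```python
# Illustrative sketch of the geometry behind Hit Testing: cast a ray from
# the headset and find where it meets a detected horizontal plane, giving
# a point where the play area can be anchored.

def hit_test(ray_origin, ray_dir, plane_y):
    """Intersect a ray with the horizontal plane y = plane_y.
    Returns the hit point, or None if the ray never reaches the plane."""
    if abs(ray_dir[1]) < 1e-9:
        return None  # ray parallel to the plane
    t = (plane_y - ray_origin[1]) / ray_dir[1]
    if t <= 0:
        return None  # plane is behind the viewer
    return tuple(ray_origin[i] + t * ray_dir[i] for i in range(3))

# Headset at eye height 1.6 m looking down and forward at the floor (y = 0).
print(hit_test((0.0, 1.6, 0.0), (0.0, -1.0, 1.0), 0.0))
# → (0.0, 0.0, 1.6): anchor the play area 1.6 m in front of the player
```

Once the hit point is found, a spatial anchor placed there keeps the play area fixed to that real-world spot as the player moves.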
For user interaction, the game takes advantage of the Hand Tracking feature so that players can manually position the play area where they want it. “Players can touch bubbles both in the menu and within the game to make selections,” says Silkin. “That makes for an intuitive, engaging user experience. We used Pinch gesture detection so players can confirm their choices with a simple hand gesture. Then, by pressing on the touch screen, players make the character Bob throw bubbles in the game. For that we used the Companion Controller Android application as a way to press buttons quickly and easily.”
Making the most of XR development
Survios has these tips for developers who are new to XR:
- Consider the Companion Controller as an additional form of input and output that is unique to Android. Survios experimented only briefly with using the phone as an input device but believes there is ample opportunity for using the phone display as an additional user interface.
- Use the player’s head direction to aim character orientation. The studio found it as intuitive as Hand Tracking and less cumbersome than obliging players to keep their hands up in the playing area all the time.
- Have a way – other than installing on the hardware – to quickly develop and test your core application mechanics. An example is to test core functions on a simple PC with a mouse and keyboard.
AR experience highlights:
- Developing more compelling games. Survios sees AR as an avenue for deeper engagement and better storytelling.
- Making the transition into AR. Survios is keen to extend its technical reach to more areas of extended reality.
- Enabling content for more devices. Snapdragon Spaces is Survios’ path to reducing friction in moving between AR and VR and bringing more enjoyment to players.
“We enjoy developing in Snapdragon Spaces because Qualcomm Technologies is committed to innovation and technology. They listen to developers, deliver features that developers want and give developers a good tour of those features. We are working on a future project to enable creators to bring amazing experiences to fruition – that’s why our work with Snapdragon Spaces is important to us.”
– Mike Domaguing, SVP Partnerships and Publishing, Survios
Democratized XR and digital twins for frontline workers by DataMesh
DataMesh’s Director is an enterprise platform for creating digital twin content. Using the Snapdragon Spaces™ XR Developer Platform, DataMesh takes its digital-twin applications to a variety of AR glasses for training, guidance and operational planning in industry.
To bridge the gap between the digital and real worlds, DataMesh builds products that democratize XR and digital twins. By making those technologies accessible, the company empowers front-line workers to improve their capabilities, acquire knowledge and easily develop critical skills.
A platform for XR and digital twins
DataMesh offers the FactVerse platform, designed for building Training, Experience, Monitor and Control, and Simulation (TEMS) scenarios around XR. FactVerse includes applications such as DataMesh Director, DataMesh Inspector and DataMesh Checklist, which support platforms including HoloLens, Android and PC.
DataMesh Director enables companies to quickly convert existing 3D assets (e.g., CAD/BIM files) into 3D digital twin content. Users of DataMesh Director can create standard operating procedures, support guides and 3D product manuals, and monitor robots and equipment in real time. One enterprise used DataMesh Director to create mixed reality manuals and added AI for a 25-percent boost in customer service productivity and remote access to product demos. Another enterprise customer used DataMesh Director to import building information modeling (BIM) data and visualize it in 3D. The result was a digital twin that mapped construction sites accurately.
A big part of delivering satisfactory XR lies in the software development process. That includes optimizing 3D models, particularly in industrial applications like manufacturing and construction. Few XR devices can render those large-scale models smoothly and efficiently, so careful optimization is required. Plus, those devices depend on integration with software modules for interactions like sound, gesture controls and click-based events.
Snapdragon Spaces and the Pathfinder program
DataMesh was drawn to the Snapdragon Spaces XR Developer Platform to improve its offering in XR rendering and interactive performance. The team was also impressed by the well-developed Snapdragon Spaces ecosystem, which encompasses developers, enterprises and channel partners and provides a network of resources to support their XR initiatives. The company takes advantage of the compatibility of Snapdragon Spaces with Unity, enabling their developers to push the boundaries of augmented reality and easily share their creations. The company has combined the mature functionality of the DataMesh platform with Snapdragon Spaces features like Hand Tracking and using a mobile device as a controller. DataMesh Director, when integrated with the Snapdragon Spaces SDK, adapts to the user’s environment and creates an immersive work session. As a result, users can create interactive digital-twin content and make AR experiences more realistic and responsive.
Snapdragon Spaces also solves a problem DataMesh faces with headset adaptation. Many headset SDKs require specific Unity versions and platforms, prolonging DataMesh’s development cycle and increasing maintenance costs. By using the Snapdragon Spaces SDK to adapt its products to XR devices, DataMesh eliminates the need to work separately with each headset’s SDK, saving time and money. The Snapdragon Spaces SDK has allowed DataMesh to expand the availability of its software to other XR devices, including the Lenovo ThinkReality A3. The company’s collaboration with and participation in the Pathfinder Program have given it access to invaluable support and technical expertise, significantly accelerating its development process.
AR experience highlights:
- Democratizing digital twins. The DataMesh FactVerse platform makes digital twins more accessible to frontline workers and creators, addressing workflow challenges in training, planning and everyday operations. DataMesh empowers ordinary people to use digital twins conveniently across a wide range of devices.
- Adapting to more XR devices. DataMesh is bringing digital twin applications to more AR glasses, supporting industrial applications while requiring minimal 3D and programming knowledge.
- Converting CAD assets into digital twin content. DataMesh Director enables companies to quickly convert existing assets like CAD/BIM files into 3D digital twin content.
“Our mission to democratize digital twins and XR for empowering frontline workers and driving enterprise innovation has been greatly supported through our collaboration with Snapdragon Spaces.
Their robust XR development platform is a perfect fit for our needs, offering exceptional reliability and stability for deploying our app across multiple platforms. We highly recommend Snapdragon Spaces to any enterprise seeking to create or enhance their XR solutions.”
– Hao Wu, CTO, DataMesh
Virtual reality workforce training by Uptale
Uptale’s cloud platform enables enterprises to digitize field training and improve workforce efficiency using VR. Whether beginners or experts in VR, users can create and deliver their company’s immersive learning experiences in VR, then deploy them at scale and analyze the results.
The Uptale VR platform includes authoring tools that enterprises use to develop immersive training experiences from 360-degree videos. Content creators build out workforce training with a variety of interactions, such as quizzes, complex exercises, information panels and videos. They can add advanced features like voice recognition and interactive 3D models that trainees can manipulate.
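To illustrate how authored modules like these can be structured, here is a minimal sketch in Python. The names (`TrainingModule`, `Interaction`, the `at` lookup) are invented for this example under stated assumptions and are not Uptale's actual API; it only shows the general idea of layering timed interactions onto a 360-degree video timeline.

```python
from dataclasses import dataclass, field

@dataclass
class Interaction:
    kind: str           # e.g. "quiz", "info_panel", "video", "3d_model"
    timestamp_s: float  # where on the 360-video timeline it appears
    payload: dict

@dataclass
class TrainingModule:
    title: str
    video_url: str
    interactions: list[Interaction] = field(default_factory=list)

    def add(self, kind: str, timestamp_s: float, **payload):
        """Author a new interaction at a point on the timeline."""
        self.interactions.append(Interaction(kind, timestamp_s, payload))

    def at(self, t: float, window: float = 5.0):
        """Interactions due within `window` seconds of playback time t."""
        return [i for i in self.interactions
                if t <= i.timestamp_s < t + window]

# Hypothetical module: a safety walkthrough with a panel and a quiz.
module = TrainingModule("Line safety", "https://example.com/plant_360.mp4")
module.add("info_panel", 12.0, text="Wear eye protection in this zone")
module.add("quiz", 30.0, question="Which valve is shut first?",
           choices=["A", "B"], answer="A")
print([i.kind for i in module.at(10.0)])   # ['info_panel']
```

Because the module is plain data, updating a procedure means editing one entry and redeploying, which matches the agility the platform aims for.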
A large portion of Uptale’s customers use the platform for in-field instruction, such as safety training, standard operating procedures, quality field visits, sales training and even soft skills. One customer, a global automobile manufacturer, uses Uptale to digitize its standard operating procedures on the production line. Instructors can train more effectively without having to be present on the plant floor, and trainees can avoid making mistakes or slowing down the line. Over the years, the Uptale solution has demonstrated the significant benefits of adopting XR technology to more than 200 companies, including Stellantis and Alstom. A training school manager at Stellantis reported that her teams cut training time in half and significantly increased their efficiency using VR.
Always having to keep up
“This is a good time to be in XR,” says Sebastien Leang, co‑founder and CTO of Uptale. “Customers in both education and the enterprise see the efficiency and ROI of XR training, which means that the market is maturing. We spend less time now convincing people of the value proposition.” Uptale does more than help its customers catch up with the XR market; it helps them keep up with their ever-changing training needs. “It’s important for our customers to keep their training up to date,” adds Sebastien. “When their procedures change or a new rule takes effect, customers don’t want to have to spend months re-developing their training modules. They want to be agile in creating and updating those modules. Our platform lets enterprises add or modify their training in a few hours and quickly deploy it to a target group of employees.”
Uptale has its own keeping up to do. As more hardware manufacturers launch new devices for VR and XR, market demand rises and enterprises take notice. After a successful pilot, enterprises want to deploy the devices on a large scale, so Uptale has to be ready to go big inside those companies. That entails working closely with IT departments to ensure a smooth, secure deployment of, say, the latest VR headsets. It has to be easy to order them, to connect them securely to Wi-Fi, to download the apps and content, and to configure everything for use.
The Lenovo connection to Snapdragon Spaces
As part of that work, Uptale is porting its application to many of the most popular VR headset brands. On the way to going big within one enterprise customer, the team had the chance to work closely with Lenovo and port the player app to the Snapdragon Spaces XR Developer Platform, which runs on the Lenovo ThinkReality VRX headset. “It was easy to integrate the Snapdragon Spaces platform with our app, which is built on Unity,” says Leang. “Thanks to Snapdragon Spaces compatibility with Unity’s XR Plugin Management and XR Interaction Toolkit, the integration was smooth, the port to the Lenovo VRX didn’t take long and the new version of the app performs very well. We were able to launch quickly at feature parity with the previous version.” Since Uptale specializes in virtual reality, the company is concentrating on the VR capabilities of Snapdragon Spaces, such as managing the camera and controller interaction on these next-generation VR headsets. The developer team is also integrating the Hand Tracking feature so users can manipulate 3D assets directly with their hands, without relying on controllers. Leang plans to integrate Uptale’s workforce training with the AR capabilities in Snapdragon Spaces. “For the future, we plan on adding AR to our platform,” he says. “We look forward to including the features of Snapdragon Spaces, including Positional Tracking, Local Spatial Anchors and Object Recognition.”
XR experience highlights:
- Training operational teams. The Uptale platform abstracts training content beyond physical constraints through interactive VR modules.
- Updating smoothly and easily. Uptale enables trainers and content creators to respond to rapidly evolving needs of employee training in hours, not weeks.
- Keeping up with new devices. Even as hardware manufacturers release new devices for VR and XR, Uptale stays ready for deployment in enterprise-scale training programs.
“Qualcomm Technologies has been very responsive in improving the code and helping us integrate Snapdragon Spaces, which our engineers find powerful yet easy to use.
The platform is a safe bet for the future. Since Snapdragon platforms are present on most of the headsets on the market, you can’t go wrong by integrating with Snapdragon Spaces.”
– Sebastien Leang, Co‑founder and CTO, Uptale
High-Performance AR and VR for enterprise by Hololight
Hololight’s Stream SDK offloads compute-intensive tasks and enables streaming to improve the performance on mobile XR devices. The company collaborates with Snapdragon Spaces™ to simplify the integration of its products with a wide range of devices.
The more you develop and experiment with AR and VR applications, the more you see their potential in the industry. Enterprise customers have high expectations for both performance and cross-platform compatibility, constantly raising the bar for applications.
Hololight developed the Stream SDK for real-time streaming of AR/VR apps, offloading the rendering of compute-intensive images from mobile devices to a powerful cloud infrastructure or local servers. This solves the problem of XR applications with high-polygon content and complex datasets overwhelming the computing resources of most mobile XR devices, such as head-mounted displays (HMDs) and glasses. Streaming also allows real-time XR applications to run on other types of devices, like smartphones and tablets. Hololight’s Stream SDK addresses the market need to bridge the performance gap between the potential of enterprise XR applications and the resources of most XR hardware. Its cross-platform compatibility and device-agnostic approach support a variety of device types, regardless of on-device computing resources.
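The remote-rendering split described above can be sketched conceptually as follows. This is a toy illustration of the idea only — the names (`RenderServer`, `ThinClient`, `Pose`) are invented, and the string frame plus `zlib` compression stand in for real GPU rendering, video encoding and network transport; it is not Hololight's implementation.

```python
import zlib
from dataclasses import dataclass

@dataclass
class Pose:
    """Head pose the client sends upstream each frame (simplified)."""
    x: float
    y: float
    z: float

class RenderServer:
    """Server side: holds the high-polygon scene and renders per pose."""
    def __init__(self, scene_triangles: int):
        # Scene complexity far beyond what a mobile HMD could render itself.
        self.scene_triangles = scene_triangles

    def render(self, pose: Pose) -> bytes:
        # Stand-in for GPU rendering + video encoding of one frame.
        frame = f"frame@({pose.x},{pose.y},{pose.z})".encode()
        return zlib.compress(frame)

class ThinClient:
    """Headset side: sends poses, decodes and displays streamed frames."""
    def __init__(self, server: RenderServer):
        self.server = server  # in reality, a network connection

    def display(self, pose: Pose) -> str:
        encoded = self.server.render(pose)  # network hop in practice
        return zlib.decompress(encoded).decode()

client = ThinClient(RenderServer(scene_triangles=50_000_000))
print(client.display(Pose(1.0, 2.0, 0.5)))  # frame@(1.0,2.0,0.5)
```

The key design point is that only poses go up and encoded frames come down, so the headset's workload stays constant no matter how complex the scene grows.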
Stream SDK is also integrated into Hololight’s Space, an XR engineering application for AR and VR devices that streamlines and accelerates product development and prototyping through 3D CAD data visualization. Space allows engineers and designers to easily import their 3D computer-aided design (CAD) data, then work with high-quality 3D content in XR, merging physical parts with virtual objects.
Through the Pathfinder Program, Hololight has brought its Stream SDK to Snapdragon Spaces. “This will enable streaming of all XR applications, such as Space, to devices like the Lenovo ThinkReality A3,” says Philipp Landgraf, Senior Director of XR Streaming at Hololight. “Because Snapdragon runs on a wide range of devices, developers using the Stream SDK with Snapdragon Spaces can implement high-fidelity XR applications with cross-device compatibility.” One of the biggest challenges Hololight faces is the variation among hardware devices – for example, different control types such as hand tracking or physical controllers. Those differences pose a challenge for a developer trying to build, test and run an XR application, lengthening time to market. Using Snapdragon Spaces with Stream helps streamline the process.
“Snapdragon Spaces standardizes the device features to a certain extent, so the devices operate similarly,” says Landgraf. “That makes it easier for developers to integrate our SDK smoothly. Plus, Snapdragon Spaces enables lightweight devices that are suitable for consumer and everyday use. Since the devices are the first to support 5G natively, we can address new use cases – especially mobile.”
XR experience highlights:
- High-performance streaming. The combination of Stream SDK and Snapdragon Spaces offloads compute-intensive rendering tasks from mobile devices to the cloud or to local servers.
- Cross-platform compatibility. Hololight can make the Stream SDK, its industrial XR streaming solution, accessible to users of more and varied device types.
“We are pleased to have a strong partner in Qualcomm Technologies, so we can combine our technologies Stream and Snapdragon Spaces to get XR to the next level.
Snapdragon Spaces is an important driver for the entire industry, ensuring continued advancement and reliable development and enabling the next generation of AR glasses.”
– Philipp Landgraf, Senior Director XR Streaming, Hololight
Reality Reimagined: Trigger XR x Duran Duran
Trigger XR and the Snapdragon Spaces team wanted to invent a new and unique way for fans to enjoy and engage with Duran Duran’s music in a new medium. We sat down with Jason Yim, CEO of Trigger XR, to learn more about his company and the joint project.
Trigger XR is one of the world’s most experienced agencies in augmented and mixed reality. We started as a digital agency focused on film marketing. In 2009, we created our first augmented reality campaign for Sony’s “District 9.” And in 2012, we became Qualcomm’s showcase developer for its mobile AR platform, Vuforia, when it was still in R&D. Together, we have worked on legacy brands such as American Apparel, LEGO, Sesame Street and McDonald’s. In 2016, we focused fully on XR and have since amassed over 300,000 hours in the discipline.
We believe XR will change how we interact with the world and redefine the human/digital experience. We aim to become the premier full-service XR and metaverse solution provider and help the world’s top brands reinvent their customer experiences for the new 3D world. And our mission is to invent XR and metaverse solutions that perform and scale for the world’s top brands by mastering broad capabilities built upon deep technology partnerships.
Creating the “Reality Reimagined” experience
Qualcomm Technologies, Trigger XR, and Duran Duran partnered to produce an interactive XR music experience — Reality Reimagined — using Lenovo ThinkReality A3 AR glasses and launched it at Qualcomm Technologies’ annual Snapdragon Summit in 2022. The smart glasses enable an immersive experience that transforms the space around users into a virtual galaxy using AR and spatial audio. The experience, powered by the Snapdragon Spaces™ SDK, blends the real world with the surreal and draws inspiration from Duran Duran’s musical future and past. The result is an out-of-this-world experience that takes fans on a journey into a futuristic galaxy and introduces a new layer of immersion to the music experience.
The Trigger XR team was honoured and excited to collaborate with Duran Duran directly, especially Nick Rhodes, the band’s de facto CTO. Working with Qualcomm Technologies’ solutions, our collective goal from the beginning was to create a musical experience on an HMD that was completely different from any other form of media. The band encouraged us to design an augmented reality experience that allowed fans to manipulate their music through musical stems and a custom sound design that was spatially interactive.
Using Snapdragon Spaces to bring ideas to life
We wanted to give the fans agency in the experience and make the music and experience spatial to play to the strengths of AR. We wanted to showcase the advantages of HMD by using hand tracking as the primary interaction. The ultimate goal was to inspire the fans to explore more songs, uncover more bands in this format, and experiment with HMDs in general.
Process and challenges when creating for headworn AR
Every project starts with a research and strategy phase. With HMDs, it’s critical to set technical parameters early that factor in the hardware capabilities of the specific headset, and because Snapdragon Spaces is evolving so quickly, that factors in the future capabilities of the software. The Trigger XR and Snapdragon Spaces teams then worked hand-in-hand with Duran Duran, a technologically innovative band, to conceptualise an experience that would genuinely surprise and engage their loyal fans.
The design phase is also unique for HMD projects because most creation tools are 2D screen-based. However, we’ve found that starting designs in the headset is critical to establishing good UX down the road. We use ShapesXR, a VR and AR design tool, to sketch the initial UX and visual UI. This tool helps our teams quickly understand field-of-view limitations and how content feels spatially. This early work becomes the foundation we build upon using our standard desktop-based design workflow.
The goal was to iterate as quickly as possible, make technical changes, and stress test those modifications “in the headset” to see what worked comfortably with the band. An idea may sound fun on paper, but until it is actually tested with the core band members, it’s impossible to know.
AR experience highlights:
The Snapdragon Spaces SDK allowed us to develop a project for music fans to listen to and interact with their favourite music. Using the Hand Tracking feature, users can play the instruments Duran Duran used in their famous song “Planet Earth.” By tapping the so-called “Nodes” inside the experience, users can not only toggle instruments on and off but also change an instrument’s sound to Reverb or Echo. But that’s not all – the nodes can also change the user’s surrounding environment using particle effects and other nifty VFX. The experience turned out magical, unlike anything else out there.
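The node behaviour described above — one interactive node per instrument stem, toggled by a tap and routed through an effect — can be modelled with a small sketch. The names (`StemNode`, `mix`) are hypothetical; this is a toy illustration of the interaction logic, not Trigger XR's code.

```python
class StemNode:
    """One interactive node wrapping a single instrument stem."""
    EFFECTS = {"dry", "reverb", "echo"}

    def __init__(self, instrument: str):
        self.instrument = instrument
        self.active = True
        self.effect = "dry"

    def tap(self):
        """A hand-tracked tap toggles the stem on or off."""
        self.active = not self.active

    def set_effect(self, effect: str):
        """Route the stem through a named effect."""
        if effect not in self.EFFECTS:
            raise ValueError(f"unknown effect: {effect}")
        self.effect = effect

def mix(nodes):
    """Return the audible stems, as the spatial mixer would see them."""
    return [f"{n.instrument}:{n.effect}" for n in nodes if n.active]

nodes = [StemNode("bass"), StemNode("synth"), StemNode("drums")]
nodes[1].tap()                  # mute the synth stem
nodes[2].set_effect("echo")     # route drums through echo
print(mix(nodes))               # ['bass:dry', 'drums:echo']
```

In the real experience each node would also drive spatial audio and VFX, but the core state machine — active flag plus current effect — stays this simple.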
“We’ve always been passionate about exploring exciting ways to merge our music with new technology. The augmented reality experience, created by Snapdragon Spaces and Trigger XR was interesting to us because it offers fans a new platform to experience our catalogue through sound and touch in an immersive and interactive virtual space.
This is just the beginning, and we are curious to see how this story develops.”
– Duran Duran
Next-level AR workflows for creative professionals by Nomtek
Nomtek is a digital product development studio that explores new ways of innovating the workspace using XR. With their StickiesXR prototype, the company strives to make brainstorming and collaboration effortless and engaging.
“As a remote-first company, we often brainstorm online to develop concepts, ideas, and the direction we want to pursue. But remote workshops mean hours spent sitting in front of a computer and little movement,” says Łukasz Kincel, Head of Innovation at Nomtek. The team started wondering how to combine remote and on-site workshops. As a result, they integrated the Snapdragon Spaces™ XR Developer Platform with a trusted collaboration product, Figma.
The idea was soon realized as StickiesXR. Currently compatible with the Lenovo ThinkReality A3 and Meta Quest Pro, StickiesXR transforms the traditional 2D FigJam workflow into an extended reality experience. The solution allows users to walk around the room and brainstorm with virtual sticky notes, just as they would in a physical workspace. Thanks to the Snapdragon Spaces Hand Tracking feature, notes can be manipulated easily using natural gestures. StickiesXR also synchronizes notes in real time, making sure everyone on the team stays on the same page. Furthermore, the Local Anchors feature allows the experience to co-exist realistically with the physical environment. “Our intention was to bring immersive augmented reality work environments closer to creative professionals,” Łukasz adds.
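Real-time note synchronization of the kind described above can be illustrated with a minimal last-write-wins model. The names (`NoteBoard`, `apply`) are invented for this sketch and do not reflect Nomtek's implementation; the point is only that every participant converges on the same board state regardless of who edited last.

```python
class NoteBoard:
    """One participant's local replica of the shared sticky-note board."""
    def __init__(self):
        self.notes = {}  # note_id -> (version, text, position)

    def apply(self, note_id, version, text, position):
        """Accept an update only if it is newer than what we have."""
        current = self.notes.get(note_id)
        if current is None or version > current[0]:
            self.notes[note_id] = (version, text, position)

# Two participants receive the same stream of versioned updates.
alice, bob = NoteBoard(), NoteBoard()
updates = [
    ("n1", 1, "Brainstorm here", (0.0, 1.2)),
    ("n1", 2, "Brainstorm here!", (0.3, 1.2)),  # the note is moved and edited
]
for board in (alice, bob):
    for update in updates:
        board.apply(*update)

print(alice.notes == bob.notes)     # True: both replicas converge
print(alice.notes["n1"][1])         # Brainstorm here!
```

Last-write-wins is the simplest convergence rule; a production system would also handle out-of-order delivery and conflicting concurrent edits, which this sketch deliberately omits.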
Leaning into the flexibility that smart glasses provide, StickiesXR gives users the freedom to move around while they work on ideas. Nomtek’s goal is to empower distributed teams to collaborate efficiently across geographical constraints. The company hopes the FigJam prototype will boost creativity and help professionals increase their productivity – in the office or at home.
AR experience highlights:
- Quick access. A virtual brainstorming session is just a few simple steps away – all it takes is connecting Lenovo ThinkReality A3 smart glasses to a Motorola edge+ phone and launching the application.
- Hand gestures. With the Lenovo ThinkReality A3, users don’t need external controllers — they navigate with hand gestures.
- Easy setup. StickiesXR integrates via an easy-to-add plugin, and setup is simple: just enter a three-digit code.
“We’re firm believers in ‘the right technology for the right use case’ rule. When it comes to immersive collaboration for distributed teams, AR technology and smart glasses open new opportunities.
Snapdragon Spaces has a promising roadmap, and the fact that it evolves the SDK with other developers — listening and incorporating their feedback — makes us confident that the platform will have a meaningful impact on the XR industry.”