Game Development across VR and AR with Survios
Survios produces and publishes engaging, multi-platform games across popular, licensed franchises and its own original properties. The company is expanding its game development from VR into AR, bringing more features and deeper engagement to its interactive experiences.
Image: Puzzle Bobble tech demo
Originally a hardware company, Survios pivoted to software and VR, then developed Raw Data, which became hugely popular on Steam. The company redoubled its efforts in VR with innovations like Sprint Vector’s fluid locomotion system, improving players’ control over movement. It has since worked with a wide range of fan-favorite intellectual property, including franchises such as The Walking Dead, Westworld, Creed and Rocky.
Expanding into the new frontier of AR
In looking at technology trends and user preferences, Survios began to explore incorporating AR into its game development, with all the questions that would entail. “First of all, in AR, there’s no lead technology, hardware-wise,” says Mike Domaguing, Senior Vice President of Partnerships and Publishing at Survios. “There are hopes and dreams, but until developers have hardware in their hands, realizing those hopes and dreams is hard. Then, there are gameplay considerations with AR. Players want to be inside the game physically, but because of hardware limitations, we have to give them the ‘looking-down’ perspective on the game. Finally, there’s the question of where AR is going. Will it be PC- or mobile-oriented? That’s as much a business decision as a technical decision.” In short, Survios felt that the future of AR, especially in gaming, is very bright – although not without challenges.
Taking Snapdragon Spaces into AR game development
The company’s collaboration with Qualcomm Technologies became a strong source of encouragement for their AR initiative. Survios began working with the Snapdragon Spaces™ Extended Reality SDK and decided to use it for Puzzle Bobble 3D: Vacation Odyssey. They have built a compelling game experience around the tech stack of Snapdragon Spaces, OpenXR and Unreal Engine. “In developing Puzzle Bobble for AR, we used several features of Snapdragon Spaces,” says Alex Silkin, Chief Technology Officer and co-founder of Survios. “That includes Plane Detection and Hit Testing for locating a suitable surface to anchor the play area. Then we use Local Spatial Anchors to attach the play area to the real-world surface and maintain spatial consistency.”
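At its core, Hit Testing amounts to intersecting a gaze or controller ray with a detected plane and placing an anchor at the intersection. The Snapdragon Spaces SDK exposes this through its Unreal Engine and OpenXR integration; purely as an illustration of the underlying geometry (the function names here are not the SDK’s API), a minimal sketch looks like this:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_test(ray_origin, ray_dir, plane_point, plane_normal, eps=1e-6):
    """Intersect a gaze/controller ray with a detected plane.
    Returns the world-space hit point, or None when the ray runs
    parallel to the plane or the plane is behind the ray."""
    denom = dot(plane_normal, ray_dir)
    if abs(denom) < eps:
        return None  # ray runs parallel to the plane
    diff = tuple(p - o for p, o in zip(plane_point, ray_origin))
    t = dot(plane_normal, diff) / denom
    if t < 0:
        return None  # plane is behind the ray origin
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Example: a ray cast from head height straight down hits the
# floor plane (y = 0); in a real app, the hit point would seed a
# Local Spatial Anchor so the play area stays attached to that surface.
hit = hit_test((0.0, 1.6, 0.0), (0.0, -1.0, 0.0),
               (0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

Anchoring the play area to such a hit point is what keeps the game spatially consistent as the player moves around the room.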
For user interaction, the game takes advantage of the Hand Tracking feature so that players can manually position the play area where they want it. “Players can touch bubbles both in the menu and within the game to make selections,” says Silkin. “That makes for an intuitive, engaging user experience. We used OK gesture detection so players can confirm their choices with a simple hand gesture. Then, by pressing on the touch screen, players make the character Bob throw bubbles in the game. For that we used the Companion Controller Android application as a way to press buttons quickly and easily.”
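Gesture recognizers like the OK detection Silkin mentions typically classify poses from tracked hand-joint positions. The SDK’s recognizer is far more robust, but as a conceptual sketch (all thresholds and names here are illustrative, not Snapdragon Spaces code), an “OK” pose can be approximated as thumb and index fingertips pinched together while the other fingers stay extended:

```python
import math

def is_ok_gesture(thumb_tip, index_tip, middle_tip, palm,
                  pinch_threshold=0.02, extend_threshold=0.07):
    """Crude 'OK' detector from tracked 3D hand joints (in meters):
    thumb and index tips touching, middle finger extended away
    from the palm. Real SDK gesture recognition is more robust."""
    pinch = math.dist(thumb_tip, index_tip) < pinch_threshold
    extended = math.dist(middle_tip, palm) > extend_threshold
    return pinch and extended
```

Running the classifier every frame and requiring the pose to hold for a few consecutive frames is a common way to avoid accidental confirmations.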
Making the most of XR development
Survios has these tips for developers who are new to XR:
- Consider the Companion Controller as an additional form of input and output that is unique to Android. Survios experimented only briefly with using the phone as an input device but believes there is ample opportunity for using the phone display as an additional user interface.
- Use the player’s head direction to aim character orientation. The studio found it as intuitive as Hand Tracking and less cumbersome than obliging players to keep their hands raised in the play area at all times.
- Have a way – other than installing on the hardware – to quickly develop and test your core application mechanics. An example is to test core functions on a simple PC with a mouse and keyboard.
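The head-aiming tip above reduces to very simple math: take the head yaw reported by positional tracking and project it onto the ground plane as an aim direction. A minimal sketch (illustrative only, not SDK code):

```python
import math

def head_yaw_to_aim(yaw_degrees):
    """Map the player's head yaw (from positional tracking) to a
    2D aim direction on the play surface, so the character faces
    where the player looks without requiring raised hands."""
    yaw = math.radians(yaw_degrees)
    return (math.sin(yaw), math.cos(yaw))  # (x, z) on the ground plane
```

Because this logic has no headset dependency, it is also exactly the kind of core mechanic that can be developed and tested on a plain PC with mouse and keyboard, per the last tip.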
AR experience highlights:
- Developing more compelling games. Survios sees AR as an avenue for deeper engagement and better storytelling.
- Making the transition into AR. Survios is keen to extend its technical reach to more areas of extended reality.
- Enabling content for more devices. Snapdragon Spaces is Survios’ path to reducing friction in moving between AR and VR and bringing more enjoyment to players.
“We enjoy developing in Snapdragon Spaces because Qualcomm Technologies is committed to innovation and technology. They listen to developers, deliver features that developers want and give developers a good tour of those features. We are working toward a future in which creators bring amazing experiences to fruition – that’s why our work with Snapdragon Spaces is important to us.”
– Mike Domaguing, SVP Partnerships and Publishing, Survios
Democratized XR and digital twins for frontline workers by DataMesh
DataMesh’s Director is an enterprise platform for creating digital twin content. Using the Snapdragon Spaces™ XR Developer Platform, DataMesh takes its digital-twin applications to a variety of AR glasses for training, guidance and operational planning in industry.
To bridge the gap between the digital and real worlds, DataMesh builds products that democratize XR and digital twins. By making those technologies accessible, the company empowers front-line workers to improve their capabilities, acquire knowledge and easily develop critical skills.
A platform for XR and digital twins
DataMesh offers the FactVerse platform, designed for building Training, Experience, Monitor and Control, and Simulation (TEMS) scenarios around XR. FactVerse includes applications such as DataMesh Director, DataMesh Inspector and DataMesh Checklist, which run on platforms including HoloLens, Android and PC.
DataMesh Director enables companies to quickly convert existing 3D assets (e.g., CAD/BIM files) into 3D digital twin content. Users of DataMesh Director can create standard operating procedures, support guides and 3D product manuals, and monitor robots and equipment in real time. One enterprise used DataMesh Director to create mixed reality manuals and added AI for a 25-percent boost in customer service productivity and remote access to product demos. Another enterprise customer used DataMesh Director to import building information modeling (BIM) data and visualize it in 3D. The result was a digital twin that mapped construction sites accurately.
A big part of delivering satisfactory XR lies in the software development process. That includes optimizing 3D models, particularly in industrial applications like manufacturing and construction. Few XR devices can render those large-scale models smoothly and efficiently, so careful optimization is required. Plus, those devices depend on integration with software modules for interactions like sound, gesture controls and click-based events.
Snapdragon Spaces and the Pathfinder program
DataMesh was drawn to the Snapdragon Spaces XR Developer Platform to improve its offering in XR rendering and interactive performance. The team was also impressed by the well-developed Snapdragon Spaces ecosystem, which encompasses developers, enterprises and channel partners and provides a network of resources to support their XR initiatives. The company takes advantage of the compatibility of Snapdragon Spaces with Unity, enabling their developers to push the boundaries of augmented reality and easily share their creations. The company has combined the mature functionality of the DataMesh platform with Snapdragon Spaces features like Hand Tracking and using a mobile device as a controller. DataMesh Director, when integrated with the Snapdragon Spaces SDK, adapts to the user’s environment and creates an immersive work session. As a result, users can create interactive digital-twin content and make AR experiences more realistic and responsive.
Snapdragon Spaces also solves a problem DataMesh has with headset adaptation. Many headset SDKs have different requirements for specific Unity versions and platforms, prolonging DataMesh’s development cycle and increasing the cost of maintenance. When DataMesh uses the Snapdragon Spaces SDK to adapt its products to XR devices, they eliminate the need to work separately with each headset’s SDK, saving time and money. The Snapdragon Spaces SDK has allowed DataMesh to expand the availability of their software to other XR devices, including the Lenovo ThinkReality A3. DataMesh’s collaboration with and participation in the Pathfinder Program have given the company access to invaluable support and technical expertise, significantly accelerating its development process.
AR experience highlights:
- Democratizing digital twins. The DataMesh FactVerse platform makes digital twins more accessible to frontline workers and creators, addressing workflow challenges in training, planning and everyday operations. DataMesh empowers ordinary people to use digital twins conveniently across a wide range of devices.
- Adapting to more XR devices. DataMesh is bringing digital twin applications to more AR glasses, supporting industrial applications while requiring minimal 3D and programming knowledge.
- Converting CAD assets into digital twin content. DataMesh Director enables companies to quickly convert existing assets like CAD/BIM files into 3D digital twin content.
“Our mission to democratize digital twins and XR for empowering frontline workers and driving enterprise innovation has been greatly supported through our collaboration with Snapdragon Spaces.
Their robust XR development platform is a perfect fit for our needs, offering exceptional reliability and stability for deploying our app across multiple platforms. We highly recommend Snapdragon Spaces to any enterprise seeking to create or enhance their XR solutions.”
– Hao Wu, CTO, DataMesh
Virtual Reality Workforce Training by Uptale
Uptale’s cloud platform enables enterprises to digitize field training and improve workforce efficiency using VR. Whether beginners or experts in VR, users can create and deliver their company’s immersive learning experiences in VR, then deploy them at scale and analyze the results.
The Uptale VR platform includes authoring tools that enterprises use to develop immersive training experiences from 360-degree videos. Content creators build out workforce training with a variety of interactions, such as quizzes, complex exercises, information panels and videos. They can add advanced features like voice recognition and interactive, 3D models that trainees can manipulate.
A large portion of Uptale’s customers use the platform for in-field instruction, such as safety training, standard operating procedures, quality field visits, sales training and even soft skills. One customer, a global automobile manufacturer, uses Uptale to digitize their standard operating procedures on the production line. Their instructors can train more effectively without having to be present on the plant floor, and trainees can avoid making mistakes or slowing down the line. Over the years, the Uptale solution has shown that adopting XR technology can deliver significant benefits; more than 200 companies, including Stellantis and Alstom, use the platform. A training school manager at Stellantis reported that her teams reduced training time by a factor of two and significantly increased their efficiency using VR.
Always having to keep up
“This is a good time to be in XR,” says Sebastien Leang, co‑founder and CTO of Uptale. “Customers in both education and the enterprise see the efficiency and ROI of XR training, which means that the market is maturing. We spend less time now convincing people of the value proposition.” Uptale does more than help its customers catch up with the XR market; it helps them keep up with their ever-changing training needs. “It’s important for our customers to keep their training up to date,” adds Sebastien. “When their procedures change or a new rule takes effect, customers don’t want to have to spend months re-developing their training modules. They want to be agile in creating and updating those modules. Our platform lets enterprises add or modify their training in a few hours and quickly deploy it to a target group of employees.”
Uptale has its own keeping up to do. As more hardware manufacturers launch new devices for VR and XR, market demand rises and enterprises take notice. After a successful pilot, enterprises want to deploy the devices on a large scale, so Uptale has to be ready to go big inside those companies. That entails working closely with IT departments to ensure a smooth, secure deployment of, say, the latest VR headsets. It has to be easy to order them, to connect them securely to Wi-Fi, to download the apps and content, and to configure everything for use.
The Lenovo connection to Snapdragon Spaces
As part of that work, Uptale is in the process of porting their application to many of the most popular VR headset brands. On the way to going big within one enterprise customer, they had the chance to work closely with the team at Lenovo and port their player app to the Snapdragon Spaces XR developer platform that runs on the Lenovo ThinkReality VRX headset. “It was easy to integrate the Snapdragon Spaces platform with our app, which is built on Unity,” says Leang. “Thanks to Snapdragon Spaces’ compatibility with the XR Plugin Management and XR Interaction Toolkit, the integration was smooth, the port to the Lenovo VRX didn’t take long and the new version of the app performs very well. We were able to launch quickly at feature parity with the previous version.” Since Uptale specializes in virtual reality, they’re concentrating on the VR capabilities of Snapdragon Spaces, such as managing the camera and controller interaction on these next-generation VR headsets. The developer team is also integrating the Hand Tracking feature so users can manipulate 3D assets directly with their hands, without relying on controllers. Leang plans on integrating Uptale’s workforce training with the AR capabilities in Snapdragon Spaces. “For the future, we plan on adding AR to our platform,” he says. “We look forward to including the features of Snapdragon Spaces, including Positional Tracking, Local Spatial Anchors, and Object Recognition.”
XR experience highlights:
- Training operational teams. The Uptale platform frees training content from physical constraints through interactive VR modules.
- Updating smoothly and easily. Uptale enables trainers and content creators to respond to rapidly evolving needs of employee training in hours, not weeks.
- Keeping up with new devices. Even as hardware manufacturers release new devices for VR and XR, Uptale stays ready for deployment in enterprise-scale training programs.
“Qualcomm Technologies has been very responsive in improving the code and helping us integrate Snapdragon Spaces, which our engineers find powerful yet easy to use.
The platform is a safe bet for the future. Since Snapdragon platforms are present on most of the headsets on the market, you can’t go wrong by integrating with Snapdragon Spaces.”
– Sebastien Leang, Co‑founder and CTO of Uptale.
High-Performance AR and VR for enterprise by Holo-Light
Holo-Light’s ISAR SDK offloads compute-intensive tasks and enables streaming to improve the performance on mobile XR devices. The company collaborates with Snapdragon Spaces™ to simplify the integration of its products with a wide range of devices.
The more you develop and experiment with AR and VR applications, the more you see their potential in the industry. Enterprise customers have high expectations for both performance and cross-platform compatibility, constantly raising the bar for applications.
Holo-Light developed the ISAR SDK for real-time streaming of AR/VR apps to offload the rendering of compute-intensive images from mobile devices to a powerful cloud infrastructure or local servers. This solves the issue of XR applications with high polygon content and complex datasets overwhelming the computing resources of most mobile XR devices like head-mounted displays (HMDs) and glasses. Streaming also makes it possible to run real-time XR applications on other types of devices, like smartphones and tablets. Holo-Light’s ISAR SDK addresses the market need to bridge the performance gap between the potential of enterprise XR applications and the resources of most XR hardware. Its cross-platform compatibility and device-agnostic approach support a variety of device types, regardless of on-device computing resources.
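A standard ingredient in any remote-rendering pipeline of this kind (not necessarily Holo-Light’s specific implementation) is pose prediction: the server renders each frame for where the head will be once the frame has crossed the network, extrapolating the tracked pose forward by the measured round-trip time. A simplified sketch, with illustrative names:

```python
def predict_pose(position, velocity, yaw_deg, yaw_rate_deg, rtt_seconds):
    """Extrapolate head position and yaw forward by the network
    round-trip time, so a remote renderer draws the frame for where
    the head will be when the streamed frame arrives."""
    predicted_position = tuple(p + v * rtt_seconds
                               for p, v in zip(position, velocity))
    predicted_yaw = yaw_deg + yaw_rate_deg * rtt_seconds
    return predicted_position, predicted_yaw

# Example: head moving forward at 0.5 m/s, turning at 10 deg/s,
# with a 40 ms round trip to the rendering server.
pos, yaw = predict_pose((0.0, 1.6, 0.0), (0.0, 0.0, 0.5), 30.0, 10.0, 0.040)
```

Production systems combine this with client-side reprojection of the received frame to mask residual latency.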
The ISAR SDK is also integrated into Holo-Light’s AR 3S (pronounced “air-ease”), the company’s Augmented Reality Engineering Space: an XR engineering application for AR and VR devices that streamlines and accelerates product development and prototyping through 3D CAD data visualization. AR 3S allows engineers and designers to easily import their 3D computer-aided design (CAD) data, then work with high-quality 3D content in XR, merging physical parts with virtual objects.
Through the Pathfinder Program, Holo-Light has brought their ISAR SDK to Snapdragon Spaces. “This will enable streaming of all XR applications, such as AR 3S, to devices like the Lenovo ThinkReality A3,” says Philipp Landgraf, Senior Director XR Streaming at Holo-Light. “Because Snapdragon runs on a wide range of devices, developers using the ISAR SDK with Snapdragon Spaces can implement high fidelity XR applications with cross-device compatibility.” One of the biggest challenges Holo-Light faces is the differences among various hardware devices, for example the different types of controls – e.g. hand tracking or physical controllers. Those differences pose a challenge for a developer trying to build, test and run an XR application, lengthening time to market. Use of Snapdragon Spaces with ISAR helps to streamline the process.
“Snapdragon Spaces standardizes the device features to a certain extent, so the devices operate similarly,” says Landgraf. “That makes it easier for developers to integrate our SDK smoothly. Plus, Snapdragon Spaces enables lightweight devices that are suitable for consumer and everyday use. Since the devices are the first to support 5G natively, we can address new use cases – especially mobile ones.”
XR experience highlights:
- High-performance streaming. The combination of ISAR SDK and Snapdragon Spaces offloads compute-intensive rendering tasks from mobile devices to the cloud or to local servers.
- Cross-platform compatibility. Holo-Light can make the ISAR SDK, its industrial XR streaming solution, accessible to users of more and varied device types.
“We are pleased to have a strong partner in Qualcomm Technologies, so we can combine our technologies ISAR and Snapdragon Spaces to get XR to the next level.
Snapdragon Spaces is an important driver for the entire industry, ensuring continued advancement and reliable development and enabling the next generation of AR glasses.”
– Philipp Landgraf, Senior Director XR Streaming, Holo-Light
Reality Reimagined: Trigger XR x Duran Duran
Trigger XR and the Snapdragon Spaces team wanted to invent a new and unique way for fans to enjoy and engage with Duran Duran’s music in the new medium. We sat down with Jason Yim, CEO of Trigger XR, to learn more about his company and the joint project.
Trigger XR is one of the world’s most experienced agencies in augmented reality and mixed reality. We started as a digital agency focused on film marketing. In 2009, we created our first augmented reality campaign, for Sony’s “District 9”. And in 2012, we became Qualcomm’s showcase developer for its mobile AR platform, Vuforia, while it was still in R&D. Together, we have worked on legacy brands such as American Apparel, LEGO, Sesame Street, and McDonald’s. In 2016 we fully focused on XR and have amassed over 300,000 hours in the discipline.
We believe XR will change how we interact with the world and redefine the human/digital experience. We aim to become the premier full-service XR and metaverse solution provider and help the world’s top brands reinvent their customer experiences for the new 3D world. And our mission is to invent XR and metaverse solutions that perform and scale for the world’s top brands by mastering broad capabilities built upon deep technology partnerships.
Creating the “Reality Reimagined” experience
Qualcomm Technologies, Trigger XR, and Duran Duran partnered to produce an interactive XR music experience — Reality Reimagined — using Lenovo ThinkReality A3 AR glasses and launched it at the Qualcomm Technologies’ annual Snapdragon Summit in 2022. Smart glasses enable an immersive experience that transforms the spaces around the users into a virtual galaxy using AR and spatial audio. The experience, powered by Snapdragon Spaces™ SDK, blends the real world with the surreal and draws inspiration from Duran Duran’s musical future and past. The result is an out-of-this-world experience that takes fans on a journey into a futuristic galaxy and introduces a new layer of immersion to the music experience.
The Trigger XR team was highly honoured and excited to collaborate with Duran Duran directly, especially Nick Rhodes, the band’s de facto CTO. Working with Qualcomm Technologies’ solutions, we shared a goal from the beginning: to create a musical experience in an HMD that was completely different from any other form of media. The band encouraged us to design an augmented reality experience that allowed fans to manipulate their music through musical stems and a custom sound design that was spatially interactive.
Using Snapdragon Spaces to bring ideas to life
We wanted to give the fans agency in the experience and make the music and experience spatial to play to the strengths of AR. We wanted to showcase the advantages of HMD by using hand tracking as the primary interaction. The ultimate goal was to inspire the fans to explore more songs, uncover more bands in this format, and experiment with HMDs in general.
Process and challenges when creating for headworn AR
Every project starts with a research and strategy phase. With HMDs, it’s critical to set technical parameters early that factor in the hardware capabilities of the specific headset and, because Snapdragon Spaces is evolving so quickly, the future capabilities of the software. The Trigger XR and Snapdragon Spaces teams then worked hand-in-hand with Duran Duran, a technologically innovative band, to conceptualise an experience that would genuinely surprise and engage their loyal fans.
The design phase is also unique for HMD projects because most creation tools are 2D screen-based. However, we’ve found that starting designs in the headset is critical to establishing good UX down the road. We use ShapesXR, a VR and AR design tool, to sketch the initial UX and visual UI. This tool helps our teams quickly understand field-of-view limitations and how content feels spatially. This early work becomes the foundation we build upon using our standard desktop-based design workflow.
The goal was to iterate as quickly as possible, make technical changes, and stress test those modifications “in the headset” to see what worked comfortably with the band. An idea may sound fun on paper, but until it is actually tested with the core band members, it’s impossible to know.
AR experience highlights:
The Snapdragon Spaces SDK allowed us to develop a project for music fans to listen to and interact with their favourite music. Using the Hand Tracking feature, users can play and use the instruments the Duran Duran band used in their famous song “Planet Earth.” By tapping the so-called “Nodes” inside the experience, users can not only toggle instruments on/off but also change the instrument’s sound to Reverb or Echo. But that’s not all – the nodes can also change the user’s surrounding environment using particle effects and other nifty VFX. The experience turned out magical, unlike anything else out there.
“We’ve always been passionate about exploring exciting ways to merge our music with new technology. The augmented reality experience, created by Snapdragon Spaces and Trigger XR was interesting to us because it offers fans a new platform to experience our catalogue through sound and touch in an immersive and interactive virtual space.
This is just the beginning, and we are curious to see how this story develops.”
– Duran Duran
Next-level AR workflows for creative professionals by Nomtek
Nomtek is a digital product development studio that explores new ways of innovating the workspace using XR. With their StickiesXR prototype, the company strives to make brainstorming and collaboration effortless and engaging.
“As a remote-first company, we often brainstorm online to develop concepts, ideas, and the direction we want to pursue. But remote workshops mean hours spent sitting in front of a computer and little movement,” says Łukasz Kincel, Head of Innovation at Nomtek. The team started wondering how to combine remote and on-site workshops. As a result, they integrated the Snapdragon Spaces™ XR Developer Platform with a trusted collaboration product, Figma.
The idea was soon realized as StickiesXR. Currently compatible with the Lenovo ThinkReality A3 and Meta Quest Pro, StickiesXR transforms the traditional 2D FigJam workflow into an extended reality experience. The solution allows users to walk around the room and brainstorm with virtual sticky notes, just as one would in a physical workspace. Thanks to the Snapdragon Spaces Hand Tracking feature, notes can be easily manipulated using natural gestures. StickiesXR also synchronizes notes in real time, making sure everyone on the team stays on the same page. Furthermore, the Local Anchors feature allows the experience to co-exist realistically with the physical environment. “Our intention was to bring immersive augmented reality work environments closer to creative professionals,” Łukasz adds.
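Real-time note synchronization like this is usually built on a conflict-resolution rule so that two people editing the same note converge on one state. Nomtek’s actual protocol isn’t described here; a common, minimal approach is last-write-wins keyed by a logical timestamp, sketched below with illustrative names:

```python
def merge_note(local, remote):
    """Last-write-wins merge for one sticky note: whichever edit
    carries the newer logical timestamp 'ts' survives."""
    return remote if remote["ts"] > local["ts"] else local

def sync(local_board, remote_updates):
    """Apply a batch of remote note updates to the local board,
    keyed by note id. Unknown notes are simply inserted."""
    for note in remote_updates:
        nid = note["id"]
        local_board[nid] = merge_note(local_board.get(nid, {"ts": -1}), note)
    return local_board
```

Running this merge on every incoming update keeps all headsets and the 2D FigJam view eventually consistent without any locking.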
Leaning into the flexibility that smart glasses provide, StickiesXR gives users the freedom to move around while they work on ideas. Nomtek’s goal is to empower distributed teams to collaborate efficiently across geographical constraints. The company hopes the FigJam prototype will boost creativity and help professionals increase their productivity – in the office or at home.
AR experience highlights:
- Quick access. A virtual brainstorming session is just a few simple steps away – all it takes is connecting Lenovo ThinkReality A3 smart glasses to a Motorola edge+ phone and launching the application.
- Hand gestures. With the Lenovo ThinkReality A3, users don’t need external controllers — they navigate with hand gestures.
- Easy setup. StickiesXR integration uses an easy-to-add plugin, and the setup is simple: just enter a three-digit code.
“We’re firm believers in ‘the right technology for the right use case’ rule. When it comes to immersive collaboration for distributed teams, AR technology and smart glasses open new opportunities.
Snapdragon Spaces has a promising roadmap, and the fact that it evolves the SDK with other developers — listening and incorporating their feedback — makes us confident that the platform will have a meaningful impact on the XR industry.”
– Łukasz Kincel, Head of Innovation, Nomtek
Active AR games by forwARdgame
Having joined the Snapdragon Spaces™ XR Developer Platform at an early stage, forwARdgame strives to bring the joy of physical games to the “Always On” generation – with the help of AR.
The company’s co-founders Tom Minich and Tim Friedland strongly believe that XR is the future of how we interact with the world and play. Merging physical and virtual worlds, the company develops AR games that let users actively move around, while they stay immersed in the gameplay.
“Computer, console and mobile games give players breathtaking experiences, keeping them engaged for hours. Yet those games lack physical activity and face-to-face interaction,” say the co-founders. “We believe we are brave enough and open-minded enough to work with this novel technology, and that we have the creativity it takes to build magical experiences that are only possible with XR.” forwARdgame shared their thoughts and impressions about working with augmented reality at the Snapdragon Spaces developer panel at AWE.
AR experiences require more physical interaction from users than flat-screen games, so the company emphasizes making games appealing to the players. Interacting with digital content in the games is done through body movements. This helps to make players feel at home in augmented reality — specifically in headworn AR. The company relies on Positional Tracking and environmental awareness to make a strong connection between real-world objects and digital game elements.
One of the games forwARdgame built using Snapdragon Spaces SDK is FlinkAAR. In this game, the player controls a dragon by moving in the real world. As the dragon follows the player wherever they go, the goal is to lead their dragon through magic portals.
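The dragon-follows-player mechanic is a classic steering behavior driven by positional tracking: each frame, move the creature toward the player’s tracked position, clamped so it never overshoots. A minimal, engine-agnostic sketch (illustrative, not FlinkAAR’s actual code):

```python
def follow_step(dragon_pos, player_pos, speed, dt):
    """Move the dragon one frame-step toward the player's tracked
    world position, clamping the step so it never overshoots."""
    dx = tuple(p - d for p, d in zip(player_pos, dragon_pos))
    dist = sum(c * c for c in dx) ** 0.5
    if dist < 1e-6:
        return dragon_pos  # already at the player
    step = min(speed * dt, dist)
    return tuple(d + c / dist * step for d, c in zip(dragon_pos, dx))
```

Calling this once per frame with the headset’s 6DoF position as `player_pos` yields the smooth trailing motion that leads the dragon through the portals.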
AR experience highlights:
- Using players’ bodies as a controller. The game embodies the principle of active AR – to play, the users need to move around to direct their dragon through magic portals and collect crystals.
- Stability of the AR experience. The experience’s stability allows players to step into the magic-island atmosphere, walk around it and engage without losing the sense of realism.
- Movement tracking + local SLAM. Leaning into these features allowed AR game elements to stay pinned to the real world, even when the camera moves quickly as players catch hoops and crystals.
- Stable 6DoF controller. The smartphone acts as an additional controller, enabling more interaction opportunities for the players.
“Seeing the dragon’s island and walking around it is a fundamental AR experience, but doing it right feels like magic. We are very happy with local anchors. They make a serious improvement to keeping the experience in place. We make games where players constantly move around, and tracking movement can be challenging.
Snapdragon Spaces SDK offers fantastic movement tracking and local SLAM. We can create a magical world around the players and link the virtual world to the real environment and to the players themselves. This base is critical, and we’ve got it with Snapdragon Spaces. As a bonus, we have an incredibly stable 6DoF controller – the phone.”
– Tom Minich, co-founder, forwARdgame
Real-time holographic presence by MATSUKO
MATSUKO is a deep tech company that uses a combination of AR and AI to develop realistic holograms for spatial digital communication.
Leaning into their gaming, AI, and human-robot interaction background, MATSUKO’s founders Maria Vircikova and Matus Kirchmayer want users to experience the magic of life-like holographic presence. To achieve this goal, the team has developed the world’s first holographic presence app that uses a simple flow to capture and stream people as holograms in real-time.
A smartphone or computer camera captures the person, and the real-time stream is processed through an advanced 3D rendering engine into a three-dimensional holographic image. The result is a ‘virtually there’ immersive experience, displayed in a virtual environment or overlaid on a real-world setting. Patent-pending deep learning algorithms transform 2D streams into 3D, pixel by pixel. The company’s own neural networks learn to reconstruct a person, even the parts not visible to the camera. Whether real-time or pre-recorded, the holograms can be scaled to true size and reproduce the natural facial expressions and other non-verbal cues that regular online communication tools frequently lack.
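MATSUKO’s pixel-by-pixel 2D-to-3D lifting is learned, but the geometric backbone of any such pipeline is back-projection: once a depth value has been estimated for a pixel, the pinhole camera model turns that pixel into a 3D point. A sketch of that standard step (parameter names are the usual camera intrinsics, not MATSUKO’s API):

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Lift a 2D pixel (u, v) with estimated depth (meters) into a
    3D camera-space point via the pinhole camera model, where
    (fx, fy) are focal lengths and (cx, cy) the principal point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

Applying this to every pixel of a depth estimate produces the point cloud from which a hologram mesh can be reconstructed; the neural networks then fill in the parts the camera never saw.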
Combined with Snapdragon Spaces™ SDK functionality, MATSUKO provides users with an immersive holographic-meeting experience. Users can see volumetric holograms and grab and move them as they wish, thanks to the Hand Tracking feature. Plane Detection allows the user’s environment to be scanned so the hologram can be easily positioned to match the real surroundings. Taking realistic interaction to the next level, MATSUKO’s holograms can be seated across the table – just like a real counterpart would be.
MATSUKO’s patent-pending technology has attracted the interest of Europe’s leading mobile operators, including Deutsche Telekom, Orange, Telefónica, and Vodafone. After a successful proof of concept, the plan is to develop a European platform for holographic communication that leverages the capabilities of 5G to create realistic 3D imagery.
AR experience highlights:
- Simple setup. No need for pre-scans or avatar creation. Users can start on a single device and get accurate 3D capture.
- Realistic 3D Content. Meet your colleagues and friends in a virtual or real environment, see natural facial expressions and gestures.
“Snapdragon Spaces helped us to iterate and develop an immersive platform for smart glasses quickly. In the fast-paced environment of AR experiences, it was crucial to have an easy-to-use platform and device that help developers create new upgrades and improvements.
Moving efficiently through the development process was another powerful asset, strongly encouraged by the team supporting Snapdragon Spaces. We need platforms like Snapdragon Spaces that add to the current state of XR with a solid environment and stability.”
– Erik Gajdos, Head of Development, MATSUKO
Multiplayer world scale AR game by Mohx-games
Mohx-games specializes in games and in-depth AR experiences, bringing the best of both worlds together in “Soul Summoner” – a wizard-themed, location-based game for AR glasses.
Having started in augmented reality around three years ago, the company currently focuses its endeavors on entertainment and game experiences. Big believers in AR, the Mohx-games team set a goal to make more people adopt the technology, while bringing users together and fostering real human interaction.
The adoption of new technology comes with a set of challenges. For one, the technology needs to be easy to use, but also compelling, bringing something new, useful or engaging to the user. And what better medium is there than games?
“Soul Summoner” is a cross-platform multiplayer game that allows up to eight players not just to passively observe, but to become truly immersed in the experience. Unlike other AR experiences that rely on computer vision or VPS solutions and can only be played in locations with easily identifiable objects or landmarks, “Soul Summoner” can be played almost anywhere (including parks and open rooms), all while providing full immersion in an AR world. Players can take full advantage of that world by moving around the space freely, dodging spells and raising shields rather than just tapping on a screen. The AR glasses version of “Soul Summoner” is currently in open beta alongside the mobile version, with features constantly added with the help of the game’s Discord community.
AR experience highlights:
- Cross-platform. Users can play together in multiplayer from any device. Finally, mobile users can share the same experience, at the same time, as users of AR glasses.
- World scale. “Soul Summoner” is a location-based game that leverages augmented reality and AI and allows users to have an immersive experience in the real world.
- Role-playing game (RPG). Become a character in real life – players can level up their characters, learn powerful spells and team up to fight demons with other wizards.
- Immersive map. Explore the real world to find and battle monsters in your area. The action takes place on the map and in your augmented reality fights.
- Create your own quest. Immerse yourself in a magical world through AR by venturing on a quest or creating your own storyline.
“The Mohx-games team came together to bring fantasy into reality through the use of new technologies. Early posts of our gameplay got the attention of T-Mobile US and Qualcomm Technologies, and we started working with Snapdragon Spaces SDK to bring our game to smart glasses.
Constantly pushing the envelope, we try all the things you can possibly imagine for our multi-player game. As soon as new features come out, we quickly jump in to test them in our experiences. While we still have more challenges, we quickly find solutions that are beneficial for the Snapdragon Spaces program and the entire community. Eventually, our goal is to help bring people together through the lens of augmented reality.”
– Eugene Walsh, founder and COO, Mohx-games
Live AR streaming by Beem
Beem is a company on a mission to use AR to change how people communicate. Setting out to become the next evolution in communications, the company lets users beam holograms from one device to another in real time.
“Communication is most effective when people are physically in one location. Over thousands of years, our subconscious developed requirements that are only fulfilled in a physical location,” says Janosch Amstutz, Beem CEO. It’s not only about the words, facial expressions, or body language used: “The sense of presence is extremely important for building trust in communication,” explains Amstutz. And while numerous technologies are now used to communicate across distances, people are more dispersed than ever.
So how do you make sure that people connect meaningfully, and what is the true reason preventing them from doing so? According to Beem, it’s intimacy and credibility gaps that online connections cannot yet fill. To bridge those gaps, Beem enables communications by streaming “live” AR holograms that mimic physical presence.
The hologram appears in front of the viewer in their natural environment. “We are less of a design or gaming studio — we are solely focused on cracking that utility communication challenge,” adds Amstutz. True to its vision to become the next communications platform, the Beem app is built to be cross-platform and optimized to work on any AR glasses and mobile devices. The goal is to make the platform easily accessible for everyone.
AR experience highlights:
- Get your hologram ready. Users simply open the Beem app and position themselves in front of the camera so it can capture their full frame.
- Turning video into a realistic experience. Beem’s computer vision algorithm segments the person in the video from the background, processes it in real-time, and packs it into short video clips. You can send either a pre-recorded hologram or livestream as a hologram to up to 1 million viewers (available in the Beem for Business version).
- High-fidelity presence. To unpack the hologram, the viewer clicks the link and places the AR content in their environment, scaling it up or down as they like. A human hologram carries more credibility than a regular video message, eliminating the need for heavy editing and special effects.
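Beem’s computer vision model is proprietary, but the flow it describes – segment the person from the background frame by frame, then pack the segmented frames into short clips for streaming – can be sketched as follows; the threshold mask here is a hypothetical stand-in for the real learned segmentation:

```python
import numpy as np

def segment_person(frame: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Stand-in for a learned matte: keep pixels flagged as 'person'."""
    mask = frame > threshold           # real system: a neural segmentation mask
    return np.where(mask, frame, 0.0)  # zero out the background

def pack_clips(frames: list, clip_len: int = 30) -> list:
    """Group segmented frames into short fixed-length clips for streaming."""
    return [frames[i:i + clip_len] for i in range(0, len(frames), clip_len)]

# Simulate 75 frames of video, segment each, and pack into 30-frame clips.
frames = [segment_person(np.random.rand(8, 8)) for _ in range(75)]
clips = pack_clips(frames, clip_len=30)
print([len(c) for c in clips])  # [30, 30, 15]
```

On the receiving side, each clip would be unpacked and composited into the viewer’s environment as the hologram stream plays.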
“Our ultimate ambition is to give users the ability to use Beem as their calling feature for AR. We came to work with the Snapdragon Spaces XR platform through the T-Mobile accelerator and have been working closely for the past few months to become an enabling application for AR glasses.
Building on Snapdragon Spaces technology using the Unity SDK as an initial testbed was seamless and quick (three or four days). We’ve had an incredible amount of access, which has accelerated not just our deployment but also our understanding of the value Snapdragon Spaces can bring to us and what we can bring to the platform.”