Next-level AR workflows for creative professionals by Nomtek

Nomtek is a digital product development studio that explores new ways of innovating the workspace using XR. With their StickiesXR prototype, the company strives to make brainstorming and collaboration effortless and engaging.

“As a remote-first company, we often brainstorm online to develop concepts, ideas, and the direction we want to pursue. But remote workshops mean hours spent sitting in front of a computer and little movement,” says Łukasz Kincel, Head of Innovation at Nomtek. The team started wondering how they could combine remote and on-site workshops. As a result, they integrated the Snapdragon Spaces™ XR Developer Platform with a trusted collaboration product, Figma.

The idea was soon realized as StickiesXR. Currently compatible with the Lenovo ThinkReality A3 and Meta Quest Pro, StickiesXR transforms the traditional 2D FigJam workflow into an extended reality experience. The solution allows users to walk around the room and brainstorm with virtual sticky notes, just as they would in a physical workspace. Thanks to the Snapdragon Spaces Hand Tracking feature, notes can be easily manipulated using natural gestures. StickiesXR also synchronizes notes in real time, making sure everyone on the team stays on the same page. Furthermore, the Local Anchors feature allows the experience to realistically co-exist with the physical environment. “Our intention was to bring immersive augmented reality work environments closer to creative professionals,” Łukasz adds.
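
Keeping notes fixed in the room comes down to anchoring. The sketch below is a rough illustration only, not Nomtek’s actual code: it shows how a sticky note might be pinned in place with the Snapdragon Spaces Unity SDK, which exposes anchors through AR Foundation. The component and prefab names are assumptions.

    // Minimal sketch: pinning a sticky note in the room with a local anchor.
    // Assumes the Snapdragon Spaces Unity SDK with AR Foundation's anchor
    // subsystem enabled and an ARAnchorManager on the XR Origin.
    // "notePrefab" is a hypothetical sticky-note asset.
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class StickyNoteAnchor : MonoBehaviour
    {
        public GameObject notePrefab; // hypothetical sticky-note prefab

        // Spawn a note at a world-space pose and pin it so it stays fixed
        // in the physical room as the user walks around.
        public void PlaceNote(Pose pose)
        {
            GameObject note = Instantiate(notePrefab, pose.position, pose.rotation);
            // Adding an ARAnchor component registers the note with the
            // underlying anchor subsystem (Local Anchors on Snapdragon Spaces).
            note.AddComponent<ARAnchor>();
        }
    }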

Leaning into the flexibility that smart glasses provide, StickiesXR gives users the freedom to move around while they work on ideas. Nomtek’s goal is to empower distributed teams to collaborate efficiently across geographical constraints. The company hopes the FigJam prototype will boost creativity and help professionals increase their productivity – in the office or at home.

AR experience highlights:

  • Quick access. A virtual brainstorming session is just a few simple steps away – all it takes is connecting the Lenovo ThinkReality A3 smart glasses to a Motorola edge+ phone and launching the application.
  • Hand gestures. With the Lenovo A3, users don’t need external controllers — they navigate with hand gestures.
  • Easy setup. StickiesXR integration uses an easy-to-add plugin, and the setup is simple: just enter a three-digit code.

“We’re firm believers in ‘the right technology for the right use case’ rule. When it comes to immersive collaboration for distributed teams, AR technology and smart glasses open new opportunities.

Snapdragon Spaces has a promising roadmap, and the fact that it evolves the SDK with other developers — listening and incorporating their feedback — makes us confident that the platform will have a meaningful impact on the XR industry.”

– Łukasz Kincel, Head of Innovation, Nomtek


Active AR games by forwARdgame

Having joined the Snapdragon Spaces™ XR Developer Platform at an early stage, forwARdgame strives to bring the joy of physical games to the “Always On” generation – with the help of AR.

The company’s co-founders Tom Minich and Tim Friedland strongly believe that XR is the future of how we interact with the world and play. Merging physical and virtual worlds, the company develops AR games that let users actively move around, while they stay immersed in the gameplay.


“Computer, console, and mobile games give players breathtaking experiences, keeping them engaged for hours. Yet those games lack physical activity and face-to-face interaction,” say the co-founders. “We believe we are brave enough and open-minded enough to work with this novel technology, and that we have the creativity it takes to build magical experiences that are only possible with XR.”

forwARdgame shared their thoughts and impressions about working with augmented reality at the Snapdragon Spaces developer panel at AWE:

AR experiences require more physical interaction from users than flat-screen games, so the company puts particular emphasis on making that interaction appealing to players. Interaction with digital content in the games happens through body movements, which helps players feel at home in augmented reality — specifically in headworn AR. The company relies on Positional Tracking and environmental awareness to make a strong connection between real-world objects and digital game elements.

One of the games forwARdgame built using the Snapdragon Spaces SDK is FlinkAAR. In this game, the player controls a dragon by moving in the real world. As the dragon follows the player wherever they go, the goal is to lead it through magic portals.
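
In gameplay terms, the follow mechanic boils down to steering the dragon toward the headset’s tracked pose. The sketch below is a minimal, illustrative Unity behaviour, not forwARdgame’s code, that makes an object trail the player’s head, assuming the referenced transform is the AR camera driven by positional tracking.

    // Illustrative sketch (not forwARdgame's code): a creature that trails
    // the player's tracked head position, as described for FlinkAAR.
    using UnityEngine;

    public class FollowPlayer : MonoBehaviour
    {
        public Transform playerHead;      // e.g. the AR camera driven by positional tracking
        public float followDistance = 2f; // how far behind the player the dragon hovers
        public float smoothing = 2f;      // larger = snappier following

        void Update()
        {
            // Target a point slightly behind and above the player's head.
            Vector3 target = playerHead.position
                           - playerHead.forward * followDistance
                           + Vector3.up * 0.5f;

            // Smoothly move and turn toward the player so the motion feels organic.
            transform.position = Vector3.Lerp(transform.position, target, smoothing * Time.deltaTime);
            Quaternion look = Quaternion.LookRotation(playerHead.position - transform.position);
            transform.rotation = Quaternion.Slerp(transform.rotation, look, smoothing * Time.deltaTime);
        }
    }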

AR experience highlights:

  • Using players’ bodies as a controller. The game embodies the principle of active AR – to play, the users need to move around to direct their dragon through magic portals and collect crystals.
  • Stability of the AR experience. The experience’s stability allows players to step into the magic island atmosphere, walk around it, and engage without losing the sense of realism.
  • Movement tracking + local SLAM. Leaning into these features allowed AR game elements to stay pinned to the real world, even when the camera moves quickly as players catch hoops and crystals.
  • Stable 6DoF controller. The smartphone acts as an additional controller, enabling more interaction opportunities for the players.

“Seeing the dragon’s island and walking around it is a fundamental AR experience, but doing it right feels like magic. We are very happy with local anchors. They make a serious improvement to keeping the experience in place. We make games where players constantly move around, and tracking movement can be challenging.

Snapdragon Spaces SDK offers fantastic movement tracking and local SLAM. We can create a magical world around the players and link the virtual world to the real environment and to the players themselves. This base is critical, and we’ve got it with Snapdragon Spaces. As a bonus, we have an incredibly stable 6DoF controller – the phone.”

– Tom Minich, co-founder, forwARdgame

Real-time holographic presence by MATSUKO

MATSUKO is a deep tech company that uses a combination of AR and AI to develop realistic holograms for spatial digital communication.

Leaning into their gaming, AI, and human-robot interaction background, MATSUKO’s founders Maria Vircikova and Matus Kirchmayer want users to experience the magic of life-like holographic presence. To achieve this goal, the team has developed the world’s first holographic presence app that uses a simple flow to capture and stream people as holograms in real-time.

A smartphone or computer camera captures the person and streams the footage in real time. Patent-pending deep learning algorithms transform the 2D stream into 3D, pixel by pixel, and the result is processed through an advanced 3D rendering engine to deliver a ‘virtually there’ immersive experience, displayed in a virtual environment or overlaid on a real-world setting. The company’s own neural networks learn to reconstruct a person, even the non-visible parts. Whether real-time or pre-recorded, the holograms can be scaled to their true size and convey the natural facial expressions and other non-verbal cues that regular online communication tools frequently lack.

Combined with Snapdragon Spaces™ SDK functionality, MATSUKO provides users with an immersive experience of holographic meetings. Users can see volumetric holograms and, thanks to the Hand Tracking feature, grab and move them around as they wish. Plane Detection scans the user’s environment so the hologram can be easily positioned to match the real surroundings. Taking realistic interaction to the next level, MATSUKO’s holograms can be seated across the table – just like a real counterpart would be.
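
Placing a hologram on a detected surface is typically done by raycasting against the planes the system has found. The following is a minimal sketch, not MATSUKO’s implementation, using AR Foundation’s raycast API as exposed through the Snapdragon Spaces Unity SDK; the hologram prefab and placement trigger are assumptions.

    // Illustrative sketch (not MATSUKO's implementation): placing a hologram
    // on a detected plane via AR Foundation's raycast against plane geometry.
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class HologramPlacer : MonoBehaviour
    {
        public ARRaycastManager raycastManager; // provided by the AR session rig
        public GameObject hologramPrefab;       // hypothetical hologram asset

        static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

        // Cast from a screen point (e.g. a tap) onto detected planes and
        // place the hologram at the hit pose so it sits in the real room.
        public void PlaceAt(Vector2 screenPoint)
        {
            if (raycastManager.Raycast(screenPoint, hits, TrackableType.PlaneWithinPolygon))
            {
                Pose hitPose = hits[0].pose;
                Instantiate(hologramPrefab, hitPose.position, hitPose.rotation);
            }
        }
    }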

MATSUKO’s patent-pending technology has attracted the interest of Europe’s leading mobile operators, including Deutsche Telekom, Orange, Telefónica, and Vodafone. After a successful proof of concept, the plan is to develop a European platform for holographic communication that leverages the capabilities of 5G to create realistic 3D imagery.

AR experience highlights:

  • Simple setup. No need for pre-scans or avatar creation. Users can start on a single device and get accurate 3D capture.
  • Realistic 3D Content. Meet your colleagues and friends in a virtual or real environment, see natural facial expressions and gestures.

“Snapdragon Spaces helped us to iterate and develop an immersive platform for smart glasses quickly. In the fast-paced environment of AR experiences, it was crucial to have an easy-to-use platform and device that help developers create new upgrades and improvements.

Moving efficiently through the development process was another powerful asset, strongly encouraged by the team supporting Snapdragon Spaces. We need platforms like Snapdragon Spaces that add to the current state of XR with their solid environment and stability.”

– Erik Gajdos, Head of Development, MATSUKO

Multiplayer world scale AR game by Mohx-games

Mohx-games specializes in games and in-depth AR experiences, bringing together the best of both worlds in “Soul Summoner” – a wizard-themed, location-based game for AR glasses.

Having started in augmented reality around three years ago, the company currently focuses its endeavors on entertainment and game experiences. Big believers in AR, the Mohx-games team set a goal to make more people adopt the technology, while bringing users together and fostering real human interaction.


The adoption of new technology comes with a set of challenges. For one, the technology needs to be easy to use, but also compelling, bringing something new, useful or engaging to the user. And what better medium is there than games?

“Soul Summoner” is a cross-platform multiplayer game that allows up to eight players not just to passively observe, but to become truly immersed in the experience. Unlike other AR experiences that rely on computer vision or VPS solutions and can only be played in locations with easily identifiable objects or landmarks, “Soul Summoner” can be played almost anywhere (including parks and open rooms) – all while providing full immersion in an AR world. Players can take full advantage of the world by moving around the space freely, dodging spells and using shields, as opposed to just tapping on the screen. “Soul Summoner” on AR glasses is currently in open beta alongside the mobile version, with features constantly being added with the help of the game’s Discord community.

AR experience highlights:

  • Cross-platform. Users can play together in multiplayer from any device. Mobile users can finally share the same experience, at the same time, with users of AR glasses.
  • World scale. “Soul Summoner” is a location-based game that leverages augmented reality and AI and allows users to have an immersive experience in the real world.
  • Role-playing game (RPG). Become a character in real life – players can level up their characters, learn powerful spells and team up to fight demons with other wizards.
  • Immersive map. Explore the real world to find and battle monsters in your area. The action takes place on the map and in your augmented reality fights.
  • Create your own quest. Immerse yourself in a magical world through AR by venturing on a quest or creating your own storyline.

“The Mohx-games team came together to bring fantasy into reality through the use of new technologies. Early posts of our gameplay got the attention of T-Mobile US and Qualcomm Technologies, and we started working with Snapdragon Spaces SDK to bring our game to smart glasses.

Constantly pushing the envelope, we try all the things you can possibly imagine for our multi-player game. As soon as new features come out, we quickly jump in to test them in our experiences. While we still have more challenges, we quickly find solutions that are beneficial for the Snapdragon Spaces program and the entire community. Eventually, our goal is to help bring people together through the lens of augmented reality.”

– Eugene Walsh, founder and COO, Mohx-games

Live AR streaming by Beem

Beem is a company on a mission to change how people communicate — with the help of AR. Setting out to become the next evolution in communications, the company allows users to beam holograms from one device to another in real time.

“Communication is most effective when people are physically in one location. Over thousands of years, our subconscious developed requirements that are only fulfilled in a physical location,” says Janosch Amstutz, Beem CEO. It’s not only about words, facial expressions, or body language used. “The sense of presence is extremely important for building trust in communication,” explains Amstutz. And while numerous different technologies are used nowadays to communicate across distances, people are more dispersed than ever.


So how do you make sure that people connect meaningfully, and what is the true reason preventing them from doing so? According to Beem, it’s intimacy and credibility gaps that online connections cannot yet fill. To bridge those gaps, Beem enables communications by streaming “live” AR holograms that mimic physical presence.

The hologram appears in front of the viewer in their natural environment. “We are less of a design or gaming studio — we are solely focused on cracking that utility communication challenge,” adds Amstutz. True to its vision to become the next communications platform, the Beem app is built to be cross-platform and optimized to work on any AR glasses and mobile devices. The goal is to make the platform easily accessible for everyone.

AR experience highlights:

  • Get your hologram ready. To get started, users open the Beem app and position themselves in front of the camera so it captures their full frame.
  • Turning video into a realistic experience. Beem’s computer vision algorithm segments the person in the video from the background, processes it in real time, and packs it into short video clips. You can send either a pre-recorded hologram or livestream yourself as a hologram to up to 1 million viewers (available in the Beem for Business version).
  • High-fidelity presence. To unpack the hologram, the viewer clicks the link and places the AR content in their environment, scaling it up or down as they like. The human hologram carries more credibility than regular video messages, eliminating the need for heavy editing and special effects.

“Our ultimate ambition is to give users the ability to use Beem as their calling feature for AR. We came to work with the Snapdragon Spaces XR platform through the T-Mobile accelerator and have been working closely for the past few months to become an enabling application for AR glasses.

Building into Snapdragon Spaces tech using Unity SDK as an initial testbed was seamless and quick (three or four days). We’ve had an incredible amount of access, which has accelerated not just our deployment but also the understanding of the value of what Snapdragon Spaces can bring to us and what we can bring to the platform.”

– Janosch Amstutz, CEO, Beem

Hand Tracking and Companion Controller prototypes by Designium

Designium is an award-winning technology and design company that merges technology and creativity to create new AR experiences. Selected in the first cohort of the Snapdragon Spaces™ Pathfinder program, Designium created four prototypes to experiment with various features.

The company has worked in the AR and XR field since its early days and focuses on digital content applications that combine AR and VPS. As Kuan Ying Wu, Designium’s interactive engineer and designer, explains, developing demo prototypes begins with trying new technology samples to identify the features worth highlighting.

After shortlisting features, he creates quick sketches to generate three to five ideas. Sketches help identify one or two ideas most suitable for development and are then expanded into various scenarios. This approach also helps to think through the functionality and UX details of the experience.

Next up is testing basic functions to understand the target features and whether the UX design can be implemented. Each prototype’s goal is to showcase the features, so the design is adjusted around that goal. In the final stage, assets and models are used to create mock-ups, combining functions and materials to make the prototypes workable.

Using the Snapdragon Spaces Unity SDK, Designium selected the Hand Tracking and Companion Controller features. Four demo prototypes put these features to the test in various scenarios.

AR demos overview:

  • Fruit Scale: presents optical measurement wrapped in a simple user experience. The prototype educates users about the size of objects in AR space.
  • Sushi Grab: advanced practice for showcasing the Hand Tracking feature.
  • Virtual Studio: shows how to lay out functional elements in the space using both hands. The same idea could be applied to interior design, urban planning, and architecture.
  • Hologram: shows the Companion Controller combined with users’ familiar smartphone interaction habits. The feature provides added value when used to drive interactions on AR glasses.

The Fruit Scale prototype shows the basic functionality of the Hand Tracking feature. Leaning into the familiar gesture people use when showing the size of an object with their hands, the experience recreates the same process with an AR measuring overlay.
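
At its core, the measuring overlay reduces to the distance between the two tracked hands. Below is a minimal, illustrative sketch, not Designium’s code, that assumes two transforms driven by the Hand Tracking feature and displays the span between them; the field names are assumptions.

    // Illustrative sketch of the "show the size with your hands" idea:
    // measure the gap between two transforms assumed to be driven by the
    // Hand Tracking feature (e.g. left and right palm poses).
    using UnityEngine;

    public class HandSpanMeasure : MonoBehaviour
    {
        public Transform leftHand;   // assumed: driven by hand tracking
        public Transform rightHand;  // assumed: driven by hand tracking
        public TextMesh label;       // world-space readout shown between the hands

        void Update()
        {
            float spanMeters = Vector3.Distance(leftHand.position, rightHand.position);
            label.text = $"{spanMeters * 100f:0} cm";
            // Keep the readout floating midway between the hands.
            label.transform.position = (leftHand.position + rightHand.position) * 0.5f;
        }
    }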

The Sushi Grab demo continues to explore the Hand Tracking functionality, with pinching and releasing gestures that interact with different objects in the environment.

The Virtual Studio presents the idea of interacting with virtual objects much like one does in the Unity Scene window. Using Hand Tracking and the Unity MICH-L project, the experience shows how the camera, lighting, and particle system can be moved and rotated in real time in AR space.

The Hologram prototype is built upon the basic functionality of the Companion Controller. The prototype explores a swipe behavior (common on smartphones) to show, hide, and rotate the hologram. The touchpad’s standard XY-axis input is used to provide extended control options: users swipe up and down to show or hide content and swipe left and right to rotate it.
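
The described mapping can be read from a controller’s standard 2D-axis input. The sketch below is illustrative only, not Designium’s code: it uses Unity’s generic XR input API to turn vertical swipes into show/hide and horizontal swipes into rotation. The XR node used for the companion phone, the thresholds, and the object references are assumptions.

    // Illustrative sketch of the described touchpad mapping: vertical swipes
    // toggle the hologram, horizontal swipes rotate it. Uses Unity's generic
    // XR input API; the companion phone is assumed to surface as a
    // right-hand controller here.
    using UnityEngine;
    using UnityEngine.XR;

    public class HologramSwipeControl : MonoBehaviour
    {
        public GameObject hologram;          // hypothetical hologram object
        public float rotateSpeed = 90f;      // degrees per second at full deflection
        public float toggleThreshold = 0.7f; // assumed swipe threshold

        void Update()
        {
            var controller = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
            if (controller.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis))
            {
                // Swipe left/right: rotate the hologram around its vertical axis.
                hologram.transform.Rotate(0f, axis.x * rotateSpeed * Time.deltaTime, 0f);

                // Swipe up/down past a threshold: show or hide the content.
                if (axis.y > toggleThreshold) hologram.SetActive(true);
                else if (axis.y < -toggleThreshold) hologram.SetActive(false);
            }
        }
    }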



“The Snapdragon Spaces Unity SDK helps you quickly develop interesting and immersive AR experiences for smart glasses. The Unity SDK uses AR Foundation and OpenXR as the underlying layer, which helps creators with experience in mobile AR development get started. Although the platform is still evolving, each update brings many pleasant surprises, and the improvements are considerable. Many of the questions asked in the forums are resolved in the next update. The replies on the forum are very quick, and the answers often help solve the problems immediately. The ThinkReality A3 + Motorola edge+ phone make great AR experience hardware. The glasses have a wide FOV, making them comfortable to wear and promising for creating experiences. We need such AR wearable devices to bring more different experiences to life in the future.”

– Kuan Ying Wu, Interactive Engineer/Designer, Designium

Snapdragon Spaces Pathfinder is a program of Qualcomm Technologies, Inc. and/or its subsidiaries.

Padres Hall of Fame AR experience by Rock Paper Reality

Rock Paper Reality (RPR) developed an all-new immersive, multi-player experience for the San Diego Padres baseball team. Featured in the Padres Hall Of Fame at Petco Park, the experience was built with the Snapdragon Spaces™ XR Developer Platform, Motorola edge+, and Lenovo’s ThinkReality A3 smart glasses.

To play, up to four players each select a different colored ball that pops up from a virtual podium in front of them. Once everyone has selected their ball, a virtual portal opens on the physical wall of the HOF, revealing a 1:1 representation of Petco Park and holograms of Padres’ Hall of Famers.

The goal is to hit baseballs through virtual rings floating throughout the stadium. Using head position to aim, players hit the ball by simply tapping the Motorola edge+ screen, and a baseball launches from the A3 glasses. The easiest shots score as a single; the hardest ring to hit, at the back, is a grand slam. Each shot triggers a celebratory animation.
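
The core loop is “aim with your head, tap to hit.” The sketch below is a minimal illustration, not RPR’s code, of that idea in Unity: a screen tap on the companion phone launches a physics-driven ball along the head-tracked camera’s forward vector; the prefab and launch speed are assumptions.

    // Illustrative sketch (not RPR's code) of the "aim with your head,
    // tap to hit" loop: a tap on the phone launches a ball along the
    // AR camera's forward direction.
    using UnityEngine;

    public class BaseballLauncher : MonoBehaviour
    {
        public Camera arCamera;          // the head-tracked AR camera
        public Rigidbody ballPrefab;     // hypothetical baseball prefab with physics
        public float launchSpeed = 12f;  // meters per second

        void Update()
        {
            // A screen tap on the companion phone triggers the shot.
            if (Input.GetMouseButtonDown(0))
            {
                Rigidbody ball = Instantiate(
                    ballPrefab,
                    arCamera.transform.position,
                    arCamera.transform.rotation);
                // Head pose provides the aim: fire along the camera's forward vector.
                ball.AddForce(arCamera.transform.forward * launchSpeed, ForceMode.VelocityChange);
            }
        }
    }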

AR experience highlights:

  • Ease of use. RPR knew this would be the first time most people had ever used smart glasses and augmented reality. Players don’t have to be pro gamers to score high, which creates a really fun atmosphere for anyone to play, whether it’s someone in their 80s, a child, or a complete novice to virtual gaming.
  • Detailed 3D Content. For the agency, look and feel were essential in bringing the experience to life. RPR wanted the stadium and players to be unmistakable. The team used a drone to capture thousands of photos of Petco Park and used photogrammetry to stitch them together. They also captured Petco Park as a reference to model the entire stadium, and then added 3D models and animation. The result: a sci-fi sports center from the future meets the retro, throwback look and feel of the Padres.
  • Portal Augmented Reality. For a stadium-scale AR experience, the size of the environment was the biggest constraint. To overcome it, the studio created a virtual portal overlaid on the wall, giving users a line of sight into the 3D Petco Park as if they were standing on home plate. While the space and UX required users to stay stationary, AR content and a virtual portal brought in the action.
  • Multiplayer. What better way to get a group of friends and families excited about AR and the Padres’ HOF than to play against one another in a high-action, arcade-style game? Since not everyone wants to wear a headworn device, the backend allowed two players to play with the Lenovo A3 and two players on the Motorola edge+. All the devices are set up on a table, and all four players play against each other simultaneously. The user with the highest score gets a special AR trophy and champion bragging rights.
  • Spectator View. With the objective of enticing everyone to play and putting on an exciting show, RPR set up a webcam above the players’ backs that streamed the players’ view and the surrounding audience onto TV screens. Just as each player can see their opponent shooting balls, a spectator view from a “5th player perspective” live-streamed the game, adding excitement in real time.

“We were thrilled when Motorola approached us to become their immersive technology partner and develop an AR experience for the Padres. RPR’s leadership spun out from headworn pioneer ODG and has been developing smart glasses experiences for almost 13 years. We’ve been developing AR headworn experiences on Snapdragon technology since 2015, starting with the Snapdragon 805.

The team supporting Snapdragon Spaces couldn’t have been more helpful or easier to work with throughout the development process. Since we were pushing the boundaries of new AR hardware and software, we communicated closely with the team for creative brainstorming around technical challenges. Those calls often resulted in solutions and helped keep development progress moving ahead and on schedule.”

– Preston Platt, Rock Paper Reality CTO