Layering spatial XR experiences onto mobile apps with Dual Render Fusion

Snapdragon Spaces Blog

We are still buzzing with excitement from our announcement of Dual Render Fusion at AWE 2023 – a new feature of the Snapdragon Spaces™ XR Developer Platform. A number of developers are already building apps with Dual Render Fusion, and we can’t wait to see what you will create. But first, let us show you how it works.

June 28, 2023

Think about transforming your 2D mobile application into a 3D spatial experience. Dual Render Fusion allows a Snapdragon Spaces-based app to render simultaneously to the smartphone and to a connected headworn display, such as the ThinkReality A3 included in the Snapdragon Spaces dev kit. In this setup, the smartphone serves as both a physical controller and the primary display (e.g., for rendering user interfaces and conventional 3D graphics), while the headworn XR display provides a secondary spatial XR view in real time.

From a technical standpoint, the app now has two cameras (viewports) rendering from the same scene graph in a single Activity. The image below shows how developers can enable two cameras in a Unity project with Dual Render Fusion by selecting a Target Eye for each camera in the Inspector:
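In script form, the same setup can be sketched roughly as follows. This is an illustrative sketch using standard Unity camera properties; the component and field names are our own and are not part of the Snapdragon Spaces API:

```csharp
using UnityEngine;

// Illustrative sketch: one scene graph, two cameras, two displays.
// Component and field names are assumptions, not Snapdragon Spaces API.
public class DualRenderSetup : MonoBehaviour
{
    public Camera phoneCamera;   // renders UI + 3D scene to the smartphone
    public Camera headsetCamera; // renders the spatial view to the XR glasses

    void Start()
    {
        // Phone camera: no stereo target eye, so it draws to the device screen.
        phoneCamera.stereoTargetEye = StereoTargetEyeMask.None;

        // Headset camera: render to both eyes of the headworn display.
        headsetCamera.stereoTargetEye = StereoTargetEyeMask.Both;

        // Both cameras draw the same scene graph in the same Activity,
        // so no synchronization code is needed between the two views.
    }
}
```

The same Target Eye choice can be made per camera in the Unity Inspector, which is the workflow the screenshot illustrates.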

Once enabled, you can preview both cameras. The screenshot below shows Dual Render Fusion running in the Unity editor and illustrates how perception features can be layered onto the 2D smartphone experience.

The left viewport shows the 3D scene view in Unity. The middle shows a simulation of the primary (on-smartphone) display with the 3D scene and user interface controls. The right viewport simulates the rendering for the secondary (headworn XR) display, in this case with simulated Hand Tracking enabled. The beauty of this setup is that the cube can be manipulated either by the smartphone touchscreen controls or by perception-based hand tracking, with no networking or synchronization code required.
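As a rough sketch of the touch side of that interaction, the following script rotates the cube from smartphone touch input using standard Unity touch APIs. The script name and tuning value are our own; hand-tracking input would come from the Snapdragon Spaces Hand Tracking feature instead:

```csharp
using UnityEngine;

// Illustrative sketch: rotate a cube with smartphone touch input.
// Because both cameras render the same scene graph, the rotation is
// visible on the phone and in the XR glasses with no extra sync code.
public class TouchRotate : MonoBehaviour
{
    public float degreesPerPixel = 0.25f; // tuning value, an assumption

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
            {
                // Horizontal drag spins the cube around the world Y axis.
                transform.Rotate(Vector3.up,
                                 -touch.deltaPosition.x * degreesPerPixel,
                                 Space.World);
            }
        }
    }
}
```

Attached to the cube, this handles the on-phone input path; the headworn display simply shows the result of the same scene-graph change.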

Layering on spatial experiences for your users

Today, users are comfortable with their current smartphone experiences. But just as the first smartphones drove a paradigm shift in human-machine interaction (HMI), we’re now at an inflection point where XR is driving a shift towards spatial experiences that enhance users’ daily lives.
That’s why we believe Dual Render Fusion is so important. You can use it to introduce your users to spatial concepts while taking full advantage of existing smartphone interactions. Apps can now be experienced in new immersive ways while retaining familiar control mechanisms. Best of all, your users don’t have to give up their smartphones. Instead, they can reap the benefits of spatial XR experiences that you’ve layered on.

A great example of this is Virtual Places, demoed at our AWE 2023 booth. Using Snapdragon Spaces and Dual Render Fusion, Virtual Places enhances conventional 2D map navigation with a 3D spatial view. Users can navigate maps with their smartphone while gaining visual previews of how an area actually looks. The multi-touch interface allows a familiar pinch-and-zoom interaction to manipulate the map view on the phone, while a 3D visualization provides an immersive experience in the glasses.

Table Trenches by DB Creations, also demoed at our AWE 2023 booth, provided an example of a game that integrated Dual Render Fusion with relative ease.

DB Creations co-founder Blake Gross had this to say about the process: “…with Fusion, we are able to create experiences that anyone, without AR experience, can pick up and use by utilizing known mobile interaction paradigms. With Table Trenches, this was especially useful, because we were able to take the UI out of the world and into a familiar touch panel. Additionally, Fusion enables the smartphone screen to be a dynamic interface, so we can change how the phone behaves. We do this in Table Trenches when switching between surface selection and gameplay. Fusion was easy to integrate into our app since it uses the familiar Unity Canvas. Since our game was already built to utilize the Unity Canvas, it was as simple as setting up a new camera, and reconfiguring the layout to best fit the Snapdragon Spaces developer kit. We noticed at AWE how easy it was for new users to pick up and play our game without us needing to give any manual explanation of what to do.”

Developing With Dual Render Fusion

With the number of global smartphone users estimated at around 6.6 billion in 2031, there is a large market for mobile developers to tap into and offer new experiences. Snapdragon Spaces and Dual Render Fusion facilitate rapid porting of existing Unity mobile apps and demos to XR. You can create new 3D XR experiences, or extend existing 2D mobile applications built with Unity, with few or no code changes required to get started. The general process goes like this:

  • Create a new 3D or 3D (URP) project in Unity.
  • Import the Snapdragon Spaces SDK and Dual Render Fusion packages.
  • Configure the settings for OpenXR and Snapdragon Spaces SDK integration.
  • Use the Project Validator to easily update your project and scene(s) for Dual Render Fusion with just a few clicks.
  • Build your app.

You can read more about the process in our Dual Render Fusion Scene Setup guide for Unity.

Download now

Ready to start developing with Snapdragon Spaces and Dual Render Fusion? Download the SDK and browse our Documentation to get started.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

Snapdragon Spaces Blog

AR’s inflection point: Dual Render Fusion feature is now available for mobile developers

Last week at AWE 2023, we introduced Dual Render Fusion – a new feature of the Snapdragon Spaces™ XR Developer Platform designed to help developers transform their 2D mobile applications into spatial 3D experiences with little prior knowledge required.

June 6, 2023

What is Dual Render Fusion?

Snapdragon Spaces Dual Render Fusion enables smartphone screens to become a primary input for AR applications, while AR glasses act as a secondary augmented display. The dual display capability allows developers and users to run new or existing apps in 2D on the smartphone screen while showing additional content in 3D in augmented reality. In practical terms, a smartphone acts as a controller for AR experiences, letting users select what they want to see in AR using familiar mobile UI and gestures. Imagine you are using your go-to maps app for sightseeing. With Dual Render Fusion, you can use the phone as usual to browse the map and at the same time, see a 3D reconstruction of historical places in AR.

Why use Dual Render Fusion in your AR experiences?

The feature makes it easier for developers to extend their 2D mobile apps into 3D spatial experiences without creating a new spatial UI. It’s also the first time in the XR industry that AR developers have a tool to combine multi-modal input with simultaneous rendering to the smartphone and AR glasses. With Dual Render Fusion, Unity developers with little to no AR knowledge can easily add an AR layer to their existing app using just a few extra lines of code. The feature gives more control over app behavior in 3D space, significantly lowering the entry barrier to AR. But that’s not all – you also have the option to utilize all available inputs enabled by the Snapdragon Spaces SDK, including Hand Tracking, Spatial Mapping and Meshing, Plane Detection, and Image Tracking, or rely entirely on the convenience of the mobile touchscreen for input.

Why is it important?

It takes a lot of learning for developers to break into mobile AR, and even more to rethink what they already know in order to apply spatial design principles to headworn AR. The same applies to end users, who need to become familiar with new spatial UX/UI and input. Enabling the majority of developers to create applications that are accessible to smartphone users will unlock the great potential of smartphone-based AR. The Snapdragon Spaces teams have been working hard to reimagine smartphone AR’s status quo and take a leap forward to fuse the phone with AR glasses. The Dual Render Fusion feature allows just that – blending the simplicity and familiarity of the smartphone touchscreen for input while leveraging the best of augmented reality. Dual Render Fusion unlocks smartphone-powered AR to its full potential, allowing us to onboard and activate an untapped market of millions of mobile developers. Hear our Director of XR Product Management, Steve Lukas, explain the vision behind the groundbreaking feature:

Download now

Dual Render Fusion (experimental) is now available in beta as an optional add-on package for Snapdragon Spaces SDK for Unity version 0.13.0 and above. Download today and browse our Documentation to learn more. Don’t forget to share your feedback and achievements with our XR community on Discord.