Snapdragon Spaces Blog

AR’s inflection point: Dual Render Fusion is now available for mobile developers

Last week at AWE 2023, we introduced Dual Render Fusion, a new feature of the Snapdragon Spaces™ XR Developer Platform designed to help developers transform their 2D mobile applications into spatial 3D experiences with little prior AR knowledge required.

June 6, 2023

What is Dual Render Fusion?

Snapdragon Spaces Dual Render Fusion enables smartphone screens to become a primary input for AR applications, while AR glasses act as a secondary augmented display. The dual display capability allows developers and users to run new or existing apps in 2D on the smartphone screen while showing additional content in 3D in augmented reality. In practical terms, a smartphone acts as a controller for AR experiences, letting users select what they want to see in AR using familiar mobile UI and gestures. Imagine you are using your go-to maps app for sightseeing. With Dual Render Fusion, you can use the phone as usual to browse the map and at the same time, see a 3D reconstruction of historical places in AR.
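To make the dual-display idea concrete, here is a minimal Unity (C#) sketch. It stands in the phone screen and the glasses with Unity's generic multi-display API (Camera.targetDisplay and Display.displays); the actual Dual Render Fusion package ships its own components for this wiring, so treat the setup below as an illustrative assumption, not the package's API.

```csharp
using UnityEngine;

// Illustrative sketch only: the real Dual Render Fusion add-on provides
// its own components for routing rendering to the glasses. Here, Unity's
// built-in multi-display API shows the underlying idea.
public class DualDisplaySketch : MonoBehaviour
{
    [SerializeField] private Camera phoneCamera;   // renders the familiar 2D mobile UI
    [SerializeField] private Camera glassesCamera; // renders the 3D AR layer

    private void Start()
    {
        // Display 1 (index 0) is the smartphone screen.
        phoneCamera.targetDisplay = 0;

        // If a secondary display (the AR glasses) is connected, activate it
        // and send the AR camera's output there.
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();
            glassesCamera.targetDisplay = 1;
        }
    }
}
```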

Why use Dual Render Fusion in your AR experiences?

The feature makes it easier for developers to extend their 2D mobile apps into 3D spatial experiences without designing a new spatial UI from scratch. It’s also the first time in the XR industry that AR developers have a tool for combining multi-modal input with simultaneous rendering to the smartphone and AR glasses. With Dual Render Fusion, Unity developers with little to no AR experience can add an AR layer to an existing app with just a few extra lines of code, as the sketch below illustrates. The feature gives more control over app behavior in 3D space and significantly lowers the barrier to entry for AR. And that’s not all: you can combine it with all of the inputs enabled by the Snapdragon Spaces SDK, including Hand Tracking, Spatial Mapping and Meshing, Plane Detection, and Image Tracking, or go all in on the convenience of the mobile touch screen for all input.
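As a hypothetical example of the phone-as-controller pattern, the sketch below spawns a 3D object in the glasses view whenever the user taps the phone screen. The landmarkPrefab field and the placement logic are assumptions made for illustration; they are not part of the Snapdragon Spaces SDK.

```csharp
using UnityEngine;

// Hypothetical example: the phone's touch screen drives AR content shown
// on the glasses. landmarkPrefab and the placement distance are made up
// for illustration and are not Snapdragon Spaces API.
public class TouchToPlace : MonoBehaviour
{
    [SerializeField] private Camera glassesCamera;      // the AR camera rendered to the glasses
    [SerializeField] private GameObject landmarkPrefab; // 3D content to show in AR

    private void Update()
    {
        // React to a new tap on the smartphone screen.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Place the 3D model 1.5 m in front of the user's view in the
            // glasses, while the phone keeps running its ordinary 2D touch UI.
            Vector3 position = glassesCamera.transform.position
                             + glassesCamera.transform.forward * 1.5f;
            Instantiate(landmarkPrefab, position, Quaternion.identity);
        }
    }
}
```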

Why is it important?

It takes a lot of learning for developers to break into mobile AR, and even more to rethink what they already know and apply spatial design principles to headworn AR. The same applies to end users, who need to get familiar with new spatial UX/UI and input. Enabling the majority of developers to create applications that are accessible to smartphone users will unlock the potential of smartphone-based AR. The Snapdragon Spaces team has been working hard to reimagine smartphone AR’s status quo and take a leap forward by fusing the phone with AR glasses. Dual Render Fusion allows just that: it blends the simplicity and familiarity of the smartphone touch screen for input while leveraging the best of augmented reality, unlocking smartphone-powered AR’s full potential and letting us onboard an untapped market of millions of mobile developers. Hear our Director of XR Product Management, Steve Lukas, explain the vision behind the feature:


Download now

Dual Render Fusion (experimental) is now available in beta as an optional add-on package for the Snapdragon Spaces SDK for Unity, version 0.13.0 and above. Download it today and browse our Documentation to learn more. Don’t forget to share your feedback and achievements with our XR community on Discord.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.