AI-based fitness AR routine with Litesport

[Image: https://cdn.spaces.qualcomm.com/wp-content/uploads/2023/11/09104702/litesport.jpg]

Using Snapdragon Spaces™ XR Developer Platform capabilities, Litesport created an AI-based fitness application demo for Snapdragon Summit. The demo combined augmented reality, biometric feedback, and AI pose detection algorithms powered by ASENSEI. The cross-device AR experience takes advantage of an ecosystem of a smartphone, a smartwatch, and AR glasses, all powered by Snapdragon® chipsets.

(c) Litesport

Litesport develops fitness applications that use virtual reality, mixed reality and augmented reality to enhance the way people exercise and stay active. Their Litesport application combines virtual reality with real workout modalities like Strength Training, Bootcamp and Boxing. The company’s immersive, interactive and personalized approach to fitness consistently merges the physical and virtual worlds.

Building a cross-device fitness app

Litesport saw the opportunity to create an artificial intelligence (AI) based fitness routine that would capitalize on the advantages of a smartphone, a smartwatch and AR glasses, leveraging the full ecosystem of devices powered by Snapdragon.

As members of the Snapdragon Spaces Pathfinder program, they worked with Qualcomm Technologies, Inc. to configure a cross-device solution powered by the Snapdragon family of mobile processors:

  • The smartphone is a reference device built on the Snapdragon 8 Gen 3 Mobile Platform. The phone runs software from ASENSEI that processes AI-based body pose data and provides real-time feedback and corrective actions.
  • The smartwatch, powered by the Snapdragon W5+ Gen 1 Wearable Platform, gathers the user’s biometric data in real time and transmits it to the smartphone for consolidation.
  • The wireless AR glasses run the Snapdragon AR2 Gen 1 Platform, adding the visual dimension of a personal trainer guiding the session.

The result is an immersive guided AR workout where the user does not just passively watch content but gets real-time feedback on their technique, just like in a real-life training session. The AI algorithm behind the experience uses the smartphone camera (pointed at the user) to recognize the person’s movements and mirror them back to the user during the session, enabling real-time contextual trainer feedback. As a result, users can ensure their form is correct and maximize their workout. Thanks to the connected smartwatch, the application also has access to biometric data, providing a comprehensive view of workout performance as well as personalized reminders and workout suggestions.
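
To make the watch-to-phone link above concrete, here is a minimal, hypothetical Kotlin sketch of a Wear OS component that reads heart rate on the watch and forwards each sample to the paired phone over the Wear OS Data Layer. The message path, encoding, and phone-side consolidation are assumptions for illustration only, not Litesport’s actual implementation.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import com.google.android.gms.wearable.Wearable

// Hypothetical watch-side sketch (requires the BODY_SENSORS permission):
// read heart rate and forward each sample to the paired phone, where it can
// be consolidated with the AI pose feedback driving the in-headset workout UI.
class HeartRateForwarder(
    private val context: Context,
    private val phoneNodeId: String
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        val heartRate = sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE) ?: return
        sensorManager.registerListener(this, heartRate, SensorManager.SENSOR_DELAY_NORMAL)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val bpm = event.values[0]
        // Push the latest reading to the phone over the Wear OS Data Layer.
        // The path "/biometrics/heart_rate" is an assumption for this sketch.
        Wearable.getMessageClient(context)
            .sendMessage(phoneNodeId, "/biometrics/heart_rate", bpm.toString().toByteArray())
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```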


Developing on Snapdragon Spaces

To build as much interactivity as possible into their AR application, Litesport worked with the Snapdragon Spaces XR Developer Platform. The platform not only offers a stack of perception features that help developers create XR experiences, but also provides a single SDK with the functions and APIs needed for AR glasses. Litesport designed and built their AR application on the wired Lenovo ThinkReality A3. Then, because Snapdragon Spaces unifies development across AR glasses, they were easily able to run their demo at Snapdragon Summit on wireless reference-design glasses that support Snapdragon Spaces and Snapdragon AR2 technology. With only limited additional code, Litesport was pleased to find that the reference-design glasses supported their app just as they had designed it for the A3.

“The fact that Snapdragon enables experiences built across devices is huge,” says Jeffrey Morin, CEO and co-founder of Litesport. “It’s going to allow us to deploy on any new device that will come out in the future. With processing happening on the phone, devices can become increasingly lightweight, which is crucial for connected fitness.”

“By getting accurate form tracking and correction through the Asensei software on the phone, we were able to create a ‘choose your own adventure’ style experience, where the trainer gave different coaching tips based on what you did live,” adds Morin.

“For example, encouraging you to go deeper in your squat, or being mindful of leaning too far in one direction while doing lunges. This creates something more immersive, personalized and helpful to the user in terms of an AR fitness experience, like live 1:1 personal coaching, rather than just watching a single, standard, pre-canned workout. The AR glasses provide an unparalleled, real and clear representation of your personal space, and so the experience of seeing a live trainer in front of you is more magical, immersive and impressive, not to mention the form factor is much more comfortable and practical for working out.”
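
To illustrate the kind of form-based coaching described above, here is a small, hypothetical Kotlin sketch: compute a joint angle from pose keypoints and derive a coaching cue from it. The keypoint format, the 120-degree threshold, and the cue text are assumptions for illustration; ASENSEI’s actual pose models and form-correction logic are proprietary and not shown here.

```kotlin
import kotlin.math.acos
import kotlin.math.sqrt

// Hypothetical sketch only: keypoint format, threshold, and cue text
// are assumptions, not ASENSEI's proprietary logic.
data class Keypoint(val x: Float, val y: Float)

// Angle (in degrees) at joint b, formed by the segments b->a and b->c,
// e.g. the knee angle computed from hip, knee, and ankle keypoints.
fun jointAngle(a: Keypoint, b: Keypoint, c: Keypoint): Double {
    val abx = (a.x - b.x).toDouble(); val aby = (a.y - b.y).toDouble()
    val cbx = (c.x - b.x).toDouble(); val cby = (c.y - b.y).toDouble()
    val dot = abx * cbx + aby * cby
    val mag = sqrt(abx * abx + aby * aby) * sqrt(cbx * cbx + cby * cby)
    val cos = (dot / mag).coerceIn(-1.0, 1.0)
    return Math.toDegrees(acos(cos))
}

// Pick a live coaching cue for a squat rep based on knee flexion:
// a large knee angle means the user has not squatted deep enough.
fun squatCue(hip: Keypoint, knee: Keypoint, ankle: Keypoint): String? =
    if (jointAngle(hip, knee, ankle) > 120.0) {
        "Try to go a little deeper in your squat."
    } else {
        null // Form looks good; no cue for this rep.
    }
```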

To promote cross-device experiences like Litesport’s, Snapdragon Spaces has begun working on a new initiative called Snapdragon Seamless™, combining the respective technical advantages of smartphones, AR glasses and wearables.

AR experience highlights:

  • Hand Tracking for natural interaction. The experience uses the Hand Tracking feature to let users navigate the UI and menus.
  • Plane Detection and Positional Tracking for an experience grounded in the real environment. Plane Detection realistically anchors the virtual trainer to the floor, while Positional Tracking keeps the trainer and the UI fixed in place relative to the world around you (a rough sketch follows this list).
  • Dual Render Fusion to use multiple displays and run two different UI flows on the phone screen and the glasses simultaneously.
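
As a rough illustration of the Plane Detection and Positional Tracking bullet above, here is a hypothetical Kotlin sketch written against Google’s ARCore API rather than the Snapdragon Spaces SDK that Litesport actually used; the idea is the same: anchor the virtual trainer to a detected floor plane so that positional tracking keeps it fixed in the room as the user moves.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

// Illustrative analogy using ARCore, not the Snapdragon Spaces SDK that
// Litesport used: find a tracked, upward-facing (floor) plane and create an
// anchor on it, so the virtual trainer stays fixed relative to the real room.
fun anchorTrainerToFloor(frame: Frame): Anchor? {
    val floor = frame.getUpdatedTrackables(Plane::class.java).firstOrNull { plane ->
        plane.trackingState == TrackingState.TRACKING &&
            plane.type == Plane.Type.HORIZONTAL_UPWARD_FACING
    } ?: return null
    // The anchor's pose is updated by the tracking system each frame,
    // keeping content attached to it world-locked.
    return floor.createAnchor(floor.centerPose)
}
```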

“In less than six months, we were able to build a functional demo that delighted the audience at Snapdragon Summit. The experience is designed to showcase the next-gen chip of a device that will be released. We worked with a new partner, ASENSEI, and were able to put all the components together on a platform we had never used before, thanks to the support we got from the Snapdragon Spaces team.”

– Jeffrey Morin, CEO and co-founder of Litesport