
MIT Hack diaries: Community Canvas team on winning the track

Read on to learn about the team behind Community Canvas, this year’s winners of the Snapdragon Spaces track at MIT Reality Hack.


By Tom Xia, Yidan Hu, Joyce Zheng, Mashiyat Zaman, and Lily Yu

March 05, 2024

Introduction

Hi! We’re the team behind Community Canvas, this year’s winners of the Snapdragon Spaces track at MIT Reality Hack. We’re a group of graduate students from the Interactive Telecommunications Program at NYU, where we’ve been working on anything and everything from building arcades to creating clocks out of our shadows. It was our first time participating in Reality Hack, and we were blown away by all the tech we had at our disposal and the incredible ideas that the teams around us came up with for it.

Our project, Community Canvas, developed with the Snapdragon Spaces developer kit and Lenovo ThinkReality smart glasses, is an AR application that lets users redesign shared urban spaces with customized 3D assets. It also includes a data platform that gives local governments insight into submissions, facilitating participatory budgeting to address community needs. We wanted to take you through our journey putting this idea together – it all started the afternoon of the opening ceremony, in Walker Memorial at MIT.

Thursday

Tom

As we stepped into the Reality Hack Inspiration Expo, we were struck by the number of sponsors showcasing an array of AR/VR headsets. It was a tech enthusiast’s dream come true. Among them, the Snapdragon Spaces stand caught our eye, where the Lenovo ThinkReality A3 glasses lay in wait.

At first glance, the ThinkReality A3 seemed unassuming, closely resembling a pair of standard sunglasses but with a slightly thicker frame and bezels. The cable connecting the glasses to a smartphone piqued my curiosity about its capabilities. Slipping on the A3, I was surprised by how light the frame was. A closer inspection revealed a second layer of glass nestled behind the front panel, hinting at its holographic projection capabilities.

The Snapdragon Spaces team demonstrated the built-in map navigation through both a traditional 2D phone screen and an immersive 3D visualization via the AR glasses. Manipulating the map on the phone with simple pinches to zoom and rotate, I watched as a three-dimensional landscape of buildings and trees came to life within the headset, confined only by a circular boundary. The seamless transition between 2D and 3D, paired with the intuitive control system, was a revelation. This experience challenged my preconceived notions about spatial navigation in VR, which typically involved either controller raycasting or somewhat awkward hand gesture tracking. The tactile feedback of using a touch screen in tandem with the visual immersion offered by the glasses created a rich, spatially aware experience that felt both innovative and natural.

One aspect that prompted further inquiry was the glasses’ tint. It seemed a bit too dark, slightly obscuring the real world. In conversation with the Snapdragon Spaces team, I discovered the glasses’ lenses could be easily removed, allowing for customization of their opacity to suit different environments and preferences.

Overall, my initial encounter with the Lenovo ThinkReality A3 smart glasses was profoundly inspiring. It underscored the incredible strides being made in wearable XR technology, achieving a level of lightweight flexibility that seemed futuristic just a few years ago.

Friday

Yidan

On the first day of the hackathon, our team embarked on a journey of innovation fueled by collaboration and a shared vision for change. Initially, our focus gravitated towards conventional methods, which led us to ponder their viability within the realm of XR technology. It was through our interactions with the Snapdragon Spaces team members that we truly began to grasp the potential of XR technology to impact lives in meaningful ways. The team encouraged us to leverage XR to transcend the constraints of the physical world, opening up possibilities for enhanced living experiences.

With diverse backgrounds and cultures represented within our team, we started to explore how XR technology could address real-world challenges. Drawing from my film production background, I raised the inefficiencies and costs associated with traditional film and television set construction. Could XR streamline this process, allowing crews to visualize and plan sets digitally before physical construction? Meanwhile, Tom, with roots in architecture and a connection to Shanghai’s disappearing heritage, pondered how XR could be used to preserve and communicate the stories of vanishing landmarks.

As discussions unfolded, two key themes emerged: community and communication. Armed with these guiding principles, we delved into research, exploring how Snapdragon Spaces technology could facilitate meaningful connections and empower communities.

Our team came up with two innovative solutions aimed at addressing community needs and fostering collaboration:


  1. AR Co-Design App: An augmented reality application that empowers users to reimagine and redesign spaces using customizable 3D assets. Whether it’s revitalizing a local park or transforming a vacant lot, this app enables communities to visualize and share their visions for change.
  2. Data Platform for Participatory Budgeting: A comprehensive platform that facilitates participatory budgeting, allowing local communities to allocate resources based on feedback. By providing insights and analytics on submitted proposals, this platform ensures transparency and accountability in decision-making processes.

With these ideas in hand, we immediately set to work, each team member contributing their expertise. Drawing from our diverse backgrounds, we divided tasks and collaborated to bring our vision to life. With a shared sense of purpose and determination, we were fueled by the belief that our solutions could truly make a difference in communities.

Saturday

Mashi

Having the Snapdragon Spaces team at the next table over was our lifesaver – they were with us the whole day, fielding all our troubleshooting questions. Designing for two user interfaces – the phone and the smart glasses, each with a different orientation – was quite unintuitive. In our drafts, we considered how to ensure that the elements on one screen didn’t interfere with the other. For example, what if we thought of the phone interface as a remote control using familiar interactions like pressing and swiping, so that users could focus on experiencing the scene in front of them through the glasses?

While Lily and Tom investigated questions like these, modeling our ideas in Figma, Joyce and I wrestled with a more technical hurdle – how do we even implement these designs in Unity? In all our combined experience developing in Unity, we had never had to think about the user potentially looking at (or through) two screens at once! Luckily, the Snapdragon Spaces team was there to save the day. They recommended the new Dual Render Fusion feature, which lets us drive simultaneous renderings on both the mobile screen and the smart glasses from the Unity editor. The team showed us that all we had to do to start using it was import a package from the Developer Portal and add another Game view matching the aspect ratio of our device to the Unity editor. If only all AR development hurdles could be resolved with a few clicks!
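For a rough picture of what this looks like under the hood, here’s a minimal Unity C# sketch of the two-camera idea – one camera rendering the phone UI, another rendering the AR scene for the glasses. Dual Render Fusion sets this up through its own imported package, so treat this as an illustration using Unity’s stock multi-display API, not the package’s actual components:

    using UnityEngine;

    // Illustrative only: Dual Render Fusion wires this up via its package
    // components. This sketch shows the general idea using Unity's
    // built-in multi-display API – one camera per screen.
    public class DualScreenSketch : MonoBehaviour
    {
        [SerializeField] private Camera phoneUICamera; // 2D "remote control" UI
        [SerializeField] private Camera glassesCamera; // 3D AR scene

        void Start()
        {
            // Display 0 is the phone; the glasses appear as a second
            // display that has to be activated before use.
            if (Display.displays.Length > 1)
            {
                Display.displays[1].Activate();
            }

            phoneUICamera.targetDisplay = 0; // phone screen
            glassesCamera.targetDisplay = 1; // smart glasses
        }
    }

In the editor, the extra Game view the team mentioned plays the role of that second display, letting you preview both screens side by side.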

Joyce

When we began programming, we decided to leverage existing examples from the Dual Render Fusion package. Time constraints meant we needed to optimize our development process, and the insights we gained from the Snapdragon Spaces mentors were invaluable. They recommended using ADB (Android Debug Bridge) to wirelessly deploy our app to the phone, letting us test changes without plugging the phone in. Despite sluggish internet speeds due to the high volume of participants at the hackathon, we were able to iterate on our app much more efficiently.
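For anyone curious, the wireless workflow is just a few commands (a sketch assuming the phone and laptop share a network; the IP address and APK name below are placeholders):

    # One-time setup over USB: switch the phone's ADB daemon to TCP/IP mode
    adb tcpip 5555

    # Unplug, then connect over Wi-Fi (use your phone's actual IP address)
    adb connect 192.168.1.42:5555

    # Deploy the build wirelessly; -r reinstalls while keeping app data
    adb install -r CommunityCanvas.apk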

One of the primary objectives of our application is to empower users to redesign shared urban spaces using customized 3D assets. To achieve this, we delved into asset generation, exploring options that would grant users freedom in creating their own 3D assets. We settled on Meshy AI, a platform that offers text-to-3D model AI generation. For our prototype, we provided users with preset options as a foundation. Currently, we’re working on bridging the gap between Unity and Meshy’s text-to-model API to enable real-time model generation. This integration will allow for personalized 3D asset creation – a significant step forward in our mission to give users more control.
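We haven’t finalized this bridge yet, but the general shape is a Unity coroutine that submits the user’s prompt to Meshy’s service and later fetches the generated model. The endpoint URL, headers, and JSON fields below are placeholder assumptions for illustration – Meshy’s actual API contract is documented on their site:

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Placeholder sketch of the planned Unity <-> Meshy bridge. The URL
    // and JSON shape are assumptions for illustration only.
    public class MeshyBridge : MonoBehaviour
    {
        private const string ApiUrl =
            "https://api.meshy.example/v1/text-to-3d"; // placeholder endpoint
        [SerializeField] private string apiKey; // set in the Inspector

        public IEnumerator GenerateAsset(string prompt)
        {
            // Submit the user's text prompt as a generation task.
            string body = JsonUtility.ToJson(new PromptPayload { prompt = prompt });
            using (UnityWebRequest req = UnityWebRequest.Put(ApiUrl, body))
            {
                req.method = UnityWebRequest.kHttpVerbPOST;
                req.SetRequestHeader("Content-Type", "application/json");
                req.SetRequestHeader("Authorization", "Bearer " + apiKey);
                yield return req.SendWebRequest();

                if (req.result != UnityWebRequest.Result.Success)
                {
                    Debug.LogError("Generation request failed: " + req.error);
                    yield break;
                }
                Debug.Log("Task submitted: " + req.downloadHandler.text);
                // Next step (omitted): poll the task until it completes,
                // then download the resulting mesh and load it at runtime.
            }
        }

        [System.Serializable]
        private class PromptPayload { public string prompt; }
    }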

Sunday

Lily

After pencils down on Sunday afternoon, we entered the judging session of the hackathon. The room buzzed with energy during the initial round, as judges interacted with groups presenting their ideas. The judges approached us with stopwatches and listened attentively to our pitch. In the brief intervals, we also took the opportunity to showcase our project to other participating teams and friends who came to visit. To our delight, we received news shortly after the first round that we had advanced to the finals.

Drawing from the experience of the initial round, we fine-tuned the delivery of our pitch, organizing it to begin with a concept introduction, followed by a user journey walk-through, and concluding with a prototype try-on. The final round of judging ran in a round-robin format, with each of the 11 judges rotating to the next group after a 7-minute session. While physically taxing, this marathon of presentations allowed us to collect valuable insights from user testing on the Snapdragon Spaces dev kit employed for our application.

What’s next?

After getting so much feedback from judges, fellow hackers, and the Snapdragon Spaces team themselves, we realized that our work wasn’t over – we want to bring Community Canvas closer to reality as a tool for transparent decision-making in our cities. To that end, we’re refining some of our designs, researching other ways we can invite user creativity in our app, and joining other events, like the NYU Startup Bootcamp, to get more feedback on what we can improve.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.