Rueque

Augmented Reality Street Music Experience

Role

Experience Designer

Tools

Figma
XR Prototype Tool
Meta Quest 3

Timeline

Sept 2023 - Apr 2024

Overview

Rueque is an augmented reality (AR) concept project that enriches live street music performances by bridging the information gap between musicians and audiences.

Inspired by my own cross-cultural experiences – imagining translated lyrics floating in the air as you listen to a poignant foreign-language song on a Canadian street – the project explores how AR can overcome language and cultural barriers in music. As the UX designer, I led the end-to-end process from initial research to prototype development.

The result is a polished AR experience prototype (tested on a Meta Quest 3 headset) that showcases a more inclusive and engaging way for people to enjoy street concerts.

How might we help audiences resonate with street music?

— especially when language or cultural barriers exist?

Problem

Street musicians often struggle to convey the stories and emotions behind their songs, especially when performing in languages unfamiliar to their audience. Unlike large concerts that use big screens or subtitles to help non-native listeners follow along, street performances rarely have the infrastructure for such enhancements. This leads to a gap in audience understanding and engagement – passersby might enjoy the tune but miss the song’s meaning or the musician’s background.

I defined this core problem as “the information gap between songs and listeners that hinders understanding.” I identified a design opportunity in the street music scene: to use AR technology to provide context (like lyrics translations and artist info) in real time, so that even a casual listener can fully immerse in a song’s story regardless of language.

User journey map of listening to an unfamiliar song

I conducted extensive research to ground my design in user needs and behaviors, combining both secondary and primary methods:

Secondary Research

I studied existing music platforms and live shows to see how they present information.

For example, some streaming apps display dynamic lyrics and even fan interpretations during playback. Big concerts often project lyrics and visuals on stage screens (especially for international tours) to keep audiences engaged across language barriers.

In contrast, street performances lack these aids due to budget and setup constraints, reinforcing the need for a portable solution like AR.

Listener surveys

Through card sorting tests, informal surveys, and chats, I learned that listeners crave more context about music. A catchy melody might prompt people to dig deeper into a song’s story, and many listeners expressed curiosity about the meaning of lyrics and a song’s background even for music outside their usual taste. These findings confirmed that providing additional song information could enhance enjoyment.


Card Sorting Activity

Musician interviews

I interviewed street musicians (recruited via YouTube/Instagram) to understand their experiences and pain points.

The interviews revealed practical challenges: equipment for elaborate sound or visual setups is often too costly and impractical, so street performers rarely include any subtitle screens or special effects. The artists also described balancing authenticity with audience appeal – for instance, deciding between popular covers versus original songs – and the difficulty of engaging a diverse, transient crowd on the street.

These insights underscored that a solution should be lightweight for performers and help them better connect with passersby without diluting their art.

 

By researching both audience and musician perspectives, I ensured the project was truly user-centered. I identified two primary user groups – listeners and street musicians – each with unique needs and opportunities.

Design Process

Using the research insights, I followed a structured design process that emphasized empathy and iteration.

Primary Persona - Audience


Secondary Persona - Musicians


Building empathy through personas and scenarios helped pinpoint key pain points and moments in the user journey where intervention was needed. I mapped out the typical street performance experience from both viewpoints – from the moment an audience member notices a performer, to listening and attempting to understand an unfamiliar song, to the performer’s effort to engage the crowd.

Empathy Map

User Journey Map


This user journey mapping highlighted opportunities for design, such as providing lyrics translation when a listener starts paying attention, or offering easy ways for impressed listeners to support the artist.

Then I sketched out early concepts and storyboards for an AR-enhanced street performance. I began with low-fidelity doodles illustrating a street scene with digital overlays – colorful floating lyrics and icons around a busker – to visualize how the experience might look.


Concept Doodles

From these sketches, I identified the core features my solution needed. Key elements included: real-time lyrics (with translation into the user’s language), information about the musician and song, and gentle visual effects to elevate the performance atmosphere. I also considered practical details like ensuring the interface stays minimal and doesn’t block the view of the performance.

Next, I moved into prototyping.

User Flow

I categorized the usage scenarios into three types: discovering and downloading the app on-site, opening it after receiving a notification, and using it with a specific purpose in mind.


Paper Prototype


Mid-Fidelity Prototype


Test & Iteration

Since XR hardware is not widely accessible, I conducted most testing with paper prototypes and mid-fidelity mockups. Participants simulated the AR experience by holding up and interacting with printed interface components, which let them imagine the spatial UX without a headset. Testing surfaced two clusters of issues:

Usability Barriers:
- Cumbersome sign-in and app download method
- Overwhelming amount of text and visual clutter during performances
- Confusion from unclear proximity indicators or navigation flow

Interaction & Engagement Gaps:
- Initial prototype lacked interactive elements like tipping or reacting
- Users requested more intuitive tip and song vote mechanisms
- Desire for an element of surprise in artist information, but with smoother updates


Solution

The final Rueque solution is an AR-powered companion app that transforms how audiences engage with street concerts. When an audience member uses Rueque (via XR headset or a mobile device in AR mode) at a live street performance, several features come together to enhance the experience without requiring any heavy setup from the musician:


High - Fidelity Prototype

Designing the product’s windows and interfaces in Figma proved intuitive, since the process is nearly identical to designing for flat devices. I completed all high-fidelity interface designs in Figma, making sure the research and testing insights from the previous stages were reflected.

However, given the challenges of spatial interface design, I selected only the essential functions and implemented them in simplified form in Bezi, an AR prototyping tool, to demonstrate the spatial concept of the product.

Prototype on Figma

Users can explore a map to find nearby gigs and view performing musicians’ profiles. At the venue, it offers direct links to their social media like Instagram and Twitter.

Users can search street performances, revisit past gigs, follow favorite musicians, and view collected digital souvenirs.

When attending a gig, users activate the AR navigation to follow in-space arrows to the venue. This feature stays hidden until summoned. If accessed via notification, artist info remains hidden to preserve surprise.

At the performance, the interface uses minimal floating modules that appear only when triggered. Users can interact by cheering, tipping, reading lyrics, viewing song info, and voting on the next track—features to be demonstrated in the AR prototype.

AR Prototype with Bezi

I explored Unity, Adobe Aero, and Bezi for AR prototyping and ultimately chose Bezi for its balance of ease and functionality. Despite some bugs and feature limits, it allowed me to simulate Rueque’s key scenarios—AR navigation and live performance interaction—and test them on Meta Quest 3. These tests revealed challenges in spatial typography and gesture sensitivity, offering valuable insights for future refinement.

(Special thanks to Saiho Mak for performing in the prototype demo.)


Reflection

Designing Rueque challenged me to bring UX thinking into the emerging world of XR. I learned to balance creative ambition with the limitations of current AR technology by making forward-looking assumptions—imagining a user-friendly, wearable AR experience that doesn’t fully exist yet.

Through testing, I discovered that spatial UX requires rethinking everything from text legibility to gesture control. What works on flat screens often fails in AR, where comfort, clarity, and interaction sensitivity are critical. These insights deepened my awareness of accessibility and usability in immersive contexts.

More than just a concept, Rueque became a rigorous design exercise—shaped by research, prototyping, and user feedback. It taught me the power of storytelling in design and showed how tech can bridge emotional and cultural gaps. Though it’s still evolving, Rueque is a meaningful milestone in my growth as a designer.


Contact me @

Thank you for coming to my site!
Wishing you a colorful day (●'◡'●)