Musicscape for AR


Using mixed reality to spatially visualize your music taste





Above: a recording of the project captured through the display of a HoloLens 2 device with a phone camera

Overview


Scope
2 weeks
Solo project

Tools & skills
Interaction design
MRTK
Figma
C#
Unity
Over the span of this project, I brought an existing project into an AR space. Rapidly prototyping interactions and gestures in 3D space taught me a great deal about the challenges and opportunities of XR.

This is a mixed-reality app that visualizes your Spotify music taste through abstract qualities such as mood, acousticness, and energy. Each song is a single dot plotted on a three-dimensional graph according to these qualities.
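The plotting can be sketched in Unity C#. The sketch below assumes Spotify-style audio features (valence, acousticness, energy, each 0–1, as exposed by Spotify's audio features API); the class, field, and axis assignments are illustrative, not the actual project code.

```csharp
using UnityEngine;

// Hypothetical sketch: map a song's audio features onto a point
// inside the fixed 3x3 m playspace, one feature per axis.
public class SongPlotter : MonoBehaviour
{
    [SerializeField] private GameObject dotPrefab;     // one dot per song
    [SerializeField] private float playspaceSize = 3f; // meters

    public void Plot(float valence, float acousticness, float energy)
    {
        // Each 0..1 feature becomes one spatial axis, centered on the origin.
        Vector3 position = new Vector3(
            (valence - 0.5f) * playspaceSize,    // x: mood
            acousticness * playspaceSize * 0.5f, // y: acousticness (above floor)
            (energy - 0.5f) * playspaceSize);    // z: energy

        Instantiate(dotPrefab, position, Quaternion.identity, transform);
    }
}
```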

Most importantly, I learned how people navigate a virtual space and how to create effective spatial affordances. Please reach out if you’d like to talk about this in depth.






Orienting the user

Feature highlight


Positioning gesture
Because this is a fixed-size 3×3 meter experience, I wanted players to be able to rotate and position the area they would walk around in. I played with multiple ways of orienting the playspace, from tracking different body parts to drag-and-drop placement and virtual guides. One challenge I ran into was the HoloLens’s limited field of view: it would have been hard to see the boundaries of the playspace while setting it up. That’s why I created guides to support spatial orientation during this interaction.

Designing the pinch gesture
I chose a pinch because it was a gesture that could be made without moving the rest of the hand, which was being tracked to determine the position of the playspace. In the first iteration of this feature, you didn’t need to hold the pinch, so I constantly pinched by accident and misplaced the playspace. Requiring a held pinch, paired with a circular progress indicator, made the entire experience feel more polished and easier to use.
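The hold-to-confirm behavior can be sketched as below. It assumes a pinch-strength value (0–1) supplied by hand tracking (MRTK exposes something similar via `HandPoseUtils.CalculateIndexPinch`); the class name, threshold, and duration are illustrative.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: only place the playspace after the pinch has
// been held, so a momentary accidental pinch no longer triggers placement.
public class PinchToPlace : MonoBehaviour
{
    [SerializeField] private Image circularIndicator; // radial-fill UI image
    [SerializeField] private float holdDuration = 1f; // seconds to confirm
    private float heldTime;

    // Called once per frame with the current pinch strength (0..1).
    public void Tick(float pinchStrength, System.Action placePlayspace)
    {
        heldTime = pinchStrength > 0.7f ? heldTime + Time.deltaTime : 0f;

        // The circular indicator fills as the pinch is held.
        circularIndicator.fillAmount = heldTime / holdDuration;

        if (heldTime >= holdDuration)
        {
            placePlayspace();
            heldTime = 0f;
        }
    }
}
```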



Playing music

Feature highlight


Dynamic size
I wanted to show that songs are playable on touch, so I made them grow in size as your finger approaches them.
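A minimal sketch of this proximity affordance, assuming a tracked fingertip transform (names and distances are illustrative):

```csharp
using UnityEngine;

// Hypothetical sketch: scale a song dot up as the fingertip gets closer,
// signaling that it is touchable.
public class ProximityScale : MonoBehaviour
{
    [SerializeField] private Transform fingertip;      // tracked index fingertip
    [SerializeField] private float maxDistance = 0.3f; // meters; no effect beyond this
    [SerializeField] private float maxScale = 1.8f;    // size multiplier at touch
    private Vector3 baseScale;

    private void Awake() => baseScale = transform.localScale;

    private void Update()
    {
        float d = Vector3.Distance(fingertip.position, transform.position);
        // t is 1 when touching, 0 at or beyond maxDistance.
        float t = 1f - Mathf.Clamp01(d / maxDistance);
        transform.localScale = baseScale * Mathf.Lerp(1f, maxScale, t);
    }
}
```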

Popping animation
The song pops like a bubble when you press it. I played with different animations and experimented with different animation curves until the motion felt like a real “pop”.
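One way to drive such a pop is a coroutine that samples an AnimationCurve over the dot’s scale; the sketch below is illustrative, and the curve shape and timing would be tuned in the Inspector rather than hard-coded.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: an AnimationCurve collapses the dot's scale to
// zero over a short duration, then the dot is destroyed.
public class PopOnPress : MonoBehaviour
{
    [SerializeField] private AnimationCurve popCurve =
        AnimationCurve.EaseInOut(0f, 1f, 1f, 0f); // 1 -> 0 over normalized time
    [SerializeField] private float popDuration = 0.15f;

    public void Pop() => StartCoroutine(PopRoutine());

    private IEnumerator PopRoutine()
    {
        Vector3 baseScale = transform.localScale;
        for (float t = 0f; t < popDuration; t += Time.deltaTime)
        {
            // Evaluate the curve at normalized time to get the scale multiplier.
            transform.localScale = baseScale * popCurve.Evaluate(t / popDuration);
            yield return null; // wait one frame
        }
        Destroy(gameObject); // only the visual dot pops
    }
}
```

Trying different curves (overshoot before collapse, steep ease-out, etc.) is just a matter of reshaping `popCurve`, which is what made rapid iteration on the “pop” feel possible.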

Displaying song data and background
When you play a song, tags describing its qualities pop out. This helped users understand the positioning of the songs, a much-needed iteration over the “Legend” feature (see the August 21 version). One challenge was that the displayed information was hard to read against the visually noisy background of AR: the HoloLens display is additive, so it can only add light and cannot darken the world behind it.






Process, prototyping, and rapid iteration

In this project, I learned how to use MRTK and Unity as tools for rapidly getting feedback on the usability of my designs. Because I could test an idea within seconds of having it, I could think more thoroughly about design details, and about the minor but important tweaks that are critical to usability.

Over the span of this project, I pushed myself to explore and refine concepts. Here’s a list of some of the iterations that I shared to get rapid feedback.  


August 16
Initial iteration
August 21
Legend
August 22
Legend iteration
August 23
Popping
August 27
Popping iteration



I have multiple upcoming XR-related projects. I’ll add them here as I continue learning and exploring in this space. In the meantime, feel free to reach out if you’re interested in chatting more about this space!