Spotify Wrapped dropped at the end of last month, and, in what has become an annual ritual, every Instagram story turned into a reminder of how much everyone you know listens to Taylor Swift. But Spotify Wrapped isn’t only a reflection of what music you like — it also reflects what the recommender algorithm thinks you might like.
Did your Spotify Wrapped statistics match what you predicted for yourself? Did you question why your music personality was ‘Vampire’? Are you booking a one-way flight to Berkeley, California (or maybe Burlington, Vermont)?
Recommender systems have an increasing influence on how people consume media, and by extension, understand themselves in relation to others. What if apps like Spotify had built-in systems that let users understand and influence the kind of content they are directed towards?
Dr. Cristina Conati, an artificial intelligence and human-computer interaction specialist, spoke with The Ubyssey about the importance of “understanding the characteristics that the system should use to personalize.”
How Spotify generates data
Spotify focuses on two things: content and users. Spotify’s algorithms learn how to categorize its content by analyzing the specific qualities of songs (content-based filtering) and in relation to others who have listened to similar songs (collaborative filtering).
Its content-based filtering analyzes songs based on a matrix of factors including tempo and rhythm, but also lyrical analysis and a metric called “valence” that attempts to classify the “positiveness” of a track.
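In spirit, content-based filtering can be sketched as comparing tracks by their feature vectors. The sketch below is a toy illustration, not Spotify's actual algorithm: the feature names echo Spotify's audio features (tempo, energy, valence), but the values and the cosine-similarity math are assumptions for demonstration.

```python
import math

# Toy audio-feature vectors (normalized tempo, energy, valence).
# Feature names mirror Spotify's audio features; the values are invented.
tracks = {
    "Fall In Love With You": [0.35, 0.40, 0.30],
    "Baby Shark":            [0.60, 0.90, 0.95],
    "Mellow Indie Song":     [0.38, 0.35, 0.28],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction in feature space."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def recommend(liked, catalog, k=1):
    """Return the k catalog tracks most similar to the liked track."""
    scores = {name: cosine(catalog[liked], vec)
              for name, vec in catalog.items() if name != liked}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("Fall In Love With You", tracks))  # -> ['Mellow Indie Song']
```

Here the slow, low-valence track is matched to the other slow, low-valence track rather than to "Baby Shark," which is the basic intuition behind recommending by audio features.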
Spotify then looks at how the user — you — interacts with the music. It considers active behaviour like liking songs and adding them to playlists, and even passive behaviour such as looping a particular song (for example, I listened to "Fall In Love With You" by Montell Fish 352 times this year).
If other users listened to the same song repeatedly, then the algorithm might recommend you other songs that worked for them.
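Collaborative filtering, the "users like you also listened to" idea, could be sketched like this. The play counts and the overlap-based similarity measure are illustrative assumptions; real systems use far more sophisticated models over millions of users.

```python
# Toy play counts: user -> {song: play count}. Purely illustrative data.
plays = {
    "you":   {"Fall In Love With You": 352, "Song A": 10},
    "user2": {"Fall In Love With You": 200, "Song B": 80},
    "user3": {"Song C": 50},
}

def similarity(u, v):
    """Crude similarity: how many songs both users have played."""
    return len(plays[u].keys() & plays[v].keys())

def recommend_for(user):
    """Suggest songs the most similar other listener played, minus your own."""
    others = [u for u in plays if u != user]
    nearest = max(others, key=lambda u: similarity(user, u))
    return [song for song in plays[nearest] if song not in plays[user]]

print(recommend_for("you"))  # -> ['Song B']
```

Because "user2" shares a song with "you" and "user3" shares none, the sketch surfaces "Song B" — the song that worked for the most similar listener.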
Weighing your choices across the available content and against other users, Spotify then generates ‘Made for You’ playlists and personalized mixes, which shape your overall listening activity and ultimately, your Spotify Wrapped.
Transparency in recommender systems
This combination of content analysis and user interaction to suggest new discoveries is not unique to Spotify, but rather a common thread in all recommender systems. Recommender systems are the AI that allows apps like Netflix, TikTok, Amazon and more to suggest content based on your preferences. They guide Tinder to reorder your potential matches, influence which reel you see next on Instagram and decide what appears on your YouTube homepage.
Often, users don’t understand how these systems work, or where the data collected from their activity is going.
Some users and artists are calling for more transparency from apps like Spotify about how they analyze us. Transparency is the degree to which a system's inner workings and decision-making processes are understandable and interpretable by users. Spotify obtains some of its metadata through indirect signals, such as user clicks. For example, if you are playing ‘Baby Shark’ on repeat for your little cousin, it doesn’t reflect your music taste.
This illustrates one situation where the user would benefit from transparency to discern what the recommender system is taking into account.
Martijn Millecamp, Conati and Katrien Verbert co-authored a paper last year on recommender system transparency. The paper suggests that providing explanations for why the system thinks you would like its recommendations could increase user trust while helping algorithms perform better.
A 2023 study found that more control over recommender systems and transparency might benefit artists too. From interviews with artists and industry professionals, researchers found that allowing users to input more specific recommendation preferences could allow people to choose to be exposed to more music by new artists or by underrepresented groups in the music industry.
But, just as people have different music preferences, they likely have different preferences about how much they want to know about how their algorithms work behind the scenes.
“We should also personalize the explanations because not every user is the same,” said Conati.
Using psychological tests, their project evaluated users based on three categories: their desire to be mentally engaged (need for cognition), musical sophistication (musical skill and expertise) and openness to new music. The user’s evaluation could then be used to tailor explanations to their preferences — from long detailed explanations to brief blurbs to graphic charts.
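The idea of tailoring explanation style to user traits could be sketched as a simple decision rule. The three trait names come from the study described above, but the thresholds, score scale and output formats here are hypothetical, invented for illustration.

```python
def explanation_style(need_for_cognition, musical_sophistication, openness):
    """Pick an explanation format from three trait scores in [0, 1].

    The trait names follow the study; the thresholds and formats
    are illustrative assumptions, not the study's actual model.
    """
    if need_for_cognition > 0.7:
        return "long, technical explanation"
    if musical_sophistication > 0.7:
        return "chart of audio features"
    if openness > 0.7:
        return "brief blurb highlighting what's new"
    return "no explanation, just the recommendation"

# A user who wants deep engagement gets the detailed version.
print(explanation_style(0.9, 0.2, 0.5))  # -> long, technical explanation
```

The point is not the specific rules but that the same recommendation could ship with very different explanations depending on who is reading it.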
“I might be one with a very high need for cognition. I need very deep content. So the [explanations] could be long and technical. Others might [not] even want an explanation as long as the recommendation makes sense,” said Conati.
As recommender systems expand their influence across platforms, it is important to strike a balance between the benefits of personalization and users’ right to understand the mechanisms shaping their digital experiences.
Spotify's metadata generation, which relies on both explicit and implicit user interactions, highlights the need for transparency in algorithmic decision-making. Conati's research emphasizes the importance of personalized explanations to enhance transparency, acknowledging the diversity among users.
She also recognizes the limitations.
“Not everybody wants to have the same answers, because we have different needs. On the other hand, if the personalization is not done well, it can be very obscure."
Would Spotify Wrapped look different if people were aware throughout the year of how their listening habits were being categorized and manipulated? While the statistical outputs would be the same, the information presented could be tailored to suit the user’s preferences. I, for one, am curious to know more about how the recommender system decided the place that had the same music taste as me.