BaRT, Spotify’s personalised recommendation algorithm

You may never have heard of BaRT, or you may not know who or what that is (we're not talking about Bart Simpson here). But if you use Spotify to listen to music or podcasts, then BaRT most probably knows a lot about you.

Bandits for Recommendations as Treatments (or BaRT, for short) is the algorithmic system used by the music and podcast streaming company Spotify to offer personalised recommendations to its users.

If you’ve ever used Spotify (at the time of writing, there are more than 400 million Spotify users worldwide), then every time you look for a particular song or a band or a podcast, every time you play a song you like, every time you add particular songs or podcast episodes to a playlist, every time you click on “next” before a song finishes… BaRT has been there in the background trying to learn about what you like and what you don’t like.

And not only that: BaRT also tries to learn more about you. For example, whether what you like and dislike depends on the time of day, the day of the week or the time of year. And, crucially, BaRT also tries to learn what else you may like: what kind of content, still unknown to you, could be to your liking.

In effect, BaRT is a personalised curation and recommendation algorithm. Its job is to learn as much as it can about you so that it can suggest songs, bands, podcasts and playlists that you will like and that will keep you using Spotify, including new bands, new songs and new podcasts that you don't know yet.
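Spotify hasn't published BaRT's exact internals, but its name points to a well-known family of techniques: multi-armed bandits, which balance exploitation (recommending what the model already believes you like) against exploration (occasionally trying something new to learn more about your taste). Below is a minimal, illustrative epsilon-greedy bandit written in Python; it is not Spotify's actual code, and the track names, the reward signal (1.0 for a full listen, 0.0 for a skip) and the epsilon value are all assumptions made for the example.

```python
import random
from collections import defaultdict

class EpsilonGreedyRecommender:
    """Minimal epsilon-greedy multi-armed bandit sketch.

    Each "arm" is an item that could be recommended (a track, a podcast
    episode...). The reward is implicit feedback: here we assume 1.0 if
    the user listened through and 0.0 if they skipped.
    """

    def __init__(self, items, epsilon=0.1):
        self.items = list(items)
        self.epsilon = epsilon           # fraction of the time we explore
        self.plays = defaultdict(int)    # times each item was recommended
        self.value = defaultdict(float)  # running mean reward per item

    def recommend(self):
        # Exploration: with probability epsilon, try a random item,
        # including ones the user has never reacted to.
        if random.random() < self.epsilon:
            return random.choice(self.items)
        # Exploitation: otherwise pick the item with the highest
        # estimated reward so far.
        return max(self.items, key=lambda item: self.value[item])

    def record_feedback(self, item, reward):
        # Incremental update of the running mean reward for this item.
        self.plays[item] += 1
        self.value[item] += (reward - self.value[item]) / self.plays[item]

# Hypothetical usage: the reward comes from play/skip behaviour.
recommender = EpsilonGreedyRecommender(["track_a", "track_b", "track_c"])
item = recommender.recommend()
recommender.record_feedback(item, reward=1.0)  # user listened through
```

A contextual bandit of the kind BaRT's name suggests goes one step further and conditions these reward estimates on context features, such as the time of day or the day of the week; this sketch omits that for brevity.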

Like Facebook and other social media platforms, Spotify has also run into problems with such automated "personalised" suggestions, especially in the case of podcasts, with users complaining of inappropriate recommendations, like a podcast about pornography.

Because it's the algorithm that automatically ranks songs and podcast episodes and decides the order in which they get recommended to users, producers of music and podcasts have an incentive to try to beat or hack the algorithm by tailoring their product to whatever gets promoted the most, which in some cases can be explicit or sensationalist content.

Additionally, since Spotify generates a big part of its income through advertising, the company also has an incentive to promote content that will attract the largest possible number of users to its platform and keep them there.

One problematic effect of this business model could be seen in January 2022, when Spotify employees and musicians complained that the company was promoting a podcast, The Joe Rogan Experience, that spread misinformation about Covid-19. Several musicians removed their music from Spotify because the company didn't take action against Joe Rogan, whose podcast is offered exclusively on Spotify and has a very large number of active listeners, which translates into high fees charged by Spotify to advertisers.

While offering personalised recommendations to users may sound innocuous enough, the fact that it's an automated process run by an algorithm can end up being problematic, because of the negative incentives and effects that such automation can generate.

And that's another reason for algorithms to be made auditable by external and independent parties. There are ways for companies to let auditors examine their algorithms while keeping their intellectual property private and protected.

You can find more information about BaRT and many other algorithmic systems in the OASI Register, and you can read more about algorithmic systems and their potential social impacts on the OASI pages. Besides that, if you know of an algorithm that we haven’t yet included and which can have a social impact, you can let us know by filling in this simple online form.