Spotify’s Head of Deep Learning Reveals How AI Is Changing the Music Industry

'It will certainly level the playing field.'


Welcome to Future Forecast, a column that will explore the inner workings of our future lives. From sleep to artificial intelligence, I’ll consult experts on different topics and pick their brains about the innovations that will shape our future experiences.


Today’s topic is music, which, unlike many of our subjects, is undergoing drastic change all the time. From styles and genres to the devices we use to listen and the way content is produced, everything music-related is constantly evolving, and the pace is accelerating as new technologies are researched and introduced.

Most of us are unaware of this behind-the-scenes work, but the industry is moving further and further into the world of artificial intelligence and machine learning. The New York Observer talked with Nicola Montecchio, a music information retrieval scientist and the head of deep learning at Spotify, who detailed how the company is using deep learning to improve the way we find and listen to music.

What’s your role at Spotify and how is the company utilizing deep learning?

My job here is to apply machine learning techniques to the content of songs. We’re trying to figure out if a song is happy or sad, which songs sound similar and things of that sort. The characteristics we associate with songs are subjective, but we’re trying to infer them by focusing on the acoustic elements themselves, without considering the popularity of the artists.
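To make that concrete, here is a minimal, hypothetical sketch of the kind of model he is describing: a small network that looks only at a track’s audio (here, a mel spectrogram) and predicts a subjective label such as “happy or sad.” The architecture, feature sizes and labels are illustrative assumptions, not Spotify’s actual system.

```python
# Hypothetical sketch: classifying a track's mood from its audio alone,
# independent of the artist's popularity. All names and shapes are illustrative.
import torch
import torch.nn as nn

class MoodClassifier(nn.Module):
    def __init__(self, n_mels=128, n_moods=2):  # e.g. 0 = sad, 1 = happy
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_mels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time: one vector per track
        )
        self.head = nn.Linear(64, n_moods)

    def forward(self, mel_spectrogram):        # (batch, n_mels, time)
        features = self.conv(mel_spectrogram).squeeze(-1)
        return self.head(features)             # mood logits per track

# Dummy batch: 4 clips, 128 mel bands, 1,000 time frames
logits = MoodClassifier()(torch.randn(4, 128, 1000))
print(logits.shape)  # torch.Size([4, 2])
```

The key point of the sketch is that the model never sees the artist, play counts or any metadata, only the sound.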

How does that improve the user experience?

It’s figuring out users’ interests by seeing what else they and others are interested in, and it’s been working well.

If a new song comes on the platform from an artist that’s not popular, it would be hard to associate that with you using more traditional methods because no one else is listening. But when we rely on the acoustics of the song, we can make better suggestions and direct users to lesser-known artists.

So, is introducing new music to users part of Spotify’s mission?

I would say so, yes. We try to make you discover new music. If there are some unknown artists that sound good and are similar to what you like, we think you should listen to them.

How were you able to curate music in the past? How successful was it?

We already did some of this without deep learning, but deep learning allows us to go more in depth with our understanding of a song. It’s a bit more flexible in a way. We had an intern last summer who did a really nice job, and his idea was to map the acoustic elements of a song to listening patterns. He was trying to predict what you will listen to using deep learning. It’s also interesting because you can’t predict popularity from the audio alone, so you’re also predicting something about the human element of it.
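A rough, hypothetical sketch of the idea being described here: train a network to map a song’s audio features onto the latent “listening pattern” vectors that a collaborative-filtering model learns from user behavior. The dimensions, architecture and training step below are assumptions for illustration, not the intern’s actual project.

```python
# Hypothetical sketch of "mapping the acoustic elements of a song to
# listening patterns": regress from audio features to latent vectors
# learned from users' listening behavior (collaborative filtering).
import torch
import torch.nn as nn

audio_dim, latent_dim = 512, 40              # illustrative sizes

audio_to_latent = nn.Sequential(             # simple regression network
    nn.Linear(audio_dim, 256),
    nn.ReLU(),
    nn.Linear(256, latent_dim),
)

optimizer = torch.optim.Adam(audio_to_latent.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Dummy training batch: audio features and the collaborative-filtering
# vectors they should predict (in reality derived from listening data).
audio_features = torch.randn(32, audio_dim)
cf_vectors = torch.randn(32, latent_dim)

optimizer.zero_grad()
loss = loss_fn(audio_to_latent(audio_features), cf_vectors)
loss.backward()
optimizer.step()

# Once trained, a brand-new song with no listeners can be placed in the
# same latent space as popular songs, using its audio alone.
```

That last comment is the cold-start benefit mentioned earlier: a song nobody has played yet can still be recommended because its sound tells the model where it belongs.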

How has deep learning allowed Spotify to grow?

On my side, I think it brings us a lot in terms of accuracy. Then of course we’re feeding this into the recommendation engine. It’s surely enabling us to be more diverse: less tied to popularity and more tied to what the song actually sounds like.

How will this technology change the way we find and listen to music in the future?

It will be more about the sound. You’ll be able to search by the content of the music instead of just the text information associated with it. This means you can find truly similar-sounding music by searching the sound itself instead of doing what we currently do: searching a title, image or artist. That’s what I think will bring something to the table that’s significantly better than what’s out there.
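As a hypothetical illustration of what “searching the sound itself” could look like under the hood: represent every track as an audio embedding vector and rank the catalog by similarity to a query track. The embedding size, catalog and similarity measure below are assumptions for the sketch, not a description of Spotify’s search.

```python
# Hypothetical sketch: content-based search over audio embeddings,
# returning the tracks that sound most like the query track.
import numpy as np

def cosine_similarity(query, catalog):
    query = query / np.linalg.norm(query)
    catalog = catalog / np.linalg.norm(catalog, axis=1, keepdims=True)
    return catalog @ query

# Dummy catalog: 10,000 tracks, each a 128-dimensional audio embedding
catalog_embeddings = np.random.randn(10_000, 128)
query_embedding = np.random.randn(128)

scores = cosine_similarity(query_embedding, catalog_embeddings)
top_matches = np.argsort(scores)[::-1][:10]  # the 10 closest-sounding tracks
print(top_matches)
```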

How will this affect the music industry?

For sure, it’s a way to make artists you might not have heard of otherwise more visible. How much listening patterns could change is a bit harder of a question, but it should make the field more even and less biased. It will certainly level the playing field.
