When it comes to machine learning and artificial intelligence, can the student become the teacher?
I use Google Play Music a lot to listen to music. Google’s algorithms learn my taste in music, based on what I listen to most often. It even knows what I tend to listen to at different times of day and different locations. It’s very smart.
Occasionally, I use the “I’m feeling lucky radio” option, which is essentially a random playlist based on my musical tastes (rather than artist, genre or mood-based). If I like a song, I’ll give it a thumbs-up so Google learns more about my tastes and can curate better playlists for me.
But that got me thinking…
The songs Google selects for me are often songs I’ve never heard before. If I like them, it will serve me similar songs in the future. But, since I’ve never heard most of these songs before, at what point is Google training me to like a particular type of music? Can Google change my taste in music by slowly introducing new bands that are similar — but slightly different — to others I’ve liked in the past?
Over time, through very subtle changes, could Google train me to love instrumental metal when I’d previously been really into ambient bass?
I’m not suggesting the AI would do this intentionally, of course, just that a slow drift could occur. I give a thumbs-up to Band A. Google then plays Band B, which sounds a lot like Band A, but is slightly more metal. I like them, too. And then Band C is still pretty similar, but even more metal. And then Band D is still pretty ambient bass-y, but has still more of a metal influence than the previous bands, and so on down the line.
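Just for fun, the drift I’m describing can be sketched as a toy simulation. Everything here is made up for illustration (this is not how Google’s recommender actually works): a listener’s taste and each song’s style sit on a single 0-to-1 axis from ambient bass to instrumental metal, the recommender serves songs near the current taste with a tiny "metal" nudge, and the taste shifts a little toward each liked song.

```python
import random

# Toy model of recommendation drift. All numbers are invented:
# 0.0 = ambient bass, 1.0 = instrumental metal.

def simulate_drift(start_taste=0.1, bias=0.02, adapt=0.5, steps=200, seed=42):
    """Serve songs near the current taste, nudged by `bias` toward metal;
    the listener's taste moves a fraction `adapt` toward each liked song."""
    rng = random.Random(seed)
    taste = start_taste
    for _ in range(steps):
        # Recommended song: close to current taste, slightly more "metal".
        song = min(max(taste + bias + rng.gauss(0, 0.01), 0.0), 1.0)
        # The listener likes anything close enough to their current taste...
        if abs(song - taste) < 0.05:
            # ...and their taste shifts a little toward what they liked.
            taste += adapt * (song - taste)
    return taste

# With the tiny per-song nudge, taste ends up near full metal;
# with no nudge (bias=0.0), it just wanders near where it started.
print(simulate_drift())
print(simulate_drift(bias=0.0))
```

The point of the sketch: no single recommendation is far from the listener’s taste, yet the accumulated Band A → B → C → D nudges carry the taste almost the whole way across the axis.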
You see where I’m going with this.
What do y’all think? Could the machine that’s supposed to be learning my tastes actually change my tastes in the process?