
Spotify, IBM and Google Using AI to Make Human Musicians Extinct?

Elton John announces "Farewell Yellow Brick Road" tour dates at Gotham Hall on January 24, 2018 in New York City Kevin Mazur/WireImage/Getty

Technology threatens to make human musicians about as essential as draft animals. Artificial intelligence is showing it can render songwriters moot, and now it seems that a robot Elton John loaded with AI could actually be capable of going on tour after the real one is gone, performing new songs the Elton-bot instantly writes based on the day’s news.

Musicians are already reeling from the shift to streaming music, which pays artists a fraction of what they used to make on CD sales. Global music industry revenues have been cut in half since the peak of CD buying in the late 1990s. Yet even though pay is down, artists have been able to feel that the world still needs people to write and record new music. Now that’s in jeopardy too. Spotify, IBM, Google and a smattering of startups, such as Jukedeck, are working on AI-driven song creation. By next decade, a Grammy for song of the year might go to a piece of software.

Over the summer, Spotify hired Francois Pachet, an AI scientist who led a computer science lab for Sony, to run its new Creator Technology Research Lab in Paris. Why would a music streaming service do this? Well, if a Spotify computer can pump out hit songs by virtual artists to play on Spotify, the company can make a ton of money without paying royalties to human artists.

That’s not the way Pachet frames it, though. He’s careful not to sound like some kind of musical Dr. Frankenstein. Instead, Pachet talks about building AI that can be a songwriting partner, “like the Lennon that needs a McCartney,” he said in one interview. “We are building companions and collaborators who are smart enough to give good ideas to humans, but they are not sufficient to create it all on their own.” As an example, Pachet and a group of European artists instructed AI software to write a song and create instrumentation that mimics the Beatles. Humans wrote the lyrics and refined the arrangement. The result, a song posted online called “Daddy’s Car,” sounds as if the Beatles got hired to write a jingle for a chipper Mentos commercial.

IBM is similarly positioning its Watson Beat technology as a helper. The AI produced a song in collaboration with a real artist, Alex da Kid. The Watson project comes across as scarily calculated. To create the song, the machines sucked in the lyrics of more than 26,000 Billboard Hot 100 songs and analyzed the music to find patterns among the keys, chord progressions and genres. The goal was to come up with "emotional fingerprints," as IBM calls them, that made the music popular. To figure out the connection between a hit song and the zeitgeist around it, Watson analyzed New York Times front pages, Supreme Court rulings, Wikipedia articles, blogs, tweets and the plots of popular movies. The resulting song, "Not Easy," mostly shows that AI can't yet guarantee a hit.

Symbols and control panels for sound recording equipment Ikon Images/Getty

Yet just because AI hasn’t written a hit doesn’t mean it won’t, and the main reason is the sheer number of songs an AI can write and produce. The Beatles recorded 237 original songs; Michael Jackson 137. An AI could do that in the time it takes McCartney to press one piano key. Once an AI like Watson learns from ingesting all those songs and lyrics and cultural signals, it’s trivial for it to produce millions of songs. You know the old saw about how an infinite number of monkeys typing on a keyboard would eventually produce another Fifty Shades of Grey? Well, set up an AI to create tens of millions of songs, and one of them is going to be a “Bodak Yellow.”

In this era of streaming music and YouTube videos, artists have turned to live performances to make most of their money. At least that seems safe for now. An AI might create a hit song, but it would be pretty boring to watch a blinking computer play it onstage.

But Elton John might help change that too. As he begins his final tour, he is working with creative agency Spinifex and AI company Rival Theory—with an assist from Google’s virtual reality team—to create his “post-biological” self, as he’s called it. (Rival Theory’s tagline is “AI personas based on real people.”) The plan is to digitize and feed the AI everything Elton—songs, images, videos of performances, interviews—to develop a nearly sentient virtual version of him. A goal is for this virtual artist to continue touring. You’d be able to watch a fresh “live” Elton John concert through virtual-reality goggles.

If you put together all the advances in technology, you can see where this might be going. Spice up the AI Elton John—or Jay Z or Dolly Parton or any artist—with some songwriting AI, and a long-dead artist could plausibly keep writing and recording new songs. Add the advances in robotics, and maybe you won't have to don goofy goggles to watch a post-Elton perform. You could go see a robot version play Madison Square Garden, writing a new song on the spot about President Oprah Winfrey.

Does all this mean musicians should just give up and take jobs in an Amazon warehouse? Hard to say. Technology throughout history has often challenged musicians. You can go back to the 1920s and find a panic over the “free music” being broadcast on that era’s new wonder called radio. Napster in 1999 seemed like it might wipe out the whole industry. Musicians kept adjusting. Maybe once the public gets flooded with AI music, human-made music will become a premium product, like handcrafted jewelry on Etsy.

And maybe AI will just turn out to be a terrible application of technology to music, and we’ll reject it. We’ve seen that before. Remember Auto-Tune?
