Computers Making Music: A Progress Report in the Form of a Mix Tape

Several high-profile experiments are using artificial intelligence algorithms and neural networks to try tackling the ultimate task: creating some beautiful music.
Let’s give a listen to some of their results so far…
Just last month scientists at Kingston University London and Queen Mary University of London trained an AI on 23,000 pieces of Irish folk music — and created a folk music-generating algorithm that they named “Bot Dylan.” You can hear some of the results in a clip embedded in this article in the Daily Mail. In fact, the AI generated over 100,000 new songs, and “we, and several other musicians we worked with, were really surprised at the quality of the music the system created,” announced a senior lecturer in music technology at Kingston University.
“The fact of the matter is, technology and creativity have been interconnected for a long time and this is just another step in that direction.”
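The article doesn’t describe the system’s internals, but the usual recipe for models like this is to treat folk tunes as text (often ABC notation) and train a character-level network to predict what comes next. Here’s a minimal sketch of that idea, assuming a character-level LSTM in Keras; the corpus, model size and sampling loop below are placeholders, not the researchers’ actual setup.

```python
import numpy as np
import tensorflow as tf

# Placeholder corpus: in practice this would be thousands of ABC-notation transcriptions.
corpus = "X:1\nT:Example Reel\nM:4/4\nK:D\n|:d2fd Adfd|d2fd Adfd:|\n" * 4

chars = sorted(set(corpus))
char_to_id = {c: i for i, c in enumerate(chars)}
encoded = np.array([char_to_id[c] for c in corpus])

seq_len = 32
# Each window of seq_len characters is trained to predict the character that follows it.
xs = np.array([encoded[i:i + seq_len] for i in range(len(encoded) - seq_len)])
ys = encoded[seq_len:]

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 64),
    tf.keras.layers.LSTM(256),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(xs, ys, epochs=1, verbose=0)  # toy run; real training takes far longer

# Generate a new "tune" by repeatedly sampling the next character from the model.
seed = list(encoded[:seq_len])
for _ in range(200):
    probs = model.predict(np.array([seed[-seq_len:]]), verbose=0)[0].astype("float64")
    probs /= probs.sum()  # renormalize so the sampled distribution sums to 1
    seed.append(int(np.random.choice(len(chars), p=probs)))
print("".join(chars[i] for i in seed))
```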
But they’re just one of many impressive groups who are working on AI-generated music. Last summer Google also released a catchy 90-second piano melody that was generated by a trained neural network.
And IBM’s Watson helped the Grammy-winning producer Alex Da Kid record the song “Not Easy” with X Ambassadors, Elle King and Wiz Khalifa. Mashable reports that Watson analyzed lyrics from thousands of songs, as well as their chord progressions and even what key they were in, to create “an emotional fingerprint of music by year.” For good measure, it also crunched five years of texts to determine the culture’s most popular themes — everything from Supreme Court rulings to the most-edited Wikipedia articles — then extracted popular reactions to those themes on Twitter, in blogs, and in news articles using the Watson Tone Analyzer.
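Mashable doesn’t spell out how that “emotional fingerprint” was assembled, but the underlying shape is a simple aggregation: score each song’s lyrics for emotion, then roll the scores up by year. The sketch below illustrates that idea with a hypothetical keyword-based detect_emotions() standing in for a real service like Watson Tone Analyzer; chord progressions and keys, which Watson also analyzed, are left out for brevity.

```python
from collections import Counter, defaultdict

# detect_emotions() is a hypothetical stand-in for a real tone/emotion service such as
# Watson Tone Analyzer; it just counts keyword hits so the sketch stays self-contained.
EMOTION_KEYWORDS = {
    "joy": {"love", "dance", "shine"},
    "sadness": {"alone", "cry", "rain"},
    "anger": {"fight", "burn", "war"},
}

def detect_emotions(text):
    words = text.lower().split()
    return Counter({emotion: sum(word in keywords for word in words)
                    for emotion, keywords in EMOTION_KEYWORDS.items()})

def emotional_fingerprint(songs):
    """Roll per-song emotion scores up by year from (year, lyrics) pairs."""
    by_year = defaultdict(Counter)
    for year, lyrics in songs:
        by_year[year] += detect_emotions(lyrics)
    return dict(by_year)

# Tiny made-up sample; the real pipeline analyzed thousands of songs.
songs = [(2015, "we dance in the rain"), (2016, "burn it down, we fight alone")]
print(emotional_fingerprint(songs))
```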
Meanwhile, the European Research Council is also funding an experiment at Sony’s research center in Paris to explore new ways of creating music with algorithms. Its massive “data sets” are songs, which it analyzes for style and tricks for optimization. In September the group revealed the algorithm’s stunning attempt to create a song in the style of the Beatles — a mishmash of luscious harmonies that was exactly three minutes long.
Of course, the AI had a little help from its friends. French composer Benoît Carré was brought in to create an arrangement for the computer’s tune, as well as to contribute some appropriate lyrics (and some top-notch production values). As a follow-up, the same group also created a song “in the style of American songwriters,” drawing on nearly a century of vintage classics by Irving Berlin, Cole Porter, George Gershwin and Duke Ellington. The result? “The Ballad of Mr Shadow.”
Now the AI’s working on an album. Seriously. There are already five tracks available for purchase on SoundCloud, with trippy titles like “Man Wind” and “Son#2 (from sixties).”
But where is this all going? Well, for one thing, a startup named Jukedeck even wants to make a business out of it. “We’re training deep neural networks to understand how to compose and adapt music so that we can give people the tools to personalize the music they need,” explains its website. Customers get to choose what style — and instruments — they’d like for their custom-generated music, as well as how long the piece should run, and even when it should reach its dramatic climax. “With origins at Cambridge University, we’re a team of composers, producers, engineers, academics and machine learning experts with a shared passion for music and technology.”
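Jukedeck’s actual interface isn’t documented in the article, but the choices it describes (style, instruments, track length and the moment of the climax) map naturally onto a simple request object. Here’s a purely hypothetical sketch of what such a request might look like:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical request object; Jukedeck's real API isn't documented in the article.
# It only illustrates the kinds of choices described: style, instruments, length, climax.
@dataclass
class TrackRequest:
    style: str = "folk"
    instruments: List[str] = field(default_factory=lambda: ["piano"])
    duration_seconds: int = 90
    climax_at_seconds: int = 60  # when the piece should hit its dramatic peak

request = TrackRequest(style="electronic", instruments=["synth", "drums"],
                       duration_seconds=30, climax_at_seconds=22)
print(request)
```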
Eighteen months ago the startup won the TechCrunch Disrupt competition, and since then people have used its site to create more than 1 million tracks. Its music has also been used in “tens of thousands” of YouTube videos, “from cat videos to massive YouTube stars.” At one point, even Coca-Cola used one of its songs.
“Jukedeck charges as little as 99 cents a track for a small business and $21.99 for a large business,” reported All Things Considered, adding that with millions of videos being uploaded to YouTube and Vimeo, Jukedeck’s business model “may be right for this moment… Many would like to use music but can’t afford large sums for rights or composition.”
“We wanted to make it as simple as possible,” added the company’s founder — a composer who also studied computer programming and says he wants to “really democratize the process of creation.”
Is this the culmination of a process that’s been building for years? The article also points out that humankind has been attempting this feat for a long time. In 1957 two professors at the University of Illinois at Urbana-Champaign used a computer to write a four-movement string quartet, which is still considered the first computer-generated score.
The article even dares readers to compare computer-generated music to the real thing. There’s a snippet of classical music composed in the 1700s by Vivaldi — and then a snippet generated four years ago by a Vivaldi-imitating algorithm. The artist even uploaded it to YouTube with a video that was generated by an algorithm.
That algorithm was created several years ago by David Cope, a professor emeritus at the University of California in Santa Cruz. But in fact, algorithms were used to generate music hundreds of years ago. A computer science lab at a university in Milan describes an “automatic composing” technique that’s been attributed to both Mozart and Haydn. Sometime in the 1700s, musicians used a sheet of music with six different musical staves, and they’d roll a six-sided die to determine which of the staves to play first, then roll again for the second pass through the sheet music, and so on…
“While it is argued that Babylonian clay tablets of the 13th century B.C. describe algorithms for harmonization, and it is likely that earlier unknown manuscripts (e.g., 15th century Flemish) may contain composing algorithms, it can be stated that Mozart and Haydn were among the first ‘composers of computer music’, as if to indicate that the desire of applying computer technology to music is probably as old as music itself,” the site asserts.
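That dice game is simple enough to simulate directly. The short sketch below follows the description above (six pre-written staves, one die roll per pass through the sheet), with placeholder stave names since the historical score itself isn’t reproduced here.

```python
import random

# One die roll per pass through the sheet picks which of six staves to play next.
STAVES = ["stave 1", "stave 2", "stave 3", "stave 4", "stave 5", "stave 6"]

def compose(passes=8):
    """Return the order in which the staves get played for one performance."""
    return [random.choice(STAVES) for _ in range(passes)]

print(" -> ".join(compose()))
```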
Back at U.C. Santa Cruz, professor emeritus David Cope still insists that more could be done with computer-generated music if it weren’t for the fact that the whole concept makes audiences (and producers) uncomfortable. “On the credits, they don’t want to see ‘Composed by Computer Program Experiments in Musical Intelligence by David Cope.'” But he insists that eventually, movie soundtracks and commercial jingles will be generated by computers.
Maybe startups like Jukedeck are helping to move that process along. The company recently celebrated its success — and the blossoming phenomenon of AI-assisted music generation — with a video of its own tracing the whole history of technology innovation.
WebReduce
- Pioneering COBOL designer Jean Sammet passes away at age 89.
- Better than 3-D? A reporter watches Wonder Woman in 4DX.
- Filming Formula One racing with a 104-year-old camera.
- CNN commemorates the tech products killed in 2016.
- The Department of Energy explores real-world avatars to reduce the need for traveling.
- Drone demo delivers doughnuts in Denver.
- Researchers can “photograph” people through walls using Wi-Fi signals.
- The O’Reilly podcast looks at Open Source cities.
- What happens when we can program matter?
- New York Times Sunday magazine experiment: an all comic strip format.
- NASA discovers we were wrong about Jupiter.
- A business school professor argues there’s no such thing as big data in HR.
- How an Atari co-founder launched Chuck E. Cheese.
- Google plans a new 1 million-square-foot building in London.