
Meet Today’s Newest Hitmaker: Artificial Intelligence

Oct 2nd, 2016 1:00am by

The past few years have seen artificial intelligence make a number of significant leaps into creative domains once considered purely human. Recently, we’ve seen artificially intelligent machines writing self-referential literature and even besting human champions at board games once thought too complex for computers to master.

Now AI is taking on another fickle human art form: the Top 40. Researchers at Sony’s Computer Science Laboratory (CSL) in Paris recently presented two pop-styled songs written by its AI-driven music-making program. The people behind CSL’s Flow Machines research project call the FlowComposer software an “intelligent assistant,” intended to be part of the next generation of creative authoring tools that will help people interactively compose new songs and texts in any style, using machine learning techniques.

Despite some initial confusion about how much AI was involved in making these compositions, there was still a human in the loop: the researchers collaborated with French composer Benoît Carré, who used the FlowComposer software to first select a style he wanted to work with.

Carré elected to create one song in a style similar to that of the Beatles, titled “Daddy’s Car,” and another more jazz-influenced song that emulated the sounds of Duke Ellington and Cole Porter, named “Ballad of Mr. Shadow.”

The software was then used to analyze a database of different musical styles and composers, populated with over 13,000 lead sheets — a form of musical notation that describes the components that make up a song, such as the melody, lyrics and harmony. A new musical arrangement was generated by the AI, and Carré worked with this new material to add lyrics, polishing it with mixing and final production.
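To make that concrete, a lead sheet can be thought of as a small piece of structured data. The Python sketch below is purely illustrative, with invented field names; it is not FlowComposer’s actual internal format:

```python
# A hypothetical lead-sheet record: the real FlowComposer data format is
# not public, so the field names and layout here are invented.
lead_sheet = {
    "title": "Example Tune",
    "key": "C major",
    "time_signature": "4/4",
    # Melody as (note, duration in beats) pairs for one phrase
    "melody": [("E4", 1), ("G4", 1), ("A4", 2), ("G4", 2), ("E4", 2)],
    # Harmony as one chord symbol per bar
    "harmony": ["C", "Am", "F", "G7"],
    # Lyrics aligned with the phrase
    "lyrics": "Some words that fit the melody line",
}
```

Roughly 13,000 records of this kind are what the system analyzes to learn a style.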

Turning Style into a Computational Object

It’s a pretty impressive result that echoes the historical styles without sounding like a blind imitation, all done under the hood with the aid of algorithms. FlowComposer leverages machine learning techniques like style transfer, optimization and interaction to process data from different music styles, in order to synthesize a completely new composition.

“The Flow Machines project takes a computer science perspective on style: how can a machine understand style and turn it into a computational object —  an object that users can manipulate to create new objects with their own constraints?” noted the FlowMachines research team online. “Conceptually we are starting to build authoring tools in musical composition and text writing that enable people to generate content by manipulating the style of an existing author, possibly themselves.”

To do this, “style” is modeled as a “faithful and flexible” computational object, so users can mold and apply it to new circumstances. The team used a machine learning model known as Markov Constraints, which combine the imitative capabilities of Markov chains with the power and flexibility of combinatorial optimization.

One can mathematically represent an artist’s corpus (body of work) as a Markov matrix (a grid of transition probabilities) and generate new sequences of data that imitate that body of work. The problem is that this technique offers no control over the structure of the output, so the resulting variations on the style can be quite far off.
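As a rough illustration of that idea, and not of FlowComposer’s actual code, the toy Python sketch below learns a first-order transition table from a handful of melodies and then samples a new sequence that statistically imitates them:

```python
import random
from collections import defaultdict

# Toy corpus: each list stands in for a melody by the artist being imitated.
corpus = [
    ["C", "E", "G", "E", "C"],
    ["C", "E", "G", "A", "G", "E"],
    ["E", "G", "A", "G", "E", "C"],
]

# Count observed transitions between consecutive notes (a first-order Markov model).
transitions = defaultdict(list)
for melody in corpus:
    for current, following in zip(melody, melody[1:]):
        transitions[current].append(following)

def generate(start="C", length=8):
    """Sample a new melody, one note at a time, from the learned transitions."""
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:  # no observed continuation from this note
            break
        melody.append(random.choice(options))
    return melody

print(generate())  # e.g. ['C', 'E', 'G', 'A', 'G', 'E', 'C', 'E']
```

Each run produces a different but stylistically similar sequence; nothing, however, constrains its overall shape, which is exactly the limitation described above.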


The FlowMachines team solved this problem with a more sophisticated approach, reframing these Markov processes as “constraint satisfaction problems,” i.e. Markov constraints, allowing for more control over the generated content.
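Continuing the toy example above (again only as a sketch: real Markov-constraint solvers rely on combinatorial optimization rather than brute force, and the start-and-end-on-“C” constraint is invented for illustration), constrained generation means keeping only the sequences that both imitate the corpus and satisfy the user’s structural requirements:

```python
from itertools import product

# Transition table learned from a small corpus, as in the previous sketch.
transitions = {
    "C": ["E", "E"],
    "E": ["G", "C", "G", "G", "C"],
    "G": ["E", "A", "A", "E", "E"],
    "A": ["G", "G"],
}

def markov_consistent(seq):
    """True if every step in the sequence was observed in the corpus."""
    return all(b in transitions.get(a, []) for a, b in zip(seq, seq[1:]))

def constrained_melodies(length, start, end):
    """Brute-force search for melodies that imitate the corpus AND satisfy
    a structural constraint (fixed first and last notes). Real Markov-constraint
    solvers reach the same result far more efficiently."""
    notes = list(transitions)
    return [
        [start, *middle, end]
        for middle in product(notes, repeat=length - 2)
        if markov_consistent([start, *middle, end])
    ]

# Melodies of length 5 constrained to start and end on "C":
print(constrained_melodies(length=5, start="C", end="C"))
# [['C', 'E', 'C', 'E', 'C'], ['C', 'E', 'G', 'E', 'C']]
```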

With this approach, users can apply a certain “style” to any user-defined constraints, and tinker further with the resulting content. For example, you could apply the musical style of the Beatles to lyrics reminiscent of Bob Dylan’s, Madonna’s, or even your own.


Enter the Flow

The goal here is to use machine learning tools to help engender a sense of creative “flow” for users. The FlowMachines concept is based on the idea that there are transcendental, psychological “flow states” that humans enter when they are completely absorbed and focused on something that they are doing. It’s when we are “in the zone” that our imaginations feel free, and everything seems possible — and it’s this state of unbridled human creativity that AI software like FlowComposer hopes to reflexively tap into and expand.

Of course, all this raises the question of what “creativity” is once it is boiled down to algorithmic components. But perhaps these kinds of programs are no different from using graphics editing software.

Creativity is a sign of intelligence, and modeling it in machine terms may open up new insights and new avenues into the human creative process. For now, we’re already getting a faint glimpse of what these creative assistants could help humans do: explore the realm of ideas on a wider scale and augment that creative leap into worlds unknown. It’s an exciting time.

Images: FlowMachines
