In the world of academia, there’s a lot of fascinating research going into developing algorithms, those foundational sets of instructions that computers follow to perform tasks, process data, or solve problems. Algorithmic intelligence underpins many of the services we take for granted today, from search engines to newer, emerging technologies: algorithms that help driverless cars recognize pedestrians, algorithms that recommend music, and even algorithms that help machines learn how to learn.
But the flip side is that many of these awesome algorithms hatching out of the brightest minds in academia never make it out into the wider world. Beyond being presented on the conventional conference circuit or published in an academic journal, researchers’ innovative algorithms often have a hard time finding their way into the hands of developers and small businesses, where they could be parlayed into new, wide-ranging and potentially useful applications.
The New Algorithm Economy
All that is slowly changing, however, with the emergence of the so-called algorithm economy, in which developers can produce, distribute, and commercialize their code through online marketplaces, and other developers, businesses, and organizations can easily discover, select, and stack different algorithms to build new applications.
One such online marketplace is Seattle-based startup Algorithmia, which provides algorithms-as-a-service, connecting academic researchers and their algorithms with the developers and businesses that want to use them. Algorithmia not only hosts and distributes trained deep learning models on a cloud-based platform but also makes them easy to integrate and accessible to all through a simple REST API. Creators of the original algorithms also earn royalties: they are paid a portion of the profits every time their code is called and used.
“Engineers develop new algorithms on a daily basis, but what normally happens is that they write a paper about it, get it published and move on,” Algorithmia co-founder and Chief Technology Officer Kenny Daniel tells USC News. “These algorithms don’t make it out into the world, where they could actually benefit people. There’s a huge supply of algorithms and a huge demand for them; they just aren’t meeting. I saw this as a perfect opportunity to build a market.”
Algorithms as a Microservice
Algorithmia’s approach is to containerize algorithms, packaging them as microservices hosted on its scalable, serverless cloud infrastructure, where they can be called through its API with a few lines of code.
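Those few lines of code can look something like the sketch below. This is not Algorithmia's official client: the `api.algorithmia.com/v1/algo/<author>/<name>` route shape and the `Simple <key>` authorization header are assumptions based on the REST style described here, and `demo/SentimentAnalysis` is a hypothetical algorithm name.

```python
import json
import urllib.request

# Assumed base route for hosted algorithms; not documented fact.
API_BASE = "https://api.algorithmia.com/v1/algo"

def build_algorithm_request(author, name, payload, api_key):
    """Construct (but don't send) a POST request for a hosted algorithm."""
    return urllib.request.Request(
        f"{API_BASE}/{author}/{name}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Simple {api_key}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending it would be one more line:
#   result = json.load(urllib.request.urlopen(req))
req = build_algorithm_request(
    "demo", "SentimentAnalysis", {"text": "I love this"}, "YOUR_API_KEY")
print(req.full_url)  # → https://api.algorithmia.com/v1/algo/demo/SentimentAnalysis
```

From the caller's side, every algorithm in the marketplace looks the same: a URL, a JSON payload, and an API key, which is what makes them easy to discover and stack.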
“By containerizing algorithms, we ensure that code is always ‘on’, and always available, as well as being able to auto-scale to meet the needs of the application, without ever having to configure, manage, or maintain servers and infrastructure,” says Algorithmia co-founder and CEO Diego Oppenheimer in a recent blog post. “Containerized algorithms shorten the time for any development team to go from concept to prototype, to production-ready app. This structure makes software development more agile and efficient. It reduces the infrastructure needed, and abstracts an application’s various functions into microservices to make the entire system more resilient.”
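From the algorithm author's side, a containerized microservice can be as little as a single pure function that the platform wraps, serves, and scales. The sketch below uses an `apply(input)` entry point as an assumed convention and a toy word-count "algorithm"; it is an illustration of the model, not Algorithmia's actual packaging format.

```python
# Assumed convention: the platform invokes a single `apply` entry point
# per API call; servers and scaling are the platform's job, not the author's.

def apply(input):
    """Toy algorithm: return word frequencies for a text payload."""
    counts = {}
    for word in input.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

# Locally the function is just callable; hosted, each API call maps to
# one invocation inside a container.
print(apply("to be or not to be"))  # → {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

Because the function has no server code or infrastructure concerns of its own, the same container can be scheduled, replicated, and billed per call, which is what the microservice framing buys.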
Since the company’s launch in 2013, it has amassed over 2,200 algorithms from 19,000 authors — some of them hailing from the finest research universities in the world — in its online libraries. The company is now hosting and distributing trained deep learning models (over a dozen of them open source) using GPUs in the cloud. The Caffe, Theano, and TensorFlow deep learning frameworks are natively supported, with support for Torch and MXNet in the works.
One recently released microservice tool uses a computer-vision deep learning algorithm, trained on a million images, to automatically colorize black-and-white photos. As seen in the before-and-after photos below, the tool does a good job of distinguishing between landscapes and subjects. An online demo lets users paste the URL of an image to test it out.
Hosting deep learning models using GPUs in the cloud has not been without challenges, says Daniel. “We’ve had to build a lot of the technology and configure all of the components required to get GPUs to work with these deep learning frameworks in the cloud. The GPU was never designed to be shared in a cloud service like this. There are driver issues, system dependencies, and configuration challenges. It’s a new space that’s not well-explored, yet. There’s not a lot of people out there trying to run multiple GPU jobs inside a Docker container.”
It’s this approach that allows algorithms to become composable, interoperable, and extensible elements that can be recombined at scale, adds Daniel: “We’re dealing with the coordination needed between the cloud providers, the hardware, and the dependencies to intelligently schedule work and share GPUs, so that users don’t have to.”