
How Facebook’s Open AI Research Uses GPU Neural Networks

Jan 31st, 2015 11:00am

How to work with big data is a fascinating problem. While much of the current interest in massive data sets focuses on extracting value from historical data, those data sets are also the raw material for the training sets that let us build machine learning systems capable of handling the ever-increasing flow of data from sensors, from users, and from ever-growing social networks.

It’s not surprising that Facebook has an AI research group, FAIR, focused on these problems – one that’s also set up to share the tools and technologies it develops. Papers are published on open-access sites such as arXiv, and the tools and software FAIR develops are open sourced.

At the heart of Facebook’s AI research is an open-source numeric computing tool called Torch. Developed by scientists at Facebook, Google and Twitter, along with contributors from academia and industry, Torch uses the Lua scripting language to drive C code – with support for NVIDIA’s CUDA general-purpose GPU programming environment.
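To give a flavor of how that works, here’s a minimal sketch of a Torch session (assuming a standard Torch7 install): a few lines of Lua that hand matrix arithmetic off to Torch’s C and BLAS backends.

```lua
-- Minimal Torch session: Lua scripting over C-backed tensors.
require 'torch'

local a = torch.rand(3, 4)   -- 3x4 matrix of uniform random values
local b = torch.rand(4, 2)   -- 4x2 matrix
local c = torch.mm(a, b)     -- matrix multiply, dispatched to C/BLAS
print(c)                     -- a 3x2 torch.DoubleTensor
```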

It’s Torch’s CUDA support that’s most interesting, as it lets you use the parallel-processing hardware built into a modern GPU as the fabric for running large-scale machine learning systems. Instead of having to build massive clusters of computers or virtual machines, Torch can work against clustered GPUs to give researchers desktop supercomputing capabilities. You’re not limited to GPUs, either, as you can use Torch to drive custom FPGA systems for more complex problems.
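In practice, moving a Torch computation onto the GPU is a matter of switching tensor types. A minimal sketch (assuming the cutorch package and an NVIDIA CUDA device are available):

```lua
-- The same kind of arithmetic, moved to the GPU by changing tensor types
-- (assumes the cutorch package and an NVIDIA CUDA device are available).
require 'torch'
require 'cutorch'

local a = torch.rand(1024, 1024):cuda()  -- copy the data into GPU memory
local b = torch.rand(1024, 1024):cuda()
local c = torch.mm(a, b)                 -- multiply runs on the GPU via cuBLAS
print(c:type())                          -- torch.CudaTensor
```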

Torch is designed to make it quick and easy to build scientific computing algorithms, making it simpler to test new ideas and new ways of working. You can also draw on widely available software packages developed by the Torch community, among them tools for computer vision, signal processing, and neural networks. It’s that neural network support that makes Torch ideal for machine learning, as you can use it to build networks of neural nets that run in parallel across both CPUs and GPUs.
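Torch’s nn package is where that neural network support lives. As a hedged sketch of how a network is assembled (the layer sizes here are arbitrary):

```lua
-- A small feed-forward classifier built with Torch's nn package:
-- 100 inputs, one hidden layer of 50 units, 10 output classes.
require 'nn'

local net = nn.Sequential()
net:add(nn.Linear(100, 50))   -- fully connected layer
net:add(nn.ReLU())            -- nonlinearity
net:add(nn.Linear(50, 10))
net:add(nn.LogSoftMax())      -- log-probabilities for classification

local output = net:forward(torch.rand(100))
print(output)                 -- 10 log-probabilities
```

With the cunn package loaded, calling net:cuda() moves the same network onto the GPU unchanged, which is what makes the CPU/GPU parallelism largely transparent to the researcher.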

FAIR recently open sourced a selection of Torch modules for handling deep learning, many of which are designed to work with NVIDIA GPUs. Several of the new modules are designed to help build natural language processing tools – an area of research that should help Facebook understand user content, as well as give it access to a new generation of natural user interfaces.
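As a hint of what those language modules build on, here’s a sketch using the standard nn.LookupTable to map word indices to embedding vectors – the kind of layer the FAIR release provides GPU-tuned versions of (the vocabulary size and dimensions here are arbitrary):

```lua
-- Word embeddings via nn.LookupTable, a basic building block of
-- neural natural language models (sizes here are illustrative).
require 'nn'

local vocabSize, embedDim = 10000, 64
local embed = nn.LookupTable(vocabSize, embedDim)

-- a "sentence" encoded as five word indices into the vocabulary
local words = torch.LongTensor{42, 7, 1968, 7, 3}
local vectors = embed:forward(words)  -- one 64-dimensional vector per word
print(vectors:size())                 -- 5 x 64
```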

Perhaps the most significant part of this code release is a new Fast Fourier Transform convolution layer. Convolution is a mathematical operation that combines two functions to produce a third. A fast Fourier transform efficiently converts a signal between the time (or spatial) domain and the frequency domain, and because convolution in one domain becomes simple multiplication in the other, FFTs offer a fast route to computing the convolutions used to analyze complex signals – like pictures or speech.
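The new layer is meant to be a drop-in replacement for a standard convolution module. The sketch below assumes FAIR’s fbcunn package is installed and that the FFT-based layer is exposed as nn.SpatialConvolutionCuFFT – treat the exact module name as an assumption and check the release documentation:

```lua
-- Drop-in use of an FFT-based convolution layer (a sketch: assumes FAIR's
-- fbcunn package is installed and exposes the layer as
-- nn.SpatialConvolutionCuFFT; verify the exact name against the release).
require 'cunn'
require 'fbcunn'

-- 3 input planes (RGB), 16 output planes, 5x5 kernels
local conv = nn.SpatialConvolutionCuFFT(3, 16, 5, 5):cuda()

local images = torch.rand(32, 3, 64, 64):cuda()  -- a batch of 32 images
local features = conv:forward(images)
print(features:size())                           -- 32 x 16 x 60 x 60
```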

Training convolutional neural nets can be slow, so speeding up the training process makes a lot of sense, as it lets researchers try out more algorithms and more code. FAIR’s new FFT-based code is much faster than the alternatives, with its blog post claiming a speedup of more than 23x over the fastest publicly available code. Similarly, tools that automate deployment across multiple GPUs will speed up the processing of complex data sets.
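That kind of automation looks something like the following sketch, which uses the nn.DataParallelTable container from the cunn package to split each batch across two GPUs (FAIR’s release shipped its own model- and data-parallel containers that play a similar role; two CUDA devices are assumed here):

```lua
-- Data parallelism sketch: replicate a network on two GPUs and split
-- each batch between them (assumes cunn and two CUDA devices).
require 'cunn'

local net = nn.Sequential()
net:add(nn.Linear(100, 10))
net:cuda()

local parallel = nn.DataParallelTable(1)  -- dimension 1 is the batch dimension
parallel:add(net, {1, 2})                 -- replicas on GPUs 1 and 2

local batch = torch.rand(64, 100):cuda()  -- 64 examples per batch
local output = parallel:forward(batch)
print(output:size())                      -- 64 x 10
```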

By open sourcing its machine learning tools like this, and by using an open framework for numeric computing, Facebook is making it a lot easier for artificial intelligence researchers to share and improve techniques and technologies. With faster neural nets, improved signal processing, and better natural language processing techniques, there’s a lot for AI researchers to take advantage of, and more for Facebook to build into its service.

With over a billion users, and petabytes of data on its servers, it makes sense for Facebook to invest heavily in AI research. Using AI to understand how social networks behave is key to surfacing relevant content in users’ timelines – especially identifying content in streams of images and video, and applying natural language processing as part of a shift to speech input on mobile devices. Sharing its tools and technologies with the research community should speed up development – and make it easier for Facebook to deploy these techniques in its public-facing service.
