Oracle has invented what it feels is a better way of passing machine learning (ML) data across a network, and it is sharing this protocol with the world.
The newly open-sourced network protocol, called GraphPipe, was designed to simplify and standardize the transmission of machine learning data across a network. It addresses a capability often missing from today’s ML systems: the ability to quickly deploy models and make requests to them.
“There’s been a huge amount of effort put into machine learning frameworks to build machine learning models, but when it comes to deploying that machine learning model into production, there’s not a lot out there,” said Vish Abrams, an Oracle architect for cloud development, and lead of the GraphPipe project.
One of the challenges is that many machine learning data sets are formatted as tensors, a data model that describes the geometric relations across vectors, scalars and other tensors. The format can be particularly thorny to prepare for mobile and other Internet of Things devices, which often have limited computational power.
Specifically, there is no standard for the data returned by a model-serving API. In most cases, the data arrives as JSON or as binary protocol buffers (Protobuf). Because tensor data consists of floating-point numbers, text-based JSON is particularly ill-suited as a transport mechanism. Protobuf is faster, but the implementation details of the model are left to the user.
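The mismatch is easy to see in miniature: encoding floating-point tensor data as JSON text costs several bytes per digit plus a parse step, while a packed binary encoding is fixed-size. (This is a generic illustration using Python's `struct` module, not GraphPipe's actual wire format.)

```python
import json
import struct

tensor = [0.12345678, 0.23456789, 0.34567891, 0.45678912]

as_json = json.dumps(tensor).encode("utf-8")   # text: roughly one byte per digit
as_binary = struct.pack("<4f", *tensor)        # packed float32: 4 bytes per value

print(len(as_json), len(as_binary))  # the JSON encoding is several times larger
```

The gap widens with real tensors, which often contain millions of values, and JSON additionally forces a round-trip through decimal text that can lose float precision.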
Or, if the data isn’t packed in one of those two formats, it may come in some unique format that forces the developer to write code to translate the data into something usable. And chances are, that format is not conducive to efficiency, slowing the performance of the production application.
“The place where efficiency becomes important is where you are building distributed models, either a reinforcement learning pipeline or an ensemble of a bunch of models being used together,” Abrams said.
GraphPipe is based on FlatBuffers, a fast, generic binary format that speeds deserialization by eliminating copies of the data across memory. The FlatBuffer definitions specify a request message that includes input tensors, input names and output names, which can then be digested by the GraphPipe remote model. The protocol works across different languages, as well as with multiple ML frameworks.
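The zero-copy property that makes FlatBuffers fast can be illustrated in miniature with Python's standard library: a consumer can view float data in place inside a received buffer, with no parsing pass and no copy. (This shows the zero-copy idea only, not GraphPipe's actual FlatBuffer layout.)

```python
import struct

# Producer side: pack a small float32 tensor into a binary buffer,
# as if it had arrived over the network.
values = [1.0, 2.0, 3.0, 4.0]
payload = struct.pack("<4f", *values)

# Consumer side: reinterpret the bytes as floats in place. No parse
# step runs and the underlying bytes are never copied -- the essence
# of FlatBuffers-style deserialization.
tensor_view = memoryview(payload).cast("f")
print(list(tensor_view))  # -> [1.0, 2.0, 3.0, 4.0]
```

A JSON or Protobuf consumer, by contrast, must walk the entire payload and materialize a new object tree before the first value can be read.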
The GraphPipe open source package includes:
- A set of FlatBuffer definitions
- Guidelines for serving models consistently according to the FlatBuffer definitions
- Examples for serving models from various machine learning frameworks
- Client libraries for querying models served via GraphPipe, including those for Python, Go and Java
- A plugin for TensorFlow that allows the inclusion of a remote model inside a local TensorFlow graph.
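As a rough sketch of the request/response shape a client library works with — input tensors paired with input names, plus the requested output names — here is the flow in plain Python. (The field names mirror the description above; GraphPipe's real client libraries serialize these messages as FlatBuffers and send them over the wire, and its generated classes may differ.)

```python
# Illustrative shapes only -- not GraphPipe's generated FlatBuffer classes.
def make_request(input_names, input_tensors, output_names):
    return {
        "input_names": input_names,      # which model inputs the tensors bind to
        "input_tensors": input_tensors,  # the tensor payloads themselves
        "output_names": output_names,    # which model outputs to return
    }

def serve(model, request):
    # A server pairs names with tensors, runs the model, and returns
    # only the outputs the client asked for.
    inputs = dict(zip(request["input_names"], request["input_tensors"]))
    outputs = model(inputs)
    return {name: outputs[name] for name in request["output_names"]}

# A toy "model" that doubles its single input tensor element-wise.
def double_model(inputs):
    return {"y": [2 * v for v in inputs["x"]]}

req = make_request(["x"], [[1.0, 2.0, 3.0]], ["y"])
print(serve(double_model, req))  # -> {'y': [2.0, 4.0, 6.0]}
```

Because every framework's server answers the same request shape, a client can swap a TensorFlow backend for, say, an ONNX one without changing its own code.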
Oracle is not alone in addressing needs in the ML community. Earlier this week, GPU maker Nvidia introduced a new architecture, called Turing, designed to work natively with tensor data.
The New Stack is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Turing.