
Graphcore’s Vibrant Computational Images Show Artificial Intelligence in Action

Mar 12th, 2017 6:00am by

It can be difficult for those who are interested in the evolution of artificial intelligence, but who lack a background in related fields, to wrap their minds around its abstract concepts. Terms like convolutional neural networks, Bayesian networks and Markov chains sound esoteric, yet these are some of the machine learning techniques behind many useful applications we are beginning to take for granted, such as image and speech recognition, medical diagnostics and predictive text generation. That opacity lifts a little when one is able to literally see the "big picture" of how these algorithms work from a visual point of view.

Bristol-based startup Graphcore has used its Intelligent Processing Unit (IPU), a new processor designed for artificially intelligent systems, to create these stunning images of what the algorithms in a machine learning model look like when they are in action.

“Unlike a scalar CPU or a vector GPU, the Graphcore Intelligent Processing Unit (IPU) is a graph processor,” explained the company in a blog post. “A computer that is designed to manipulate graphs is the ideal target for the computational graph models that are created by machine learning frameworks.”

The false-color images we see here are actually computational graphs. In mathematics, a graph is a data structure made up of vertices (also called nodes or points) connected by edges (also called arcs or lines), much like a diagrammatic map of a human brain and the interconnections between its neurons and synapses.
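To make the idea concrete, here is a minimal sketch (in Python, and not Graphcore code) of a tiny computational graph: each vertex is an operation, and each edge feeds one vertex's result into another.

```python
# A toy computational graph for (a + b) * c.
# Vertices hold operations; the `inputs` tuples are the edges.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # function that computes this vertex's value
        self.inputs = inputs  # edges: vertices whose outputs feed this one

    def evaluate(self):
        # Evaluate upstream vertices first, then apply this vertex's op.
        return self.op(*(n.evaluate() for n in self.inputs))

# Build the graph for (a + b) * c
a = Node(lambda: 2.0)
b = Node(lambda: 3.0)
c = Node(lambda: 4.0)
add = Node(lambda x, y: x + y, (a, b))
mul = Node(lambda x, y: x * y, (add, c))

print(mul.evaluate())  # 20.0
```

A real training graph has the same shape, just with millions of vertices and edges instead of five.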

In this case, these computational graphs, mapped to an IPU, allow the essence of these models to be glimpsed at a glance, showing a complexity in the connections that is reminiscent of the scans of a human brain, perhaps even recalling a microscopic view of some strange cellular or amoeboid structure.

This image shows the full training graph for Microsoft Research’s ResNet-34 architecture, as deployed on Graphcore’s IPU.

Poplar — the company’s C++-based, scalable graph programming framework — is what generates these beautiful mathematical diagrams. It’s intended for IPU-based systems, but the company is also building a wide-ranging, open source set of graph libraries for machine learning, so that applications written in other machine learning frameworks — such as TensorFlow — can also run on an IPU.

In addition, Poplar includes a graph compiler capable of translating and optimizing the operations used by other machine learning platforms into code that runs on the IPU. The compiler can then render the resulting computational graph as a visual map of computational relationships.
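A hypothetical sketch may help illustrate the lowering step such a compiler performs: walking a framework's operation graph and mapping each operation onto a target kernel. All names below are invented for illustration — this is not the Poplar API.

```python
# Invented example of a graph compiler's "lowering" pass.
# A framework graph as (node name, op type, input edges) triples:
FRAMEWORK_GRAPH = [
    ("conv1", "Conv2D", ["input"]),
    ("relu1", "Relu", ["conv1"]),
    ("pool1", "MaxPool", ["relu1"]),
]

# Illustrative mapping from framework ops to target kernels:
KERNEL_TABLE = {
    "Conv2D": "ipu_conv2d",
    "Relu": "ipu_relu",
    "MaxPool": "ipu_maxpool",
}

def lower(graph):
    """Translate each framework op to a target kernel, preserving edges."""
    return [(name, KERNEL_TABLE[op], inputs) for name, op, inputs in graph]

for name, kernel, inputs in lower(FRAMEWORK_GRAPH):
    print(f"{name}: {kernel}({', '.join(inputs)})")
```

Because the edges (data dependencies) survive translation unchanged, the compiler can draw the same graph before or after lowering — which is what the visualizations here depict.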

For instance, below is a computational graph of a machine learning model that has been used to analyze astrophysics data:

The image below shows both the forward and backward training loop of AlexNet, a deep neural network (DNN) built from connected convolutional layers. Created by researchers at the University of Toronto, AlexNet is best known for besting other models at image classification in the 2012 ImageNet competition. Areas of high computational activity are highlighted in luminous colors. Poplar’s graph compiler has converted the network into a computational graph of 18.7 million vertices and 115.8 million edges, where vertices represent computational processes and edges represent the exchange of information between those processes, varying in intensity within and between layers. Nodes of heightened fully connected activity appear in the areas numbered 6, 7 and 8:
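The forward and backward passes that such a training graph records can be sketched in miniature. This toy example (illustrative only, not AlexNet) computes y = w * x on the way forward, then flows gradients back along the same edges:

```python
# Toy forward/backward pass over a two-input graph: y = w * x.
# A real training graph records the same pattern at vastly larger scale.

def forward(w, x):
    y = w * x          # forward pass: compute the output
    cache = (w, x)     # keep the values the backward pass will need
    return y, cache

def backward(dy, cache):
    w, x = cache
    dw = dy * x        # gradient of y with respect to w
    dx = dy * w        # gradient of y with respect to x
    return dw, dx

y, cache = forward(3.0, 2.0)
dw, dx = backward(1.0, cache)
print(y, dw, dx)  # 6.0 2.0 3.0
```

The "backward" half of the image is this gradient flow: every forward edge gets a mirror-image edge carrying derivatives in the opposite direction.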


The company states that one of the advantages of an IPU is that “it emphasizes massively parallel, low-precision floating point compute and provides much higher compute density than other solutions”. Not only can the graph processor be used for building, training and executing machine learning models of all kinds, but the whole model can be hosted on the IPU, meaning that models can be trained faster and more efficiently than on CPUs or GPUs. Its architecture is also designed to be extensible and to help accelerate deep learning applications; combined with computational graphs, it could allow researchers and developers to push innovation in unexpected directions. In any case, the graphs themselves offer a fascinating insight into how these artificial intelligence models might look as visual concepts, not just as abstract lines of code.

Images from Graphcore.
