What does a complex AI model look like? Here's some Friday eye candy from UK biz Graphcore

Vivid images of machine learning graph processing

Pics Brit chip startup Graphcore has produced sexy images of its graph processing.

We wrote about Graphcore scoring $30m A-round funding in October. The processor designer came out of two years in XMOS-incubated stealth mode at that time, and was founded by CEO Nigel Toon and CTO Simon Knowles. XMOS, founded in Bristol, UK, in 2005, is a fabless semiconductor company developing microcontrollers; Toon was its CEO.

There is a wide spread of Graphcore investors: Bosch, Samsung, Amadeus Capital, C4 Ventures, Draper Esprit, Foundation Capital and Pitango Venture Capital.

A blog post by Toon reveals that the launchpad for the company was a realization that while machine-learning algorithms have been improving, the code is run on general-purpose CPUs aided and abetted by GPUs, such as those from Nvidia, which provide parallel processing capabilities.

Toon and Knowles decided it was feasible to design a dedicated graph processor, an IPU (intelligence processing unit), with associated Poplar software: a graph compiler and an open-source set of graph libraries for machine learning. Both the IPU hardware and the Poplar software now exist.

First place for image classification in the 2015 ImageNet Large Scale Visual Recognition Competition was won by Microsoft Research with its ResNet deep neural network architecture. Graphcore has produced images, using Poplar, illustrating the 50-layer variant, ResNet-50. This is the network after it has been trained and as it would be used to classify images.

The Poplar compiler converts a description of Microsoft's 50-layer network into a computational graph of 3.22 million vertices and 6.21 million edges. This graph represents ResNet-50 as a parallel execution plan for Graphcore's IPU. The vertices (points) represent computation processes and the edges (lines or arcs) represent communication between the processes.
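To make the vertices-and-edges idea concrete, here is a minimal Python sketch of a computational graph in that style, where each vertex is a computation step and each directed edge is a data hand-off between steps. This is purely illustrative: the class and vertex names are invented for this example and are not Poplar's actual API.

```python
# Toy computational graph: vertices are computation processes, directed
# edges are the communications between them. Illustrative only; names
# are invented and do not reflect Poplar's real interfaces.

class ComputationalGraph:
    def __init__(self):
        self.vertices = {}   # vertex name -> description of the computation
        self.edges = []      # (producer, consumer) communication links

    def add_vertex(self, name, op):
        self.vertices[name] = op

    def add_edge(self, src, dst):
        # An edge means dst consumes data produced by src
        self.edges.append((src, dst))

# A tiny fragment resembling one step of a ResNet-style layer:
g = ComputationalGraph()
g.add_vertex("conv1", "3x3 convolution")
g.add_vertex("bn1", "batch normalization")
g.add_vertex("relu1", "ReLU activation")
g.add_edge("conv1", "bn1")
g.add_edge("bn1", "relu1")

print(len(g.vertices), len(g.edges))  # prints: 3 2
```

For the full ResNet-50 plan, Poplar reportedly produces around 3.22 million vertices and 6.21 million edges of exactly this flavor, which is what the visualizations below are drawing.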

Here's such a graph, with its layers labelled to match the corresponding layers in the original technical paper:


Graphcore Poplar-produced ResNet-50 graph with labelled layers

Each layer has a different color. The visible clustering in the image is the result of intensive communication between processes in each layer of the network, with lighter communication between layers.

If we look more closely at a part of such an image, more complexity is revealed:


ResNet-50 graph detail

At this scale the individual edges are clearly visible.

A Graphcore blog post, titled "Inside an AI 'brain' – what does machine learning look like?", discusses this further and includes more image close-ups.

It seems to us that such a chart can be useful in seeing where processing resources are allocated. Graphcore says its IPU approach works well partly because an entire deep network model can be hosted on an IPU and executed efficiently there, escaping the external memory bottleneck said to limit GPU performance.

Because of this, Graphcore claims, its IPU can train machine learning models faster than x86 CPUs and/or GPUs. If this is true, and as machine learning/neural net systems become more popular, then Graphcore could find itself becoming core to machine learning. ®
