
DeepMind hopes its TensorFlow lib Sonnet is music to ears of AI devs

Code dumped on GitHub for you to play with this weekend (and maybe beyond)

Alphabet’s AI outfit DeepMind has released Sonnet, a framework that allows developers to construct neural network components more easily in TensorFlow.

TensorFlow is a popular machine learning library that was initially developed as proprietary software by Google Brain engineers. After Google acquired DeepMind in 2014, the lab's entire research team began adopting TensorFlow, and has now been using it for about a year.

TensorFlow is geared towards deep learning and was open sourced in November 2015. Several TensorFlow libraries have cropped up since, but Sonnet is a little different: its features are designed around DeepMind’s own research requirements.

Open sourcing Sonnet will allow the AI community to get a better grip on the research, as DeepMind can share its models alongside published papers.

Neural networks are constructed from components called modules, each of which performs a small, self-contained task within the whole system. Sonnet makes it easier to code and implement these modules, DeepMind engineers explained in a blog post.

“Modules are ‘called’ with some input Tensors, which adds ops to the Graph and returns output Tensors. One of the design choices was to make sure the variable sharing is handled transparently by automatically reusing variables on subsequent calls to the same module.

“Many models in the literature can naturally be considered as a hierarchy – eg, a Differentiable Neural Computer contains a controller which might be an LSTM (long short-term memory) neural network, which can be implemented as containing a standard Linear layer.

“We’ve found that writing code which explicitly represents submodules allows easy code reuse and quick experimentation – Sonnet promotes writing modules which declare other submodules internally, or are passed other modules at construction time.” ®
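The design DeepMind describes can be sketched in plain Python. This is a toy illustration only, not the real Sonnet API (Sonnet modules are TensorFlow-backed, and the class names here are invented for the sketch): a module builds its variables lazily on first call and transparently reuses them on later calls, and a parent module declares submodules internally.

```python
class Linear:
    """Toy stand-in for a linear-layer module (zero-initialised weights)."""
    def __init__(self, output_size):
        self.output_size = output_size
        self._weights = None  # created lazily on the first call

    def __call__(self, inputs):
        # Transparent variable sharing: build weights once, then reuse
        # the same objects on every subsequent call to this module.
        if self._weights is None:
            self._weights = [[0.0] * self.output_size for _ in inputs]
        return [sum(x * w for x, w in zip(inputs, col))
                for col in zip(*self._weights)]


class MLP:
    """A module that declares its submodules internally, as the post promotes."""
    def __init__(self, hidden_size, output_size):
        self.hidden = Linear(hidden_size)
        self.out = Linear(output_size)

    def __call__(self, inputs):
        # Calling a module just calls its submodules in turn.
        return self.out(self.hidden(inputs))


mlp = MLP(hidden_size=4, output_size=2)
first = mlp([1.0, 2.0, 3.0])
second = mlp([1.0, 2.0, 3.0])  # reuses the weights built on the first call
```

In real Sonnet, calling a module with input Tensors adds ops to the TensorFlow graph and returns output Tensors, but the shape of the code is the same: composition by construction, reuse by repeated calls.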

Biting the hand that feeds IT © 1998–2017