TensorFlow lightens up to land on smartmobes, then embed everywhere

Thanks for coming, TensorFlow Mobile, TensorFlow Lite is what the cool kids will code with now

By Richard Chirgwin

Posted in Artificial Intelligence, 15th November 2017 06:30 GMT

Google's released an Android/iOS version of TensorFlow.

The Chocolate Factory announced the developer preview of TensorFlow Lite in a blog post on Tuesday. The post said the release would initially target smartmobes, with later versions to target embedded devices.

Google first revealed its desire for machine learning everywhere at its I/O conference in May.

Pushing machine learning out to devices makes sense, since running inference locally cuts latency, and Google's not the only company to spot that. Qualcomm, for example, first announced its mobile-specific silicon, Zeroth, in 2013.

Google explained that TensorFlow Lite's architecture assumes the grunt work of model training happens upstream: models are trained on beefier hardware and then converted to the TensorFlow Lite format before being shipped to the device, as the graphic accompanying its post shows.

Google listed the tool's components thus:

Out on the target smartphone, a C++ API (native on iOS; wrapped in a Java API on Android) loads the TensorFlow Lite model and calls the interpreter (there's a sketch of the Java flow after this list).

A fully-loaded interpreter is 300 KB, including all machine learning operators (on its own, the interpreter is just 70 KB). Google notes that the current TensorFlow Mobile is 1.5 MB.

Androids can also offload processing to hardware accelerators if they're available, using the Android Neural Networks API.

Models available for TensorFlow Lite include the MobileNet and Inception v3 vision models, and the Smart Reply conversational model.
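
To make that concrete, here's a minimal sketch of the Java-wrapped flow on Android: load a converted .tflite model, optionally lean on the Android Neural Networks API, and run inference on the handset. The model filename, input shape and label count are placeholders, and the calls follow the TensorFlow Lite Java API as it later settled down rather than the developer preview exactly.

import org.tensorflow.lite.Interpreter;

import java.io.File;

// A rough sketch, not a drop-in app: the model path, input size and label
// count below are placeholders, and the method names track the TensorFlow
// Lite Java API as it later settled, so the developer preview may differ.
public class TfLiteSketch {

    public static void main(String[] args) {
        // A model already converted to the TensorFlow Lite format upstream
        // (hypothetical filename).
        File modelFile = new File("mobilenet_v1_224.tflite");

        // Optionally hand work to the Android Neural Networks API where a
        // hardware accelerator is available, as described above.
        Interpreter.Options options = new Interpreter.Options().setUseNNAPI(true);

        Interpreter interpreter = new Interpreter(modelFile, options);

        // MobileNet-style input: one 224x224 RGB image as floats.
        float[][][][] input = new float[1][224][224][3];
        // ... fill `input` with pre-processed pixel data ...

        // One score per label (1,001 classes for the stock MobileNet).
        float[][] output = new float[1][1001];

        // Inference runs entirely on the device via the bundled interpreter.
        interpreter.run(input, output);

        interpreter.close();
    }
}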

For now, TensorFlow Mobile stays on Google's books: the announcement said Google still views it as the system to support production applications. However: “Going forward, TensorFlow Lite should be seen as the evolution of TensorFlow Mobile, and as it matures it will become the recommended solution for deploying models on mobile and embedded devices”. ®
