Original URL: http://www.theregister.co.uk/2012/03/19/cia_internet_of_things/

ARM's ultra-low-power fridge-puter chips: Just what the CIA ordered

'He's just had a Scotch egg, sir' 'Ha! I knew it!'

By Chris Williams

Posted in Hardware, 19th March 2012 13:02 GMT

Prototypes of a new tiny, ultra low-power ARM-licensed processor will be demonstrated at an engineering conference in California next week. The chips are so small and energy efficient that they're aimed at wirelessly hooking up kitchen appliances, light bulbs and 'leccy meters to your network. And to the CIA.

Will this lead to sassy fridges ordering you to lose weight, based on your diet of crappy food, or intelligent heart-rate monitors advising you to stop reading about infuriatingly pointless sleb shenanigans in the news? Our fingers are crossed.

However, one group that certainly thinks it'll benefit massively from a surge in smart sensor proliferation is the world's spying organisations.

ARM Cortex-M0+, enter stage left

The concept of a smart home, or smart hospital ward, kitted out with tiny sensors is comfortably at least a decade old. By gluing microcontrollers (MCUs) to a bunch of detectors and wrapping them up in radio circuitry, you've suddenly got yourself intelligent little data broadcasters reporting back to a central decision-making storage hub.

There are plenty of tiny and very simple 8- and 16-bit microcontrollers out there to do this – but ARM thinks it can do better than everyone else in the low-power world and is determined to park its electric golf cart on the MCU industry's lawns. The Cambridge-based chip designer wants to take its powerful 32-bit architecture and drive it down to levels of power consumption enjoyed by more primitive 8-bit silicon, thus tempting engineers onto ARM's new Cortex-M0+ chips.

The M0+ follows the Cortex-M0 down the path of embedded simplicity. It uses ARM's compact Thumb instruction set and a rather barebones two-stage pipeline along which program code is fetched and executed; it offers faster IO and flash memory access than before, an optional and fairly primitive memory protection unit that most manufacturers will leave out, and a handful of other speed and power tweaks to the design. There's no floating-point unit, although a ROM provided alongside the core can carry a maths library with routines for the more complicated calculations.
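
What that means in practice for programmers: with no FPU on board, fractional arithmetic tends to be done with plain integers in fixed-point form. The C below is purely illustrative – it is not ARM's ROM maths library, and the names are made up – but it shows the sort of integer-only sum an M0+-class core would chew through happily:

    #include <stdint.h>

    /* Hypothetical illustration of 16.16 fixed-point arithmetic - the kind of
     * integer-only maths used on FPU-less cores like the Cortex-M0+. This is
     * not ARM's ROM maths library, just a sketch of the technique. */
    typedef int32_t fix16_t;                        /* 16 integer, 16 fraction bits */

    #define FIX16(x) ((fix16_t)((x) * 65536.0))     /* for compile-time constants */

    static fix16_t fix16_mul(fix16_t a, fix16_t b)
    {
        /* Widen to 64 bits so the intermediate product can't overflow,
         * then shift back down into 16.16 format. */
        return (fix16_t)(((int64_t)a * b) >> 16);
    }

    int main(void)
    {
        fix16_t reading = FIX16(21.5);              /* raw temperature, deg C */
        fix16_t gain    = FIX16(1.02);              /* calibration factor */
        fix16_t scaled  = fix16_mul(reading, gain); /* about 21.93 in 16.16 */
        (void)scaled;
        return 0;
    }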

There's none of the huge cache, massive pipelines, multiprocessor interconnections, convoluted code execution reordering and other architectural bulk that weighs down Intel's powerhouses; the M0+ design starts off with just 12,000 gates. According to ARM CPU product manager Thomas Ensergueix, it's a completely new design started from scratch to push his company's platform further into the ultra low-power embedded world with minimal baggage.

These cores are expected to be wrapped up in flash and RAM in the order of scores of kilobytes, driven by a clock frequency of at most 50MHz, and draw 9 millionths of an amp per MHz on a 1.2V supply. The floor-plan area – the size the core will take up on a silicon die – is about a millimetre square, and it will cost manufacturers about 20 pence per core in royalties to ARM, we're told.
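
Taking those numbers at face value, the core's active power budget works out at a fraction of a milliwatt. The 48MHz clock below is our assumption for illustration; the 9uA/MHz and 1.2V figures are ARM's:

    /* Back-of-the-envelope sums from the figures above. The 48MHz clock is an
     * assumption for illustration, not a datasheet value. */
    #include <stdio.h>

    int main(void)
    {
        double ua_per_mhz = 9.0;                        /* core current per MHz */
        double clock_mhz  = 48.0;                       /* just under the 50MHz ceiling */
        double supply_v   = 1.2;

        double current_ua = ua_per_mhz * clock_mhz;     /* ~432uA flat out */
        double power_mw   = current_ua * supply_v / 1000.0;   /* ~0.52mW */

        printf("core current %.0fuA, core power %.2fmW\n", current_ua, power_mw);
        return 0;
    }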

ARM's performance chart pits the Cortex-M0+ against rival microcontroller cores, plotting CoreMark benchmark score per nanoamp of current drawn; the figures cover just the code-executing core and flash memory, and use numbers advertised by the rival manufacturers.

Speaking to The Reg, Richard York, director of product marketing at ARM, said the 32-bit processor is aimed at embedded applications that need a bit more number-crunching power and perhaps more memory and interfaces, and better debugging support as embedded software complexity increases.

He argued that the amount of information that needs to be transferred can be reduced by using the extra processing oomph to massage raw sensor data in-core before transmitting it: this should further cut power requirements because broadcasting over the air is a significant current draw compared to what's consumed by the code-executing silicon gates.
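
By way of illustration – a hypothetical sketch with made-up driver names, not any real radio stack – that "massaging" might amount to boiling a buffer of raw samples down to a few summary bytes before keying up the radio:

    #include <stdint.h>

    /* Hypothetical sketch of in-core data reduction: boil a buffer of raw ADC
     * samples down to a few summary bytes before keying the radio. The
     * read_adc_sample() and radio_send() calls stand in for real drivers. */
    extern uint16_t read_adc_sample(void);
    extern void radio_send(const void *payload, uint8_t len);

    struct summary {
        uint16_t min;
        uint16_t max;
        uint16_t mean;
    };

    void sample_and_report(void)
    {
        enum { N = 100 };                  /* 200 bytes of raw readings... */
        uint32_t sum = 0;
        struct summary s = { 0xFFFF, 0, 0 };

        for (int i = 0; i < N; i++) {
            uint16_t v = read_adc_sample();
            if (v < s.min) s.min = v;
            if (v > s.max) s.max = v;
            sum += v;
        }
        s.mean = (uint16_t)(sum / N);

        radio_send(&s, sizeof s);          /* ...squeezed into a six-byte packet */
    }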

Seasoned engineers told El Reg they were sceptical of York's bold claim, arguing that once you throw in the communications software stack and protocols, the transmission overhead wipes out any power saving from sending 10 bytes instead of 200.

However, in the case of a dumb microcontroller spraying a stream of, say, temperature readings at a larger decision-making computer, the benefit of replacing it with a beefier chip that can turn that data into a single packet saying "please turn off the heating" is more obvious. It's a delicate balancing act of power consumption at a scale where even the way a chip is wired up to the circuit board makes a significant difference.
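
Again purely as a sketch, with invented function names rather than any real sensor or radio API, the decide-locally, transmit-rarely approach might look like this – the radio only gets keyed when the heating actually needs switching:

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical sketch of deciding locally and transmitting rarely: the
     * radio is only keyed when the heating actually needs switching. The
     * function names are invented, not from any real stack. */
    extern int16_t read_temperature_decic(void);       /* tenths of a degree C */
    extern void radio_send_command(uint8_t cmd);

    #define CMD_HEATING_OFF  0x01
    #define CMD_HEATING_ON   0x02
    #define SETPOINT_DECIC   215    /* 21.5C target */
    #define HYSTERESIS_DECIC 5      /* 0.5C either side to avoid chatter */

    void heating_control_tick(void)
    {
        static bool heating_on = true;
        int16_t t = read_temperature_decic();

        if (heating_on && t > SETPOINT_DECIC + HYSTERESIS_DECIC) {
            radio_send_command(CMD_HEATING_OFF);   /* one packet, only on a change */
            heating_on = false;
        } else if (!heating_on && t < SETPOINT_DECIC - HYSTERESIS_DECIC) {
            radio_send_command(CMD_HEATING_ON);
            heating_on = true;
        }
    }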

York pointed towards Ember's ZigBee system-on-chips – which pack a Cortex-M3, 128KB of flash, 12KB of RAM and wireless personal networking circuitry – as an example of technology that can "cook data rather than leave it raw", maximising the efficiency of data transmission while maintaining portability. A hospital would prefer to strap tiny, wearable smart sensors to patients than have them tethered to heavy monitoring equipment, he said, as doctors "would rather have patients walking around than always strapped to a bed".

Why would ultra low-power devices matter to gadget makers and spooks?

The low-power credentials of a microcontroller define battery life: how long the device can run off a portable energy supply and how much performance can be squeezed out of it. The operational current draw is useful to know, but what engineers really want is the standby current, because this determines how much charge effectively leaks away while the processor is doing absolutely nothing apart from waiting for an interrupt to stir it.
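
A back-of-the-envelope sum shows why the standby figure dominates. Every number below is an illustrative assumption, not a spec from ARM or Freescale:

    #include <stdio.h>

    /* Back-of-the-envelope sum showing why standby current dominates battery
     * life at low duty cycles. All numbers are illustrative assumptions. */
    int main(void)
    {
        const double active_ma    = 0.5;            /* ~500uA while awake and crunching */
        const double duty         = 0.001;          /* awake 0.1 per cent of the time */
        const double battery_mah  = 220.0;          /* a CR2032 coin cell, roughly */
        const double standby_ua[] = { 1.0, 50.0 };  /* a tight design vs a leaky one */

        for (int i = 0; i < 2; i++) {
            /* At this duty cycle the sleeping chip sets the average draw. */
            double avg_ma = active_ma * duty
                          + (standby_ua[i] / 1000.0) * (1.0 - duty);
            printf("standby %4.1fuA -> average %.4fmA -> roughly %.0f days\n",
                   standby_ua[i], avg_ma, battery_mah / avg_ma / 24.0);
        }
        return 0;
    }

At a duty cycle like that, the difference between a 1uA and a 50uA sleep current is the difference between years and months of life from a coin cell – which is why the figures the chip makers are sitting on matter so much.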

Freescale, which has licensed the new M0+ core for its Kinetis L range of microcontrollers, is keeping such figures close to its chest; it will reveal its finalised specifications in Q2 2012.

York estimated that, at full operation, a third of the current draw will come from the processor core (the 9uA/MHz figure), a third from the flash memory and a third from the peripheral sensor circuitry.

"If it's not well under 50uA [per Mhz] I'll be very surprised," said York, commenting on the overall current draw of an M0+ core with memory and interface circuitry connected. This is a power consumption of the chip while running a maths benchmark, representing a real-world number-crunching scenario; York accused one rival of running a NOP loop – a piece of code that essentially doesn't do anything – as a benchmark to brag about high performance while using little power on a microcontroller.

The success of the M0+ core and its uptake as a ubiquitous data-tossing processor will depend heavily on how low the chip manufacturers can squeeze the standby current.

CIA, enter stage right

The director of the US's Central Intelligence Agency, David Petraeus, speaking at a summit this month on intelligence gathering and engineering, touched on the concept of "an internet of things" – the notion of networking appliances and other objects together using such tiny, low-power, radio-linked embedded chips to make controlling gear easier (and monitoring people a piece of cake).

"In the digital world, data is everywhere, as you all know well. Data is created constantly, often unknowingly and without permission. Every byte left behind reveals information about location, habits, and, by extrapolation, intent and probable behaviour," said Petraeus.

"The number of data points that can be collected is virtually limitless — presenting, of course, both enormous intelligence opportunities and equally large counterintelligence challenges."

The spy boss was chiefly concerned with the huge amounts of data that can be collected from American citizens who intend to become CIA agents – in an age when parents set up Twitter and Tumblr accounts for their newborns, managing the identities of future operatives suddenly becomes non-trivial.

However, he went on to wax lyrical over smart sensors: "As you all know, exploiting the intelligence opportunities — which is an easier subject to discuss in an unclassified setting than the counterintelligence challenges — will require a new class of in-place and remote sensors that operate across the electromagnetic spectrum. Moreover, these sensors will be increasingly interconnected."

Referring to item tagging, sensors and wireless networks, and embedded engineering, Petraeus added that the proliferation of tiny, portable and intelligent sensor networks will appeal to his organisation. He said:

Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters — all connected to the next-generation internet using abundant, low cost, and high-power computing — the latter now going to cloud computing, in many areas greater and greater supercomputing, and, ultimately, heading to quantum computing.

In practice, these technologies could lead to rapid integration of data from closed societies and provide near-continuous, persistent monitoring of virtually anywhere we choose.

You can see alpha-grade low-power smart sensor chips in action on the Freescale stand at DESIGN West in San Jose, California next week. NXP is also a Cortex-M0+ licensee. ®