DARPA funds Mr Spock on a Chip
Lyric's probability processor
The US Defense Advanced Research Projects Agency financed the basic research necessary to create a processor that thinks in terms of probabilities instead of the certainties of ones and zeros. And now Lyric Semiconductor, the spin-off from the Massachusetts Institute of Technology where the work was done, is going to spend the next couple of years building a commercial probability processor called the GP5.
Why do we care about a processor that calculates probabilities instead of manipulating data to get a certainty? Because an increasing number of applications in far-ranging fields are about trying to figure out the probability of something happening and acting on those probabilities.
"Digital processors are not really equipped to handle these algorithms," explains David Reynolds, co-founder and vice president of product development at Lyric, which comes out of stealth mode today. "So we have been rebuilding probability computing from the gate level all the way up to the processor."
In the IT racket, pulling signals out of noise and doing memory error correction are areas where such probability processing will be immediately useful. As memories and flash get denser and smaller, or as I/O bandwidth goes up, error rates rise and the need to correct for errors grows beyond the ability of error correction software or firmware to keep up. But even higher up in systems and application software, probability processing will come in handy. Think of your shopping habits online, where retailers make recommendations for products you might buy. They don't just want to cross-sell: they want to reckon the probabilities of what you might buy as you are surfing through the store and pitch those specific products to you.
Spam and email filtering use probability algorithms to try to figure out what to let through to you, as does credit card and other financial transaction fraud detection. And obviously, hedge funds and other financial services firms use all kind of complex probability algorithms, as do weapons systems. Hence DARPA's keen interest from the get-go in Lyric.
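To give a flavour of the kind of probability algorithm such a spam filter runs, here is a minimal sketch of a naive Bayes word-score combination. The function name and the word scores are illustrative assumptions, not anything from Lyric:

```python
# Hypothetical sketch: combine per-word spam scores into one overall
# probability, naive-Bayes style. Each score is an estimate of
# P(spam | this word appears); words are assumed independent.
def spam_probability(word_probs):
    p_spam = 1.0  # running product assuming the message is spam
    p_ham = 1.0   # running product assuming it is legitimate
    for p in word_probs:
        p_spam *= p
        p_ham *= (1.0 - p)
    return p_spam / (p_spam + p_ham)

# Three words that each individually suggest an 80% chance of spam
print(spam_probability([0.8, 0.8, 0.8]))  # ~0.985
```

Every step of that loop is a multiply a digital CPU must grind through in binary arithmetic, which is exactly the work Lyric wants to move into its analog gates.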
The probability processing that Lyric has invented doesn't do the on/off processing of a normal logic circuit, but rather makes transistors function more like tiny dimmer switches, letting electron flow rates represent the probability of something happening. When you want to reckon the probability of multiple possible events happening, you measure the electron flow, and that gives you the probability, which falls somewhere between 0 and 1. A digital processor has to calculate those probabilities step by step in binary arithmetic, and plenty of them do so today in great numbers.
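To see the kind of multiply-and-normalise step involved, here is a tiny Bayes update written out in software. The prior and likelihood numbers are made up for illustration; this is the arithmetic a digital CPU performs explicitly, and which an analog probability gate would, per the article, do natively in its electron flows:

```python
# Minimal Bayes update sketch (hypothetical numbers): turn a prior
# over two hypotheses plus per-hypothesis likelihoods into a posterior.
def bayes_update(prior, likelihood):
    unnorm = [p * l for p, l in zip(prior, likelihood)]  # multiply step
    total = sum(unnorm)
    return [u / total for u in unnorm]                   # normalise step

# 50/50 prior; evidence is 3x more likely under the first hypothesis
print(bayes_update([0.5, 0.5], [0.9, 0.3]))  # [0.75, 0.25]
```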
Here's the difference. Reynolds says that a data center filled with servers that are calculating probabilities for, say, a financial model, will be able to consolidate from thousands of servers down to a single GP5 appliance to calculate probabilities. The reason is that the circuits that Lyric has invented - which have over 50 patents pending - are wickedly efficient at this. A probability multiply operation that takes 500 transistors in digital logic, for instance, can be done with just a few transistors on the Lyric chips. With an expected factor of 1,000 improvement over general purpose CPUs running probability algorithms, the energy savings of using GP5s instead of, say, x64 chips will be immense.
The five in GP5 is meant to designate that this probability processor is the fifth kind of general purpose processor, coming after central processors (CPUs), digital signal processors (DSPs), field programmable gate arrays (FPGAs), and graphics processing units (GPUs). Each of these other processor types has specialized architectures, circuits, and programming languages to match. The GP5 will have circuits dedicated to probability-based inference and optimizations for speeding up that processing, as well as its own PSBL programming language.
The probability logic gates at the heart of Lyric's chips were developed by Ben Vigoda, the company's other co-founder, as part of his PhD thesis at MIT. This idea was further developed with an $18m grant from DARPA at MIT in 2006 and 2007. After the probability logic circuit concept was proven, Ray Stata, of Stata Venture Partners, kicked in a little over $2m to help further development of the probability circuits alongside DARPA. Lyric is based in Cambridge, Massachusetts, right on Kendall Square, and has 30 employees today.
From 2009 through 2013, Lyric is doing basic research to create a programmable probability processor, including a high-level programming language. DARPA has first dibs on the GP5 processors and this programming language, which is called Probability Synthesis to Bayesian Logic, or PSBL for short.
There are a lot of steps to go between now and when the GP5 chip will start sampling in 2013, and those steps include creating special fixed-function circuits that are chips off the GP5 block, which can help Lyric make some money to continue funding the development of the programmable probability chip.
First up is the Lyric Error Correction chip, which is in its second generation today and ready for licensing. This LEC chip is fabbed by Taiwan Semiconductor Manufacturing Corp using a very cheap 180 nanometer process that was perfected a zillion years ago. Reynolds says this chip is perfect for doing error correction on flash memory. Using current 30 nanometer flash memory wafer baking tech, for every 1,000 bits you store, 1 bit comes back wrong when it is read and needs to be corrected.
Correcting for one error in 10,000 bits was hard enough, but 1 in 1,000 is a lot tougher. The digital ECC circuits used today can clean up errors so that only one error in every 1,000 trillion bits actually gets through. But, Reynolds says, with the next iteration of flash memory technology, the error rate will be 1 in 100 and error correction will have "a computational burden that is too high."
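The scale of the correction job is easy to see with some back-of-the-envelope arithmetic. The error rates below are the article's figures; the 4KB sector size is an illustrative assumption:

```python
# Rough arithmetic behind the flash error rates quoted above.
raw_error_rate = 1e-3    # ~30nm flash: 1 bad bit per 1,000 read
corrected_rate = 1e-15   # post-ECC: 1 error per 1,000 trillion bits
sector_bits = 4096 * 8   # a hypothetical 4KB sector

expected_raw = raw_error_rate * sector_bits       # errors ECC must fix
expected_post_ecc = corrected_rate * sector_bits  # errors that slip past

print(expected_raw)       # ~32.8 raw bit errors per 4KB sector
print(expected_post_ecc)  # vanishingly few survive correction
```

At the next-generation 1-in-100 raw rate, the same sector would arrive with hundreds of bad bits, which is the "computational burden" Reynolds is talking about.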
But the LEC chip that Lyric has created can be tiled to create fixed-function ECC for flash drives that can expand from 1 Gb/sec to 6 Gb/sec of bandwidth and yet be 30 to 70 times smaller than equivalent digital ECC circuits, use one-twelfth the power, and have four times the I/O bandwidth per pin between the flash memory and its controller. Adding such an ECC circuit will allow flash memory used in mobile devices and servers to have higher bandwidth and a longer field life - and have higher densities without sacrificing data.
The current Lyric product roadmap calls for the first iteration of the PSBL language, the 1.0 release, to be available for licensing to selected partners by the end of the year. The LEC 2 ECC circuit is ready now, and the third-generation LEC 3 chip will sample using TSMC's 180 nanometer process in the second quarter of 2011. PSBL 2.0 debuts in the fourth quarter of 2011, with a 65 nanometer LEC 4 chip sampling in the first quarter of 2012.
The full-blown GP5 chip will debut in 2013, and Reynolds is not tipping his cards on what process this chip will use, or what features it will have on it. When pressed, Reynolds said that the GP5 is being designed at the moment and has been underway "for a while." How big and complex it gets will be determined by early customers, notably DARPA.
The GP5 will need its own local memory, just like GPUs do today, and Reynolds confirmed that it very likely will appear on co-processor cards on a PCI-Express bus, much as GPU and cryptographic co-processors do today. As for pricing, Reynolds says that like other niche and high-function products, the GP5 will be expensive at first, "but eventually we will all get it on our desktops." ®
Old is new?
Analogue computers are back?
accelerated heuristic appreciation of aggregated input, speeding up emergence of goodies for participants to act upon - intelligently.
I could use some of that in my pie.
...target acquired. The processor says there's a 99.999% chance of it being friendly.
Well, that's a 0.001% chance it isn't. "Open fire!"
...before we see an improbability chip? Then all we need to do is plug it in to an extra hot cup of tea and we can invent the improbability drive, which will revolutionise space travel.
After all, the only thing you really need to know is exactly how improbable an event is....
Neural network anyone?
From this article it sounds a lot like the sort of semi-analogue implementation of neural network blocks, with weighted inputs making the decisions for subsequent steps, and one 'neuron' in effect replacing a whole lot of multiply/accumulates and the simple end threshold step. Anyone know more?