
RISC daddy conjures Moore's Lawless parallel universe

The 64-core laptop

The New Parallel Paradigm

Another mind shift that the IT industry is going to have to undergo is a change in the way we think about programming. Patterson and his team believe that there should be two layers of software in this new parallel paradigm, one he called the efficiency layer and the other the productivity layer. The efficiency layer would comprise about 10 per cent of programmers: the experts at creating frameworks and libraries, the people who can get down close to the metal and wring efficiencies out of code. The remaining 90 per cent of programmers would work in the productivity layer. They would be domain experts in particular fields or industries, taking those frameworks and libraries and turning them into applications.

Now here's the neat bit. To help make parallel programming easier, Par Lab's experts want to take advantage of parallelism itself and create "auto-tuners" that run lots of different optimizations on code as it is compiled and heuristically search for the best version of the compiled code to run on a particular piece of hardware. Patterson said that in early tests, an auto-tuner capable of machine learning was about 4,000 times faster than a human expert at tuning the code - and tuning is the big problem with parallel architectures.
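
To make that concrete, here is a toy sketch in Python of what empirical auto-tuning boils down to. This is not Par Lab's tool, just the basic trick: write (or generate) a few candidate versions of the same kernel, time each one on the machine in front of you, and keep the winner.

import timeit

N = 512
matrix = [[float(i * N + j) for j in range(N)] for i in range(N)]

def sum_by_rows(m):
    # Traverse in row order, matching how the nested lists are laid out.
    return sum(sum(row) for row in m)

def sum_by_cols(m):
    # Same answer via column-order indexing - typically slower here.
    return sum(m[i][j] for j in range(N) for i in range(N))

def autotune(candidates, repeats=5):
    # Time each candidate on this machine and keep the fastest one.
    best, best_time = None, float("inf")
    for fn in candidates:
        t = min(timeit.repeat(lambda: fn(matrix), number=3, repeat=repeats))
        print(f"{fn.__name__}: {t:.3f}s")
        if t < best_time:
            best, best_time = fn, t
    return best

fastest = autotune([sum_by_rows, sum_by_cols])
print("winner on this box:", fastest.__name__)

Production auto-tuners such as ATLAS and FFTW do the same thing at build time, only over vastly larger search spaces of blockings, unrollings, and schedules.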

The industry faces plenty more challenges in coping with parallelism, and one of them might just be an explosion of custom-made processors, FPGAs, and other computing elements woven together into future systems that look nothing like the relatively simple devices we called personal computers and servers a few years ago.

Patterson also argues that processors and the other elements of systems should have standardized ways of gathering power and performance data to feed back into the programming tools. That way, efficiency programmers can figure out why a system isn't using all of its available memory bandwidth, and productivity programmers can run what-if analyses on what happens to thermals or performance when they change their code.
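
No such standardized counters exist across today's hardware, but a crude sketch of the feedback loop is easy enough to imagine: probe what the machine actually delivers and report it against the nominal peak. The peak figure below is a made-up placeholder, and the timed copy is a stand-in for what a real hardware counter would report.

import time

PEAK_GB_S = 25.6  # hypothetical nominal peak bandwidth - a placeholder

def achieved_copy_bandwidth(n_bytes=256 * 1024 * 1024):
    # Stream one large copy through memory and time it, counting
    # the bytes read plus the bytes written.
    src = bytearray(n_bytes)
    start = time.perf_counter()
    dst = bytes(src)
    elapsed = time.perf_counter() - start
    assert len(dst) == n_bytes
    return (2 * n_bytes) / elapsed / 1e9

gb_s = achieved_copy_bandwidth()
print(f"achieved: {gb_s:.1f} GB/s "
      f"({100 * gb_s / PEAK_GB_S:.0f}% of assumed peak)")

If that percentage comes back low after a code change, the tools - and the programmer - know where to start looking.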

"There was a decade or so where we were polishing a pretty round stone," Patterson explained. "Going forward, the field is really wide open, but research really has to deliver on this. The IT industry is really going to have to deliver on doubling the core count every year and on getting value out of that."

Either that or the software business collapses and a whole lot of IT jobs go out the window, as the industry shifts from a growth market, where software keeps driving us to upgrade to faster (well, more capacious) systems, to a replacement market, where we just buy a new machine when the old one breaks. ®
