RISC daddy conjures Moore's Lawless parallel universe

The New Parallel Paradigm

Another mind shift the IT industry is going to have to undergo is a change in the way we think about programming. Patterson and his team believe there should be two layers of software in this new parallel paradigm: one he calls the efficiency layer and the other the productivity layer. The efficiency layer would be made up of about 10 per cent of programmers, the experts at creating frameworks and libraries, the people who can get down close to the metal and wring efficiencies out of code. The remaining 90 per cent of programmers would work in the productivity layer: domain experts in particular fields or industries who take those frameworks and libraries and turn them into applications.
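To make that split concrete, here is a minimal sketch in Python. The function names and the domain task are invented for illustration, not taken from Par Lab's code: the efficiency layer owns all the parallel machinery, while the productivity layer sees nothing but a domain-level call.

```python
# A minimal sketch of the two-layer split (names are illustrative,
# not from the Par Lab codebase). The "efficiency layer" hides the
# parallel machinery; the "productivity layer" only sees a domain API.

from concurrent.futures import ProcessPoolExecutor
import math

# --- Efficiency layer: expert-written, hardware-aware parallel kernel ---
def parallel_map(fn, items, workers=4):
    """Apply fn across items in parallel; this is where the 10% of
    experts would tune worker counts and chunk sizes per machine."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fn, items,
                             chunksize=max(1, len(items) // workers)))

# --- Productivity layer: a domain expert writes only this ---
def risk_score(record):
    """Hypothetical domain logic; knows nothing about cores or threads."""
    return math.sqrt(sum(x * x for x in record))

if __name__ == "__main__":
    portfolio = [[float(i + j) for j in range(100)] for i in range(1000)]
    scores = parallel_map(risk_score, portfolio)
    print(f"scored {len(scores)} records, first = {scores[0]:.2f}")
```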

Now here's the neat bit. To help make parallel programming easier, Par Lab's experts want to take advantage of parallelism itself and create "auto-tuners" that run lots of different optimizations on code as it is compiled and heuristically search for the version that runs best on a particular piece of hardware. Patterson said that in early tests, an auto-tuner capable of machine learning was about 4,000 times faster than a human expert at tuning the code, and tuning for parallel architectures is the big problem with those architectures.
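Here is a toy version of that search in Python, assuming nothing about Par Lab's actual auto-tuners beyond the basic idea: generate candidate variants, time each one on the machine at hand, and keep the winner. A real auto-tuner searches a vastly larger space of variants and, as Patterson describes, can use machine learning to prune it.

```python
# A minimal auto-tuning sketch: empirically pick the fastest chunk
# size for a reduction on whatever hardware this script runs on.

import time

def reduce_chunked(data, chunk):
    """One candidate 'variant': sum the data chunk by chunk."""
    total = 0.0
    for start in range(0, len(data), chunk):
        total += sum(data[start:start + chunk])
    return total

def time_variant(fn):
    """Wall-clock one run of a variant."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def autotune(data, candidates, repeats=5):
    """Time every candidate, keep the best time of several runs,
    and return the winner for this particular machine."""
    best_chunk, best_time = None, float("inf")
    for chunk in candidates:
        elapsed = min(time_variant(lambda: reduce_chunked(data, chunk))
                      for _ in range(repeats))
        if elapsed < best_time:
            best_chunk, best_time = chunk, elapsed
    return best_chunk, best_time

if __name__ == "__main__":
    data = [1.0] * 1_000_000
    chunk, secs = autotune(data, candidates=[256, 1024, 4096, 16384])
    print(f"best chunk size on this hardware: {chunk} ({secs * 1e3:.1f} ms)")
```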

There are a lot more challenges that the industry faces in coping with parallelism, and one of them might just be an explosion of custom-made processors, FPGAs, and other computing elements that get woven together into future systems that do not look like the relatively simple devices we called personal computers or servers a few years ago.

Patterson also argues that processors and the other elements of systems should have standardized methods of gathering information on power and performance to feed back into the programming tools. That way, efficiency programmers can figure out why the system isn't using all of its available memory bandwidth, and productivity programmers can do what-if analysis on what happens to the system's thermals or performance if they change their code.
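A minimal sketch of that feedback loop in Python, with an invented peak-bandwidth figure standing in for what standardized counters would report: measure the bandwidth a memory-bound kernel actually achieves and compare it with the peak, so a programmer can see how close the code is to the memory wall before and after a change.

```python
# A sketch of the feedback idea, assuming nothing about Par Lab's
# actual counter interfaces: compare achieved bandwidth with an
# assumed peak so the programmer can spot memory-bound code.

import array
import time

ASSUMED_PEAK_GB_S = 20.0  # hypothetical; a real tool would read this from hardware

def measured_bandwidth(n_floats=10_000_000):
    """Time a memory-bound copy and report the bandwidth it achieved."""
    src = array.array("d", [1.0]) * n_floats
    start = time.perf_counter()
    dst = src[:]                      # copy: read n doubles, write n doubles
    elapsed = time.perf_counter() - start
    bytes_moved = 2 * 8 * n_floats    # 8-byte doubles, read + write
    return bytes_moved / elapsed / 1e9, len(dst)

if __name__ == "__main__":
    gb_s, _ = measured_bandwidth()
    print(f"achieved {gb_s:.1f} GB/s, "
          f"{100 * gb_s / ASSUMED_PEAK_GB_S:.0f}% of assumed peak")
```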

"There was a decade or so where we were polishing a pretty round stone," Patterson explained. "Going forward, the field is really wide open, but research really has to deliver on this. The IT industry is really going to have to deliver on doubling the core count every year and on getting value out of that."

Either that, or the software business collapses and a whole lot of IT jobs go out the window as the industry shifts from a growth market, where software drives us all to upgrade to faster (well, more capacious) systems, to a replacement market, where we just get a new machine when the old one breaks. ®
