RISC daddy conjures Moore's Lawless parallel universe

The 64-core laptop

The oft-cited Moore's Law is the fulcrum of the IT industry, the engine that has delivered ever-faster and more sophisticated computing technology over the decades. This, in turn, has allowed the IT industry to convince us that every one, two, or three years we need new operating systems, better performance, and new and more complex applications. But ask yourself this: what happens to the IT industry if the performance improvements stop?

That is the question that one of the luminaries of the computer industry, David Patterson, posed last week in his keynote address at the SC08 supercomputing event in Austin, Texas. Patterson is one of the most important thinkers in computer science, and when he says there's a problem, people listen.

If you don't know who Patterson is, you know some of his work. At the University of California at Berkeley, where Patterson has been a member of the computer science faculty since 1977, he led the design and implementation of RISC I, which some have called the first VLSI RISC processor and which laid the foundation for what would eventually become Sun Microsystems' Sparc processor. Patterson also led the Redundant Arrays of Inexpensive Disks (RAID) project, which made arrays of cheap PC-style disks rival mainframe-class disks in terms of reliability and capacity. RAID arrays of various stripes are the norm in storage today.
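The trick behind RAID, in its simplest parity form, is that losing any one cheap disk costs no data, because the missing block can be rebuilt from the surviving disks. Here is a minimal sketch of that XOR parity idea - an illustration of the general technique, not the Berkeley project's actual design:

```python
# Minimal sketch of RAID-style XOR parity: data striped across N cheap disks
# plus one parity disk; any single lost disk can be rebuilt from the others.
from functools import reduce

def xor_blocks(blocks):
    """XOR equal-length byte blocks together, column by column."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

data_disks = [b"disk0data", b"disk1data", b"disk2data"]  # toy three-disk stripe
parity = xor_blocks(data_disks)                          # stored on a fourth disk

# Simulate losing disk 1, then rebuild its contents from the survivors plus parity.
survivors = [data_disks[0], data_disks[2], parity]
rebuilt = xor_blocks(survivors)
assert rebuilt == data_disks[1]
```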

These days, Patterson is still at Berkeley, where he runs the Parallel Computing Laboratory - Par Lab for short - which is funded largely by Intel and Microsoft. As the name suggests, the lab is trying to tackle the parallel computing problem in new ways. Both corporate and consumer computing are now wrestling with this parallelism problem, right there in the data center and on the desktop, a problem that has plagued supercomputing for decades. Specifically, we have been trying to make many relatively slow computers do the work that would be done by a single, large, and imaginary computer. Yes, imaginary: the laws of physics (and particularly thermodynamics) don't allow you to build it.
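Put another way, instead of one heroic core chewing through a job, the job has to be carved into pieces that many modest cores can chew on at once. A minimal sketch of that carving-up, assuming a workload that divides cleanly - which is exactly the assumption most desktop software violates:

```python
# Minimal sketch: split one big job across several modest cores instead of
# relying on a single fast core. Assumes the work divides cleanly.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of squares over one slice of the full range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, cores = 10_000_000, 4
    step = n // cores
    chunks = [(c * step, n if c == cores - 1 else (c + 1) * step) for c in range(cores)]
    with Pool(cores) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # matches sum(i * i for i in range(n)), just computed in pieces
```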

In the old days of computing - which was only a few years ago - everyone expected that the ever-shrinking transistor would simply enable faster and faster processors, thereby allowing single-threaded applications to run faster and faster. "This is an example of faith-based science," Patterson quipped in his opening, reminding everyone that he was among the people who, just a few years ago, assumed that the chip-making processes would be available so chips could crank up their clocks and still stay within a 100-watt thermal envelope. He showed what the chip roadmap looked like from the early 2000s looking ahead to 2005 and beyond, and then how it was revamped:

Chip Roadmaps

As you can see, as recently as 2005 the expectation was for chips running well above 20 GHz by 2013. A few years later, the expectation had shifted to chips possibly hitting 5 GHz by 2007 and reaching up towards 8 GHz or so. Now take a look at the actual Intel multicore line on the chart: we are stuck at around 3 GHz with x64 processors, and all Moore's Law is getting us is more cores on a die with each passing year.

Single-threaded performance has stalled, with some exceptions here and there. We have hit a thermal wall, and it is simply no longer practical to ramp up clock speeds. And so we started cookie-cutting cores onto dies: first two, then four or more. That is how we have been using Moore's Law to boost performance inside a single CPU socket. But, according to Patterson, we are quite possibly engaging in more faith-based science.

"The whole industry is assuming that we will solve this problem that a lot of people have broken their picks on." To make his point, Patterson told the nerds at SC08 to imagine a 32-core laptop and then an upgrade to a 64-core laptop, and then asked if they thought it would do more work on the kinds of workloads that run on a laptop. "We'd all bet against that. But that is the bet this industry has made."

Next page: Why Make the Bet?
