RISC daddy conjures Moore's Lawless parallel universe

The 64-core laptop

The oft-cited Moore's Law is the fulcrum of the IT industry: it has given us ever-faster and more sophisticated computing technology over the decades. That, in turn, has allowed the IT industry to convince us that every one, two, or three years we need new operating systems, better performance, and new and more complex applications. But ask yourself this: What happens to the IT industry if the performance improvements stop?

That is the question that one of the luminaries of the computer industry, David Patterson, posed last week with his keynote address at the SC08 supercomputing event in Austin, Texas. Patterson is one of the most important thinkers in computer science, and when he says there's a problem, people listen.

If you don't know who Patterson is, you know some of his work. At the University of California at Berkeley, where Patterson has been a member of the computer science faculty since 1977, he led the design and implementation of RISC I, which some have called the first VLSI RISC computer and the foundation of what would eventually become Sun Microsystems' Sparc processor. Patterson also led the Redundant Arrays of Inexpensive Disks (RAID) project, which made arrays of cheap PC-style disks rival mainframe-class disks in reliability and capacity. RAID disks of various stripes are the norm in storage today.

These days, Patterson is still at Berkeley, where he runs the Parallel Computing Laboratory - Par Lab for short - funded largely by Intel and Microsoft. As the name suggests, the lab is trying to tackle the parallel computing problem in new ways. Both corporate and consumer computing are wrestling with this parallelism problem today, right there in the data center and on the desktop - a problem that has plagued supercomputing for decades. Specifically, we have been trying to make many relatively slow computers do the work that would be done by a single, large, and imaginary computer. Yes, imaginary. The laws of physics (and particularly thermodynamics) don't allow you to build it.

In the old days of computing - which was only a few years ago - everyone expected that the ever-shrinking transistor would simply enable faster and faster processors, thereby allowing single-threaded applications to run faster and faster. "This is an example of faith-based science," Patterson quipped in his opening, and he reminded everyone that he was among the people who, just a few years ago, simply assumed that chip-making processes would let chips keep cranking up their clocks while staying within a 100-watt thermal envelope. He showed what the chip roadmap looked like in the early 2000s, and then how that roadmap was revamped:

Chip Roadmaps

As you can see, as recently as 2005 the expectation was for chips well above 20 GHz by 2013. A few years later, the expectation shifted to perhaps 5 GHz by 2007, reaching up toward 8 GHz or so. Take a look at the actual Intel multicore line on the chart: we are stuck at around 3 GHz with x64 processors, and all that Moore's Law is getting us is more cores on a die with each passing year.

Single-thread performance is stalled, with some exceptions here and there. We have hit a thermal wall, and it is no longer practical to ramp up clock speeds. And so we started cookie-cutting cores onto dies: first two, then four or more. This is how we have been using Moore's Law to boost performance inside a single CPU socket. But we are quite possibly engaging in more faith-based science, according to Patterson.

"The whole industry is assuming that we will solve this problem that a lot of people have broken their picks on." To make his point, Patterson told the nerds at SC08 to imagine a 32-core laptop and then an upgrade to a 64-core laptop, and then asked if they thought it would do more work on the kinds of workloads that run on a laptop. "We'd all bet against that. But that is the bet this industry has made."
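Patterson's skepticism is, at bottom, the classic Amdahl's Law argument: the serial fraction of a workload caps the speedup no matter how many cores you add. A minimal sketch makes the point - the 50 percent parallel figure below is purely illustrative, not a number from the talk:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Maximum speedup on `cores` cores when only `parallel_fraction`
    of the work can run in parallel (Amdahl's Law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical laptop workload that is 50% parallelizable barely
# benefits from doubling 32 cores to 64:
for cores in (32, 64):
    print(f"{cores} cores: {amdahl_speedup(0.5, cores):.2f}x speedup")
```

With half the work stuck in serial code, 32 cores deliver under a 2x speedup, and doubling to 64 cores adds almost nothing - which is exactly the bet Patterson says the industry would lose.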
