
Big Blue bigwig: Tiny processor knobs can't shrink forever

You cannae break the laws of physics - and 7nm is the limit

HPC blog While at IBM’s Smarter Computing Summit last week, I had the great pleasure of hearing Big Blue's Bernie Meyerson talk about limits to today’s tech, and the associated implications.

Bernie is IBM’s VP of Innovation and one of the rare technologist-scientist types who can clearly and directly explain highly technical concepts in a way that can be understood by a reasonably intelligent grey squirrel (and me too).

Even better, he’s highly entertaining and doesn’t hedge when it comes to stating what’s what in the world. Back in 2003 he predicted that Intel would never deliver on its promises of 4 to 5GHz CPUs and would, in fact, be forced to shift to multi-core processors.

Meyerson backed up his brash prediction (it was plenty brash back then) by sharing electron microscope images of individual atoms that showed they’re kind of lumpy. The problem with lumpy atoms is that when you use only a handful of them to build gates, they leak current like a sieve. When asked about this, Intel denied over and over that there was a problem – right up to the point when it announced it was scrapping its entire product strategy in favour of a multi-core approach.

So when Meyerson talks, I pay attention. And Meyerson is talking again.

In his presentation at the Pinehurst golf resort in North Carolina, he was again playing on the theme that we can’t shrink our way to higher performance any more. In fact, when it comes to chips, we have only a generation or two left before we reach the end of the line.

So where’s the end of the line? According to Bernie: 7 to 9 nanometers. When the features on a chip get to this minute size, you start to see “very nasty” quantum mechanical effects that impair the performance of the processor's decision-making gates.

The problems at 7nm are profound to the point where there isn’t really any way around them – it’s just too damned small – and there isn’t a way to scale down an atom. It’s a fundamental limit, and it’s finally in sight. Chips in mass production these days have a 32nm or 22nm feature size, and 14nm is not far down the line.

Unfortunately, I can’t toss around the correct scientific terms to pretend I know what I’m talking about here. I have only my own deplorable notes for reference; plus Meyerson’s time slot forced him to move pretty quickly through his material. But he was probably talking about quantum tunneling, a phenomenon in which particles (such as electrons) travel through barriers – like those in very thin semiconductor gates – that they shouldn't be able to cross. This results in a lot of current, relatively speaking, leaking from these tiny switches, which ramps up the device's power consumption.
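To get a feel for why a gate only a few atoms thick leaks so badly, here's a back-of-the-envelope sketch – mine, not Meyerson's – using the standard WKB estimate for tunneling through a rectangular barrier. The 3eV barrier height and the thicknesses are illustrative assumptions, not figures from the talk.

```python
# Rough WKB estimate of how electron tunneling probability scales with
# barrier thickness. Illustrative only - the barrier height and the
# thicknesses are assumed values, not figures from Meyerson's talk.
import math

HBAR = 1.0546e-34   # reduced Planck constant, J*s
M_E = 9.109e-31     # electron mass, kg
EV = 1.602e-19      # one electron-volt in joules

def tunnel_probability(thickness_nm, barrier_ev=3.0):
    """Approximate transmission through a rectangular barrier (WKB):
    T ~ exp(-2 * kappa * d), with kappa = sqrt(2 * m * dE) / hbar."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # per metre
    d = thickness_nm * 1e-9                              # metres
    return math.exp(-2 * kappa * d)

for d in (3.0, 2.0, 1.0, 0.5):
    print(f"{d:.1f} nm barrier -> T ~ {tunnel_probability(d):.1e}")

# Thinning the barrier from 3nm to 1nm raises the tunneling probability
# by roughly 15 orders of magnitude - that exponential sensitivity is
# why a handful of lumpy atoms makes such a poor insulator.
```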

Meyerson also talked about the limitations facing us on the storage side. Like most great stories (and many great ideas too), it starts in a bar. In this case, it was Bernie in a bar with a bunch of other smart guys, probably knocking back drinks that aren’t accessorised with little umbrellas. Like all barroom conversations, the topic eventually turned to magnetic storage density. More specifically: how many atoms would you need to reliably store a single bit of data?

This prompted some non-barroom research and scientific activity. The resulting answer? Twelve. It takes twelve atoms to reliably store a bit of data. Any fewer and you lose stability, meaning that parts of the data might disappear, or morph into something you didn’t store. This is related to the same quantum effects discussed above and is ultimately the result of the fact that we can’t scale atoms down to a handier size.

From what Meyerson said, it sounds like we have a bit more room before we start to run up against the limit on storage density. If my notes are correct, we won’t approach the 12-atom limit until areal density increases by around 100 times. Right now, 1TB per platter is the highest density available. Theoretically, we may be able to get to 100TB per platter and 300TB per drive at maximum density.

So how long do we have until we hit the limit? It depends on how fast density grows. Historically, we’ve seen density grow anywhere between 20 per cent and 100 per cent per year. Lately (the last decade or so), growth has ranged between 20 per cent and 40 per cent annually, meaning that we might hit the twelve-atom limit in as few as 13 years, or as many as 25.
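Assuming steady compound growth, that 13-to-25-year window falls straight out of the arithmetic. A quick sketch, plugging the figures above into the compound-growth formula:

```python
# How long until areal density grows 100x at historical growth rates.
# Uses the figures reported above: ~100x headroom to the twelve-atom
# limit, and 20-40 per cent annual density growth.
import math

TARGET_FACTOR = 100.0  # density increase needed to approach the limit

def years_to_factor(annual_growth, factor=TARGET_FACTOR):
    """Years of compound growth needed to multiply density by `factor`."""
    return math.log(factor) / math.log(1.0 + annual_growth)

for rate in (0.20, 0.30, 0.40):
    print(f"{rate:.0%} per year -> about {years_to_factor(rate):.0f} years")

# 40% per year gives roughly 14 years; 20% per year gives roughly 25,
# which brackets the 13-to-25-year estimate above.
```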

That’s an eternity in the tech business – maybe even long enough for someone to figure out how to shrink atoms down to a more convenient size. ®
