Big Blue bigwig: Tiny processor knobs can't shrink forever

You cannae break the laws of physics - and 7nm is the limit

HPC blog While at IBM’s Smarter Computing Summit last week, I had the great pleasure of hearing Big Blue's Bernie Meyerson talk about limits to today’s tech, and the associated implications.

Bernie is IBM’s VP of Innovation and one of the rare technologist-scientist types who can clearly and directly explain highly technical concepts in a way that can be understood by a reasonably intelligent grey squirrel (and me too).

Even better, he’s highly entertaining and doesn’t hedge when it comes to stating what’s what in the world. Back in 2003 he predicted that Intel would never deliver on its promises of 4 to 5GHz CPUs and would, in fact, be forced to shift to multi-core processors.

Meyerson backed up his brash prediction (it was plenty brash back then) by sharing electron microscope images of individual atoms that showed they’re kind of lumpy. The problem with lumpy atoms is that when you use only a handful of them to build gates, they leak current like a sieve. When asked about this, Intel denied over and over that there was a problem – right up to the point when it announced it was scrapping its entire product strategy in favour of a multi-core approach.

So when Meyerson talks, I pay attention. And Meyerson is talking again.

In his presentation at the Pinehurst golf resort in North Carolina, he was again playing on the theme that we can’t shrink our way to higher performance any more. In fact, when it comes to chips, we have only a generation or two left before we reach the end of the line.

So where’s the end of the line? According to Bernie: 7 to 9 nanometers. When the features on a chip get to this minute size, you start to see quantum-mechanical effects that are “very nasty” and impair the performance of the processor's decision-making gates.

The problems at 7nm are profound to the point where there isn’t really any way around them – it’s just too damned small – and there isn’t a way to scale down an atom. It’s a fundamental limit, and it’s finally in sight. Chips in mass production these days have a 32nm or 22nm feature size, and 14nm is not far down the line.

Unfortunately, I can’t toss around the correct scientific terms to pretend I know what I’m talking about here. I have only my own deplorable notes for reference, and Meyerson’s time slot forced him to move pretty quickly through his material. But he was probably talking about quantum tunnelling, a phenomenon in which particles (such as electrons) travel through barriers – like those in very thin semiconductor gates – that they should not be able to cross. The result is that, relatively speaking, lots of current leaks from these tiny switches, which ramps up the device's power consumption.
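To see why thinner barriers leak exponentially more, here's a back-of-the-envelope sketch – mine, not Meyerson's – using the standard one-dimensional WKB estimate for tunnelling through a rectangular barrier. The 1eV barrier height and the nanometre-scale thicknesses are assumed purely for illustration:

```python
import math

# Physical constants (SI units)
HBAR = 1.0546e-34       # reduced Planck constant, J*s
M_ELECTRON = 9.109e-31  # electron mass, kg
EV = 1.602e-19          # one electronvolt in joules

def tunnel_probability(thickness_m, barrier_ev=1.0):
    """WKB estimate of transmission through a rectangular barrier:
    T ~ exp(-2 * kappa * d), with kappa = sqrt(2*m*(V-E)) / hbar.
    Assumes the electron's energy sits barrier_ev below the barrier top."""
    kappa = math.sqrt(2 * M_ELECTRON * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * thickness_m)

# Halving an (assumed) 1nm insulating layer to 0.5nm boosts the
# leakage probability by more than two orders of magnitude.
t_thick = tunnel_probability(1.0e-9)
t_thin = tunnel_probability(0.5e-9)
print(f"1.0nm: {t_thick:.2e}  0.5nm: {t_thin:.2e}  ratio: {t_thin / t_thick:.0f}x")
```

The point of the sketch is the exponential in the exponent: shave a few atoms off the barrier and leakage doesn't grow a bit, it grows a hundredfold.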

Meyerson also talked about the limitations facing us on the storage side. Like most great stories (and many great ideas too), it starts in a bar. In this case, it was Bernie in a bar with a bunch of other smart guys, probably knocking back drinks that aren’t accessorised with little umbrellas. Like all barroom conversations, the topic eventually turned to magnetic storage density. More specifically: how many atoms would you need to reliably store a single bit of data?

This prompted some non-barroom research and scientific activity. The resulting answer? Twelve. It takes twelve atoms to reliably store a bit of data. Any fewer and you lose stability, meaning that parts of the data might disappear, or morph into something you didn’t store. This is related to the same quantum effects discussed above and is ultimately the result of the fact that we can’t scale atoms down to a handier size.

From what Meyerson said, it sounds like we have a bit more room before we start to run up against the limit on storage density. If my notes are correct, we won’t approach the 12-atom limit until density improves by around 100 times. Right now, 1TB per platter is the highest density available. Theoretically, then, we may be able to get to 100TB per platter and 300TB per drive at maximum density.

So how long do we have until we hit the limit? It depends on how fast density grows. Historically, we’ve seen density grow anywhere between 20 per cent and 100 per cent per year. Lately (the last decade or so), growth has ranged between 20 per cent and 40 per cent annually, meaning that we might hit the twelve-atom limit in as few as 13 years or as many as 25.
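That 13-to-25-year range falls straight out of compound growth: at an annual growth rate r, multiplying density by 100 takes log(100)/log(1+r) years. A quick sketch (my arithmetic, not Meyerson's):

```python
import math

def years_to_multiply(target_factor, annual_growth):
    """Years of compound growth at `annual_growth` (e.g. 0.40 = 40 per cent)
    needed to multiply density by `target_factor`."""
    return math.log(target_factor) / math.log(1 + annual_growth)

fast = years_to_multiply(100, 0.40)  # roughly 13.7 years at 40%/year
slow = years_to_multiply(100, 0.20)  # roughly 25.3 years at 20%/year
print(f"40%/yr: {fast:.1f} years; 20%/yr: {slow:.1f} years")
```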

That’s an eternity in the tech business – maybe even long enough for someone to figure out how to shrink atoms down to a more convenient size. ®
