Big Blue bigwig: Tiny processor knobs can't shrink forever

You cannae break the laws of physics - and 7nm is the limit

HPC blog While at IBM’s Smarter Computing Summit last week, I had the great pleasure of hearing Big Blue's Bernie Meyerson talk about limits to today’s tech, and the associated implications.

Bernie is IBM’s VP of Innovation and one of the rare technologist-scientist types who can clearly and directly explain highly technical concepts in a way that they can be understood by a reasonably intelligent grey squirrel (and me too).

Even better, he’s highly entertaining and doesn’t hedge when it comes to stating what’s what in the world. Back in 2003 he predicted that Intel would never deliver on its promises of 4 to 5GHz CPUs and would, in fact, be forced to shift to multi-core processors.

Meyerson backed up his brash prediction (it was plenty brash back then) by sharing electron microscope images of individual atoms that showed they’re kind of lumpy. The problem with lumpy atoms is that when you use only a handful of them to build gates, they leak current like a sieve. When asked about this, Intel denied over and over that there was a problem – right up to the point when it announced it was scrapping its entire product strategy in favour of a multi-core approach.

So when Meyerson talks, I pay attention. And Meyerson is talking again.

In his presentation at the Pinehurst golf resort in North Carolina, he was again playing on the theme that we can’t shrink our way to higher performance any more. In fact, when it comes to chips, we have only a generation or two left before we reach the end of the line.

So where’s the end of the line? According to Bernie: 7 to 9 nanometres. When the features on a chip get down to this minute size, you start to see quantum mechanical effects that are “very nasty” and that impair the performance of the processor's decision-making gates.

The problems at 7nm are profound to the point where there isn’t really any way around them – it’s just too damned small – and there isn’t a way to scale down an atom. It’s a fundamental limit, and it’s finally in sight. Chips in mass production these days have a 32nm or 22nm feature size, and 14nm is not far down the line.

Unfortunately, I can’t toss around the correct scientific terms to pretend I know what I’m talking about here. I have only my own deplorable notes for reference; plus Meyerson’s time slot forced him to move pretty quickly through his material. But he was probably talking about quantum tunnelling, a phenomenon in which particles (such as electrons) pass through barriers – like the very thin gates in modern semiconductors – that classical physics says they should not cross. The result is a lot of current, relatively speaking, leaking from these tiny switches, which ramps up the device's power consumption.
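To get a feel for why thinner barriers leak so much more, here's a rough back-of-the-envelope sketch using the textbook rectangular-barrier tunnelling estimate. This is my own illustration, not anything from Meyerson's talk, and the barrier height and widths are made-up round numbers – the point is only the exponential scaling.

```python
import math

# Textbook WKB-style estimate: transmission probability falls off as
# exp(-2 * kappa * d), where d is the barrier width and kappa depends on
# how far the barrier height sits above the electron's energy.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31      # electron mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def tunnel_probability(width_nm: float, barrier_ev: float = 1.0) -> float:
    """Approximate probability that an electron tunnels through a
    rectangular barrier of the given width (nm) and height (eV)."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Halving a ~1nm barrier multiplies the leakage by well over 100x,
# which is the exponential cliff that makes ever-thinner gates leaky.
leaky = tunnel_probability(1.0)
leakier = tunnel_probability(0.5)
```

The numbers themselves are illustrative; what matters is that leakage grows exponentially, not linearly, as the barrier shrinks – which is why each process generation pays a steeper power price than the last.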

Meyerson also talked about the limitations facing us on the storage side. Like most great stories (and many great ideas too), it starts in a bar. In this case, it was Bernie in a bar with a bunch of other smart guys, probably knocking back drinks that aren’t accessorised with little umbrellas. Like all barroom conversations, the topic eventually turned to magnetic storage density. More specifically: how many atoms would you need to reliably store a single bit of data?

This prompted some non-barroom research and scientific activity. The resulting answer? Twelve. It takes twelve atoms to reliably store a bit of data. Any fewer and you lose stability, meaning that parts of the data might disappear, or morph into something you didn’t store. This is related to the same quantum effects discussed above and is ultimately the result of the fact that we can’t scale atoms down to a handier size.

From what Meyerson said, it sounds like we have a bit more room before we start to run up against the limit on storage density. If my notes are correct, we won’t approach the 12-atom limit until we get to around 100 times today's density. Right now, 1TB per platter is the highest density available. Theoretically, we may be able to get to 100TB per platter and 300TB per drive at maximum density.

So how long do we have until we hit the limit? It depends on how fast density grows. Historically, we’ve seen density grow anywhere between 20 per cent and 100 per cent per year. Lately (the last decade or so), growth has ranged between 20 per cent and 40 per cent annually, meaning that we might hit the twelve-atom limit in as few as 13 years, or as many as 25.
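The 13-to-25-year range falls straight out of compound growth, and it's easy enough to sanity-check. A minimal sketch, assuming steady annual growth in areal density until the 100x ceiling:

```python
import math

def years_to_limit(annual_growth: float, factor: float = 100.0) -> float:
    """Years of compound growth needed to multiply density by `factor`.

    Solves (1 + annual_growth) ** years == factor for `years`.
    """
    return math.log(factor) / math.log(1 + annual_growth)

fast = years_to_limit(0.40)  # 40%/yr: roughly 13.7 years
slow = years_to_limit(0.20)  # 20%/yr: roughly 25.3 years
```

Plugging in the 40 and 20 per cent growth rates gives roughly 14 and 25 years respectively, which squares with the figures above.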

That’s an eternity in the tech business – maybe even long enough for someone to figure out how to shrink atoms down to a more convenient size. ®
