The Register® — Biting the hand that feeds IT


Intel's Tri-Gate gamble: It's now or never

Deep dive into Chipzilla's last chance at the low end

Analysis There are two reasons why Intel is switching to a new process architecture: it can, and it must.

The most striking aspect of Intel's announcement of its new Tri-Gate process isn't the architecture itself, nor is it the eye-popping promises of pumped-up performance and dialed-down power. And it certainly isn't the Chipzillian marketeers' risible emphasis on "3-D".

No, it's that Intel has not only the know-how and ready cash, but also the never-surrender cojones to pull off such a breakthrough. The world's number-one microprocessor designer is not content to merely dominate some sectors of the market for silicon brains, it wants its parts to be in every thinking device, from the lowliest embedded systems to the brainiest HPC clusters.

And Intel is willing to invest – bet? – big to make that happen.

The move to a 22nm Tri-Gate process architecture is an important step for Intel's entire microprocessor line, but it's especially critical for the company's desire to enter the low-power world of tablets and smartphones – and whatever consumer-level world-changers might appear next.

But although Tri-Gate is undeniably a breakthrough – more on the deep-tech details in a moment – it was not an unexpected one. The basic idea behind what Intel calls Tri-Gate falls under a concept that the rest of the known universe calls FinFET – cute geek-speak for a vertical "fin" of silicon poking up into a field-effect transistor's gate.

A fin in a FET – get it? A number of FinFET-ish architectures are under study in labs around the world. Intel calls its version Tri-Gate because the fin has a left, right, and top surface upon which charge can flow through the gate.
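The payoff of those three conducting surfaces can be sketched with back-of-the-envelope arithmetic (not from the article): a planar transistor's channel width is limited to its silicon footprint, while a tri-gate fin conducts on both sidewalls plus the top, packing more effective channel width into the same footprint. The fin dimensions below are illustrative guesses, not Intel's actual 22nm figures.

```python
def planar_width(footprint_nm: float) -> float:
    """Planar FET: effective channel width equals the silicon footprint width."""
    return footprint_nm

def trigate_width(fin_height_nm: float, fin_width_nm: float) -> float:
    """Tri-gate fin: left sidewall + right sidewall + top all carry current."""
    return 2 * fin_height_nm + fin_width_nm

# Hypothetical dimensions: a fin 8 nm wide and 34 nm tall
w_planar = planar_width(8)     # 8 nm of channel per 8 nm of footprint
w_fin = trigate_width(34, 8)   # 76 nm of channel in the same 8 nm footprint

print(f"planar: {w_planar} nm, tri-gate: {w_fin} nm "
      f"({w_fin / w_planar:.1f}x effective width)")
```

More effective width means more drive current when the transistor is on, which is one reason the fin geometry can buy either speed or a lower operating voltage.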

FinFET-based process architectures are far from new. Intel has been futzing around with the concept since 2002, and the Taiwanese chip-baking giant TSMC demonstrated a 25nm FinFET design it dubbed "Omega" at about the same time.

What Intel has now done, however, is not merely to demo another FinFET concept, but to throw the full weight of its manufacturing prowess and financial clout behind FinFET Tri-Gate, and move its entire microprocessor line to the new process.

Intel's breakthrough isn't conceptual and lab-based, it's real and market-based.

And it's very, very expensive. Intel is spending $8bn to upgrade four fabs in Oregon, Arizona, and Israel to 22nm Tri-Gate, and to create a fifth one, also in Oregon, from scratch. At the same time, by the way, Intel is also investing a good chunk of its $10.2bn 2011 capex budget on the development fab for its upcoming 14nm process, and building that fab bigger than originally planned.

Eight billion dollars for 22nm Tri-Gate fabs is not chump change – especially to a company that recently bought McAfee for $7.7bn and Infineon for $1.4bn, and that spent over $7bn since early 2009 on other fab upgrades.

Oh and let's not forget the little $1bn annoyance that was the Cougar Point chipset flaw this winter.

But Intel has the resources it needs to move to 22nm Tri-Gate. After all, acquisitions, capex, and "oops" are not the only ways that the company is doling out cash these days. The company also spent $4bn to repurchase 189 million shares of common stock in its most recent quarter – a quarter during which it also paid just under $1bn in dividends to stockholders. (Are you listening, Steve Jobs?)

Not that Intel has a cash hoard equivalent to Apple's $66bn: its cash and cash equivalents, short-term investments, and trading assets total just under $12bn. But compare that with its closest microprocessor rival, the fabless AMD, which has just $1.75bn in the bank. Intel's market capitalization is $127bn; AMD's is $6bn.

Intel's other main rival, of course – and one that's growing in importance – is also fabless and also goes by a TLA: ARM. If Intel's engineering chops and deep pockets are the proof of the "because it can" reasoning behind its move to Tri-Gate, ARM – and, to a lesser extent, AMD – is the driving force behind "because it must".

Let's back up a few years to 2007, the year in which Intel introduced its most recent Really Big Thing™ in process technology. That's the year when the company replaced the traditional silicon dioxide gate dielectric in its microprocessors' transistors with a high-k metal gate, which increased gate capacitance, thus improving performance while significantly reducing current leakage.
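The trade-off behind that high-k switch can be shown with the standard parallel-plate approximation for gate capacitance, C/A = k·ε₀/t: a higher-k dielectric delivers the same (or greater) capacitance at a physically thicker layer, and thicker layers leak far less tunneling current. This sketch wasn't in the article, and the values are textbook ballpark figures, not Intel's process parameters.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def gate_cap_per_area(k: float, thickness_nm: float) -> float:
    """Parallel-plate approximation: C/A = k * eps0 / t, in F/m^2."""
    return k * EPS0 / (thickness_nm * 1e-9)

# SiO2 (k ~ 3.9) at a leaky ~1.2 nm, vs a hafnium-based high-k (k ~ 25)
# at 3 nm -- physically thicker, so far less gate tunneling leakage.
c_sio2 = gate_cap_per_area(3.9, 1.2)
c_hik = gate_cap_per_area(25.0, 3.0)

print(f"SiO2  : {c_sio2:.3e} F/m^2")
print(f"high-k: {c_hik:.3e} F/m^2 ({c_hik / c_sio2:.2f}x the capacitance)")
```

In other words, the high-k material let Intel raise gate capacitance (better transistor control, hence performance) while backing away from the tunneling cliff that ever-thinner silicon dioxide was falling off.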

The high-k metal gate helped make it possible to shrink that generation's process technology to 45nm. And we're all familiar with the process-shrinking mantra: smaller processes mean lower power, faster performance, less heat, Moore's Law lives to fight another day, and blah, blah, blah.

The difference, however, between 2007 and today was that four years ago Intel was riding high with nary a serious threat in sight (sorry, AMD fans). The company's move to the 45nm "Penryn" line was arguably prompted mostly by a desire to induce upgrades from users of the previous generation of 65nm "Conroe" processors, and to sell data center folks on the cost savings of lower-power parts.

That was then. Things have changed.

Re: Interesting thought...

No it wouldn't. Merging with someone like AMD would kill ARM.

AMD licensing ARM designs would be more interesting, and any of ARM's current licensees moving to the same 22nm Tri-Gate process would effectively kill Intel's chance of ever gaining any momentum in the high-performance, low-power market.

It's about time ARM made significant inroads into the desktop market. Intel have been making our (programmers') lives hell for long enough.


Unlikely?

"Or, for that matter, Intel could license the ARM architecture and start building its own ARM variants in its own fabs, using its 22nm Tri-Gate process"

I'd have thought it more like a dead cert than unlikely.

If Intel has the best process technology for low-power devices, ARM without question has a better CPU architecture for low-power devices like Smartphones. Put them together and what do you get? The best possible Smartphone CPU, that can either double battery runtime, or allow for a large cut in the weight of the phone without any loss of runtime.

If Intel suffers from the "not invented here" syndrome, smartphone manufacturers will have to choose between x86 architecture running on the best silicon, or ARM running on less-good silicon. It won't be so long before TSMC or some other chip foundry catches up with Intel enough to put ARM back at the front of the pack. Best for Intel if it's Intel that makes the best mobile device chips.


Whatever Intel can do with FinFET then ARM can also do

@"The move to a 22nm Tri-Gate process architecture is an important step for Intel's entire microprocessor line"

It is important because once the ARM A15 design is here, Intel will start to lose the future server market on processing power per watt. (The ARM A15 will allow the design of servers with more processing power for less electrical power than an Intel CPU based design. That's win-win for ARM and checkmate for Intel's bloated x86 design.)

Intel's market lead and dominance up until now has largely depended on Intel's ability to define what each new generation of x86 design should be, so they were always first to market with each new x86 generation. That meant each time the x86 design changed, AMD had to spend time playing catch up to add each new addition to the ever more bloated and increasingly complex x86 design. This constantly changing x86 design gave Intel the marketing lead over AMD, because Intel were always going to be first to market with each new generation.

Unfortunately for Intel, ARM are not playing that same game. ARM processor designs are more power efficient than x86 designs. So whatever chip making process Intel uses, they still can't win, as an ARM design based on that same chip making process will win over the x86 design. Ironically for Intel, their bloated x86 design that was so useful for holding back AMD, is now holding them back from competing with ARM.

Which leaves software (not hardware) as Intel's remaining x86 strength, and even that is just in its legacy support. It's a pain to recompile programs and some old programs won't be recompiled (the companies who created the code may not even be around any more). But for all new code, it's really not an issue. Plus ARM has a lot of software support. For example Linux has been on ARM for years. Android and iPhone already support ARM. Even Microsoft are looking to support ARM. Also ARM software development has had 28 years to evolve to a very high level of industry support with good free tools. So ARM is strong in software support as well as beating the x86 design in processing power per watt.

So Intel are in trouble. AMD isn't Intel's biggest competitor; it's the 200-plus companies that all license the ARM designs that are Intel's biggest threat, because together those companies could seriously harm Intel's market dominance.


Cheers.

The article made even someone as thick as me think that they understand the concepts; thanks.


Fudge

"The answer- to blow sugar up your ass."

I'm not sure that's all it's cracked up to be.

