Intel plays Switzerland in the cloud wars

Who will bear the ARM 'standard'?

Comment Whenever someone starts waving "standards" around, it is always a prelude to war. With the launch of the Open Data Center Alliance today by 70 IT organizations (some of which are also IT suppliers), Intel is trying to position itself as the neutral player in the coming cloud wars. Switzerland benefited by being the banker for warring countries, and Intel seeks to benefit by maintaining and extending its dominance in the server racket.

In the computer industry, a standard is a bit of a double-edged sword. It is meant to be some kind of peace offering, a compromise between warring factions who argue over how different hardware and software components plug into each other or how people and devices talk to software running on a system (that's the cooperative edge). But at the same time, a standard is brandished like a weapon (that's the competitive edge) to wound other players and chase them away from the pile of money in the data center or on the desktop.

No one ever argues against standards, of course. It is a bit like arguing against world peace or trying to persuade everyone that they are not entitled to life, liberty, and the pursuit of happiness. But standards in the computer business - real standards, developed cooperatively, endorsed by vendors, and supported by the budget dollars of end users - are hard to come by.

More often than not, the standard is set by the last man standing in a particular market. That's how we got TCP/IP instead of myriad other network protocols. That's why you can't kill Ethernet no matter how hard you try. And that's also why there still is not a Unix standard, or blade server form factor standards, or even virtual machine standards - and there never will be. The "might makes right" aspect of standards is why people refer to x64-based servers as "industry standard servers" when what they really mean is "volume servers that have crushed most other platforms out of existence or driven them into legacy status."

In fact, at the launch of the Open Data Center Alliance in San Francisco today, Kirk Skaugen, general manager of Intel's Data Center Group – which makes chips and chipsets for servers – used the ramp of the x86 and now the x64 server as proof that Intel knew how to create standards and was therefore justified in being the one and only technical advisor (and non-voting member) of the ODCA.

Skaugen walked down memory lane, reminding everyone that Intel had a very tiny share of the server racket in the early 1990s. By 1995, when the Pentium Pro chip was announced, heralding a new era in server computing, the market was consuming 1 million boxes, with Intel having under 10 per cent share. By 2000, thanks to the dot-com buildout and the ascendancy of Linux for Webby infrastructure and supercomputing, the market had grown to 4 million units.

And with Intel bringing together operating system, other software, and hardware players, the market is now at 7 million units, according to Skaugen (more like 8 million, really, until the virtualization blowback kicks in), with Intel having the vast majority of shipments - somewhere between 96 and 97 per cent most quarters. Skaugen said that nine out of ten systems running on clouds today use a Xeon processor, and he pointed to the future "Sandy Bridge" Xeons, calling them the "foundation of the next-generation cloud."

What Skaugen did not say is that this decade and a half of x86 and x64 server sales created the problem that required server virtualization in the first place. Had proper Unix operating systems and sophisticated workload management tools come to market along with ever-improving Intel chips, then server utilization would have been a lot higher and Intel a lot poorer.

Companies may have saved money on iron by moving off proprietary mainframes and minis, and then Unix boxes, but they ended up paying for it with low server utilization, soaring data center costs, high software licensing fees, and so on. And now, with server virtualization, companies are trying to get back to the good old days of a centralized and virtualized utility to support applications. (Don't get me wrong. Unix needed virtualization, too, as did the proprietary OS/400 and OpenVMS operating systems. They don't run flat out all day, either. But they did a lot better than Linux and Windows.)

The fact that x64 chips dominate the server chip market does not make them a standard, however much Intel might want it to be so. There's no open spec for chip and system design. There is no community steering group. Sure, we have USB and PCI-Express peripheral standards, memory module standards, disk form factor and rack form factor standards, and all kinds of other standards. But there is no way for Intel's or AMD's customers, partners, and rivals to have a say in the future of the x64 server platform. Did Intel ask you what you wanted in its next chips? No. You waited to see what it would do next, just like the rest of us.
