
Intel plays Switzerland in the cloud wars

Who will bear the ARM 'standard'?


Comment Whenever someone starts waving "standards" around, it is always a prelude to war. With the launch of the Open Data Center Alliance today by 70 IT organizations (some of which are IT suppliers), Intel is trying to position itself as the neutral player in the coming cloud wars. Switzerland profited by banking for warring countries, and Intel seeks to profit by maintaining and extending its dominance in the server racket.

In the computer industry, a standard is a bit of a double-edged sword. It is meant to be a kind of peace offering, a compromise between warring factions who argue over how different hardware and software components plug into each other or how people and devices talk to software running on a system (that's the cooperative edge). But at the same time, a standard is brandished like a weapon (that's the competitive edge) to wound other players and chase them away from the pile of money in the data center or on the desktop.

No one ever argues against standards, of course. It is a bit like arguing against world peace or trying to persuade everyone that they are not entitled to life, liberty, and the pursuit of happiness. But standards in the computer business - real standards, developed cooperatively, endorsed by vendors, and supported by the budget dollars of end users - are hard to come by.

More often than not, the standard is set by the last man standing in a particular market. That's how we got TCP/IP instead of myriad other network protocols. That's why you can't kill Ethernet no matter how hard you try. And that's also why there still is not a Unix standard, or a blade server form factor standard, or even a virtual machine standard - and there never will be. The "might makes right" aspect of standards is why people refer to x64-based servers as "industry standard servers" when what they really mean to say is "volume servers that have crushed most other platforms out of existence or driven them into legacy status."

In fact, at the launch of the Open Data Center Alliance in San Francisco today, Kirk Skaugen, general manager of Intel's Data Center Group – which makes chips and chipsets for PCs and servers – used the ramp of the x86 and now the x64 server as proof that Intel knew how to create standards and was therefore justified in being the one and only technical advisor (and non-voting member) of the ODCA.

Skaugen walked down memory lane, reminding everyone that in the mid-1990s, when the Pentium Pro chip was announced, heralding a new era in server computing, Intel had a very tiny share of the server racket. By 1995, the market was consuming 1 million boxes a year, with Intel holding under 10 per cent share. By 2000, thanks to the dot-com buildout and the ascendancy of Linux for Webby infrastructure and supercomputing, the market had grown to 4 million units.

And with Intel bringing together operating system and other software and hardware players, the market is now at 7 million units according to Skaugen (more like 8 million, really, until the virtualization blowback kicks in), with Intel having the vast majority of shipments - somewhere between 96 and 97 per cent most quarters. Skaugen said that nine out of ten systems running on clouds today use a Xeon processor, and brought up the future "Sandy Bridge" Xeons, referring to them as the "foundation of the next-generation cloud."

What Skaugen did not say is that this decade and a half of x86 and x64 server sales created the problem that required server virtualization in the first place. Had proper Unix operating systems and sophisticated workload management tools come to market along with ever-improving Intel chips, then server utilization would have been a lot higher and Intel a lot poorer.

Companies may have saved money on iron by moving off proprietary mainframes and minis, and then Unix boxes, but they ended up paying for it with low server utilization, soaring data center costs, high software licensing fees, and so on. And now, with server virtualization, companies are trying to get back to the good old days of a centralized and virtualized utility to support applications. (Don't get me wrong. Unix needed virtualization, too, as did the proprietary OS/400 and OpenVMS operating systems. They don't run flat out all day, either. But they did a lot better than Linux and Windows.)

The fact that x64 chips dominate the server chip market does not mean that they're a standard, however much Intel might want it to be so. There's no open spec for chip and system design. There is no community steering group. Sure, we have USB and PCI-Express peripheral standards, memory module standards, disk form factor and rack form factor standards, and all kinds of other standards. But there is no way for Intel's or AMD's customers, partners, and rivals to have a say in the future of the x64 server platform. Did Intel ask you what you wanted in its next chips? No. You waited to see what it would do next, just like the rest of us.



