All I want for Christmas is ...

David Norfolk puts on his Santa virtualisation suit...

So, what is the good little developer getting for Christmas? If you didn't overflow any buffers this year (well, not many anyway) and followed all the “best practices” that nice Mr Gates told you about and didn't play about with nasty little ragamuffins like Linus, then you might get a nice shiny new workstation in your stocking.

It'll probably be using one of the 64-bit chips the Xmas Elves at Intel World are turning out, since Intel announced the 64-bit dual-core Xeon in plenty of time for Xmas. That nice Mr Dell has turned them into something that'll run transactions at less than a dollar a time, which sounds nice. Well, it does if he hasn't cut too many corners doing it - and I wonder if he bought his hardware at RRP?

But why would you want 64 bits? Can you count up that big? Well, bigger numbers are always better, of course, and Chris Furlong of QlikTech points out that Intel 32-bit chips tend to be just a little broken with more than 3-4 Gbytes of RAM - and as his real-time BI product builds, essentially, an in-memory database, it simply flies with 64 bits.
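The 3-4 Gbyte ceiling is just pointer arithmetic. A back-of-the-envelope sketch (the function name is mine, not anything from QlikTech): a flat 32-bit pointer can address 2^32 bytes, and once the OS and memory-mapped devices have taken their slice, an application is typically left with around 3 Gbytes. A 64-bit pointer makes the ceiling academic.

```python
# Why 32-bit chips run out of steam past 3-4 Gbytes of RAM:
# a flat pointer of n bits can address at most 2**n bytes.

def addressable_gib(pointer_bits):
    """Bytes addressable by a flat pointer, expressed in GiB."""
    return 2 ** pointer_bits / 2 ** 30

print(addressable_gib(32))  # 4.0 GiB -- the 32-bit ceiling
print(addressable_gib(64))  # 17179869184.0 GiB -- effectively unlimited
```

In practice a 32-bit Windows process saw even less than 4 Gbytes (2-3 Gbytes of user address space), which is why an in-memory database is the textbook beneficiary of the move to 64 bits.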

It's not the only product using in-memory data structures, of course, and Martin Richmond-Coggan, VP EMEA for Applix, points out that Applix TM1 has been exploiting 64 bits nicely on UNIX for some time (but we promised Santa not to mention UNIX). The cheapness of RAM has been a real Christmas present for people who weren't enthusiastic about virtual memory (a lesson for all of us – "best practice" changes if the environment changes), just in time for some very large applications to appear that can exploit all the RFID data that is promised to arrive any day now.

But for many of us, 64 bits may just be a bit of a status symbol for a while - although Aldi, the supermarket, has already sold dual-core Pentium base stations to the general public for as little as £749.99.

However, perhaps we'll want the new chips anyway, because things are getting really hot out there - as in frying eggs - and electricity (and air conditioning) costs money. The new multicore chips behave as efficient multi-CPU chips, which means that with luck you can crank down the GHz a bit for the same throughput, which cools things down and needs less power. Coooooool... Well, not luck exactly; this needs developers to use "best practices" that allow things to multitask or multithread for greater throughput (but how many developers are really happy with multi-threading applications?).
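The trade-off can be sketched with a simplified model (my numbers, not Intel's): throughput scales roughly linearly with cores times clock, while dynamic power scales much more steeply with clock, since dropping the frequency lets you drop the voltage too - roughly cubic under ideal voltage scaling. Swap one fast core for two slower ones and, if the code parallelises perfectly, you keep the throughput for a fraction of the power.

```python
# Idealised multicore trade-off: assumes perfectly parallel code
# and cube-law dynamic power (frequency scaled with voltage).

def relative_throughput(cores, ghz, baseline_ghz=3.4):
    """Throughput vs one core at the baseline clock."""
    return cores * ghz / baseline_ghz

def relative_power(cores, ghz, baseline_ghz=3.4):
    """Dynamic power vs one core at the baseline clock."""
    return cores * (ghz / baseline_ghz) ** 3

# One 3.4 GHz core vs two 1.7 GHz cores:
print(relative_throughput(1, 3.4), relative_power(1, 3.4))  # 1.0 1.0
print(relative_throughput(2, 1.7), relative_power(2, 1.7))  # 1.0 0.25
```

The catch, of course, is the "perfectly parallel" assumption - which is exactly why the multi-threading question above matters.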

Nevertheless, are we wise giving such powerful toys to innocent programmers? Most people use about 10 per cent of the functionality of their software (hopefully a different 10 per cent each time) and most CPUs on people's desks are idle most of the time. Most computing environments aren't like the mainframe with a job scheduler (such as Cybermation can supply), happily (and reliably) running above 90 per cent utilisation.

Most people, games players excepted, aren't going to welcome the chance to buy a new PC, just because the programs hanging from their Xmas tree crawl rather than run with less than about 10 Gbytes of cache. There's a reasonable view that a software vendor might have more happy customers (and, thus, more secure developers) if it gives its developers the slowest machine its paying customers might want to use, not the fastest.

Perhaps it should buy them the fastest machines as real Xmas presents, just to take home to play games on. Or, since working 24x7 seems to be the new business ethic, put such a machine in the coffee room, for use out of hours. Then, when something goes wrong, the programmer responsible may be on site....

OK, so perhaps I've been at the eggnog already, but “faster, better, newer...” is beginning to get on my nerves. At Intel's dual core Xeon launch, Tikiri Wanduragala of IBM was just about the only speaker who wasn't just drooling over the technology; he came up with a real business message that recognised that not everyone needed the latest and fastest. And, of course, the OTT new machine I gave myself for my birthday isn't being used while I install all the outstanding patches available for a Media Centre PC running Windows XP Pro and find out why the ones that won't install are being uncooperative, and this isn't improving my temper either....

In the meantime, I am actually working on a Windows 98 machine and, although the word processor has lots of functionality to mop up what little power a 500 MHz Pentium 3 has, mere word processing isn't actually any faster on the new 3.4 GHz Pentium 4 machine I've just mentioned. When I come to think of it, word processing wasn't much slower running on MS-DOS on a PC/AT. Perhaps Bill Gates needs to buy me a really fast machine as an Xmas present, before I start thinking that well-designed DOS-based software wasn't so bad after all.

But hold on: I have thought of a way for a developer to justify a super machine for Xmas. Run virtualisation software on it (from VMware, say) and you can simulate all the boring old legacy PCs your poor customers actually use, at the same time.

You can even develop for something like OS/2, if that still makes sense - even though I doubt that OS/2 will install natively on anything you can actually buy today - because you can emulate a perfect 1995-era PC just for OS/2.

Well, you can if your trusty old PC can cope with the load and you are confident of current Intel chips' ability to cope with true virtualisation (Intel has only just started to deliver full hardware support for virtualisation with "Intel® Virtualization Technology" - what an imaginative name - which may help reduce the performance impact of virtualisation); and you're really sure that your emulation is accurate...

To be fair, PC virtualisation seems to be rock solid today, although Richard Garsthagen of VMware tells me that for real production work, it has its own microkernel OS specialised for virtualisation and that this will be able to take full advantage of Intel's new technologies. The trick, as people with mainframe experience will know, is to keep the platform management channels open and freely accessible (i.e. responding fast) while some virtual partition is occupying resources and misbehaving in ways you haven't anticipated.

However, the bottom line is that you might just get that new workstation for Xmas after all – and you might even need it. But I didn't say which Xmas... ®

David Norfolk is the author of IT Governance, published by Thorogood.
