All I want for Christmas is ...

David Norfolk puts on his Santa virtualisation suit...


So, what is the good little developer getting for Christmas? If you didn't overflow any buffers this year (well, not many anyway) and followed all the “best practices” that nice Mr Gates told you about and didn't play about with nasty little ragamuffins like Linus, then you might get a nice shiny new workstation in your stocking.

It'll probably be using one of the 64-bit chips the Xmas Elves at Intel World are turning out, since Intel announced its 64-bit dual-core Xeon in plenty of time for Xmas. That nice Mr Dell has turned them into something that'll run transactions at less than a dollar a time, which sounds nice. Well, it does if he hasn't cut too many corners doing it - and I wonder if he bought his hardware at RRP?

But why would you want 64 bits? Can you count up that big? Well, bigger numbers are always better, of course, and Chris Furlong of QlikTech points out that Intel's 32-bit chips tend to be just a little broken with more than 3-4 Gbytes of RAM - and as his real-time BI product builds, essentially, an in-memory database, it simply flies with 64 bits.
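The sums behind that 3-4 Gbyte ceiling are simple enough to sketch (Python here, purely to show the arithmetic):

```python
# Addressable memory is bounded by pointer width: a 32-bit address
# can name 2**32 distinct bytes, i.e. 4 GiB -- and in practice the OS
# and memory-mapped devices eat a slice of that, hence "3-4 Gbytes".
GIB = 1024 ** 3

addressable_32 = 2 ** 32   # 4 GiB
addressable_64 = 2 ** 64   # 16 EiB -- far beyond any workstation of the day

print(addressable_32 // GIB, "GiB")          # 4 GiB
print(addressable_64 // (1024 ** 6), "EiB")  # 16 EiB
```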

It's not the only product using in-memory data structures, of course, and Martin Richmond-Coggan, VP EMEA for Applix, points out that Applix TM1 has been exploiting 64 bits nicely on UNIX for some time (but we promised Santa not to mention UNIX). The cheapness of RAM has been a real Christmas present for people who weren't enthusiastic about virtual memory (a lesson for all of us – "best practice" changes if the environment changes), just in time for some very large applications to appear that can exploit all the RFID data that is promised to arrive any day now.

But for many of us, 64 bits may just be a bit of a status symbol for a while - although Aldi, the supermarket, has already sold dual-core Pentium base stations to the general public for as little as £749.99.

However, perhaps we'll want the new chips anyway, because things are getting really hot out there – as in frying eggs – and electricity (and air conditioning) costs money. The new multicore chips behave as efficient multi-CPU chips, which means that with luck you can crank down the GHz a bit for the same throughput, which cools things down and needs less power. Coooooool... Well, not luck exactly; this needs developers to use “best practices” that allow things to multitask or multithread for greater throughput (but how many developers are really happy with multi-threading applications?).
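A minimal sketch of the restructuring involved: split a job into independent chunks and hand them to a worker pool, so a multicore chip can spread them across cores. (Python for brevity; note that CPython's global interpreter lock means threads there only really help I/O-bound work, but it's the shape that matters, and `checksum` is just a stand-in workload.)

```python
from concurrent.futures import ThreadPoolExecutor

def checksum(chunk):
    """Stand-in for some independent unit of work."""
    return sum(chunk)

def parallel_checksum(data, workers=4):
    # Split the job into independent chunks, one per worker,
    # then let the pool schedule them across the available cores.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(checksum, chunks))

print(parallel_checksum(list(range(1000))))  # 499500
```

The hard part, as the column suggests, isn't the pool plumbing - it's proving your units of work really are independent.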

Nevertheless, are we wise to give such powerful toys to innocent programmers? Most people use about 10 per cent of the functionality of their software (hopefully a different 10 per cent each time) and most CPUs on people's desks are idle most of the time. Most computing environments aren't like the mainframe with a job scheduler (such as Cybermation can supply), happily (and reliably) running above 90 per cent utilisation.

Most people, games players excepted, aren't going to welcome the chance to buy a new PC, just because the programs hanging from their Xmas tree crawl rather than run with less than about 10 Gbytes of cache. There's a reasonable view that a software vendor might have more happy customers (and, thus, more secure developers) if it gives its developers the slowest machine its paying customers might want to use, not the fastest.

Perhaps it should buy them the fastest machines as real Xmas presents, just to take home to play games on. Or, since working 24x7 seems to be the new business ethic, put such a machine in the coffee room, for use out of hours. Then, when something goes wrong, the programmer responsible may be on site....

OK, so perhaps I've been at the eggnog already, but “faster, better, newer...” is beginning to get on my nerves. At Intel's dual core Xeon launch, Tikiri Wanduragala of IBM was just about the only speaker who wasn't just drooling over the technology; he came up with a real business message that recognised that not everyone needed the latest and fastest. And, of course, the OTT new machine I gave myself for my birthday isn't being used while I install all the outstanding patches available for a Media Centre PC running Windows XP Pro and find out why the ones that won't install are being uncooperative, and this isn't improving my temper either....

In the meantime, I am actually working on a Windows 98 machine and, although the word processor has lots of functionality to mop up what little power a 500 MHz Pentium 3 has, mere word processing isn't actually any faster on the new 3.4 GHz Pentium 4 machine I've just mentioned. When I come to think of it, word processing wasn't much slower running on MS-DOS on a PC/AT. Perhaps Bill Gates needs to buy me a really fast machine as an Xmas present, before I start thinking that well-designed DOS-based software wasn't so bad after all.

But hold on: I have thought of a way for a developer to justify a super machine for Xmas. Run virtualisation software on it (from VMware, say) and you can simulate all the boring old legacy PCs your poor customers actually use, at the same time.

You can even develop for something like OS/2, if that still makes sense, even though I doubt that OS/2 will install natively on anything you can actually buy today, because you can emulate a perfect 1995-era PC just for OS/2.

Well, you can if your trusty old PC can cope with the load and you are confident of current Intel chips' ability to cope with true virtualisation (Intel has only just started to deliver full hardware support for virtualisation with "Intel® Virtualization Technology" – what an imaginative name – which may help reduce the performance impact of virtualisation); and you're really sure that your emulation is accurate...
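On Linux, at least, you can check whether a chip advertises that hardware support: the `vmx` CPU flag marks Intel VT, and `svm` is AMD's equivalent. A small sketch (the function takes the text of `/proc/cpuinfo` as an argument so it can be tried anywhere):

```python
def has_hw_virtualisation(cpuinfo_text):
    """Return True if the CPU flags advertise Intel VT ('vmx')
    or AMD's equivalent ('svm')."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # e.g. "flags\t\t: fpu vme pae vmx sse2 ..."
            flags.update(line.split(":", 1)[1].split())
    return bool(flags & {"vmx", "svm"})

# On a real Linux box:
#   has_hw_virtualisation(open("/proc/cpuinfo").read())
sample = "flags\t\t: fpu vme pae vmx sse2"
print(has_hw_virtualisation(sample))  # True
```

No flag doesn't mean no virtualisation - software-only products predate Intel VT - just that the hypervisor has to work harder.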

To be fair, PC virtualisation seems to be rock solid today, although Richard Garsthagen of VMware tells me that for real production work it has its own microkernel OS specialised for virtualisation, and that this will be able to take full advantage of Intel's new technologies. The trick, as people with mainframe experience will know, is to keep the platform management channels open and freely accessible (i.e. responding fast) while some virtual partition is occupying resources and misbehaving in ways you haven't anticipated.

However, the bottom line is that you might just get that new workstation for Xmas after all – and you might even need it. But I didn't say which Xmas... ®

David Norfolk is the author of IT Governance, published by Thorogood. More details here.
