SMT Xeons count double for Win2k Server licences

No gain without pain...

The introduction of Intel's new Hyper-Threading technology is providing Microsoft with a handy mechanism for getting more money from Windows 2000 Server customers, for encouraging users to switch over to Windows .NET Server, or both. It kind of depends on how .NET Server pricing pans out, but as far as Win2k Server goes it's already clear that people wanting to use the extra oomph in the new Xeons are going to have to stump up.

According to a Microsoft backgrounder available here, Win2k Server handles Hyper-Threading by using the processor count from the machine's BIOS. "For example, when you launch Windows 2000 Server (4-CPU limit) on a four-way system enabled with Hyper-Threading Technology, Windows will use the first logical processor on each of the four physical processors, as shown in Figure 2; the second logical processor on each physical processor will be unused, because of the 4-CPU license limit. (This assumes the BIOS was written according to Intel specifications. Windows uses the processor count and sequence indicated by the BIOS.)"

If you're using Win2k Server with an 8-CPU limit on the same configuration, it will use all eight logical processors. So the two logical processors on each Xeon are counted as two separate processors: a two-way box counts as a four-way, and a 4-CPU licence running on a four-way machine leaves the four second logical processors idle. This will surely cause a deal of confusion for people proposing to deploy Win2k Server on Hyper-Threading systems without having figured out the licensing implications first.
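The counting difference the backgrounder describes can be sketched as follows. This is an illustrative model only, assuming a BIOS that enumerates processors per Intel's spec; the function names are invented for the sketch, not anything from Windows itself.

```python
def bios_cpu_order(physical_count, threads_per_core=2):
    """Enumerate logical CPUs the way Intel's spec prescribes: the
    first logical processor on every physical package is listed
    before any second logical processor appears."""
    return [(package, thread)
            for thread in range(threads_per_core)
            for package in range(physical_count)]

def win2k_visible_cpus(physical_count, license_limit):
    """Win2k counts raw BIOS entries against the licence limit, so it
    simply takes the first license_limit entries in BIOS order."""
    return bios_cpu_order(physical_count)[:license_limit]

def dotnet_visible_cpus(physical_count, license_limit):
    """.NET Server counts only physical packages against the limit,
    then uses every logical processor on the licensed packages."""
    licensed = min(physical_count, license_limit)
    return [(p, t) for p, t in bios_cpu_order(physical_count)
            if p < licensed]
```

On a four-way Hyper-Threading box, `win2k_visible_cpus(4, 4)` yields only the first logical processor on each package, while `dotnet_visible_cpus(2, 2)` on a two-way box yields all four logical processors.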

Ominously, Windows .NET Server presents itself as something of a bargain here, albeit one mitigated more than a little by its not having a price tag yet, on account of its still being in beta. .NET Server can distinguish between logical and physical processors, whatever the BIOS says, and even more ominously, says the backgrounder: "This provides a powerful advantage over Windows 2000, in that Windows .NET Server only treats physical processors as counting against the license limit."

Call us bitter old cynics if you like, but when we hear Microsoft talking about powerful advantages of next generation software we hide our chequebooks. .NET Standard Server, which has a 2-CPU limit, will use all four logical CPUs in a 2-CPU Hyper-Threading system. Which is maybe nice, except that the upper limit for .NET Standard Server is two CPUs, next stop being Enterprise Server with up to eight, then Datacenter Server beyond that. Win2k Server, on the other hand, goes up to four CPUs before you wind up having to trade up to Advanced Server.

Until we have a pricing structure for .NET Server it won't be absolutely clear whether Microsoft is mainly trying to push the new software, make more money out of the same sales levels, or boost sales of Windows on commoditised twin CPU servers. But it's probably going to be a mix of all of these.

Nor is it clear how much of the difference between the way the two operating systems handle Hyper-Threading is down to technical issues and how much to bean-counting. According to Intel's BIOS spec, the first logical processor on each physical processor should be tallied off first, then the second ones counted. According to Microsoft, if this is not the case, "Windows 2000 or its applications may use logical processors when they should be using physical processors instead... Such an application will achieve better performance using two separate physical processors (such as 1 and 2) than it would using two logical processors on the same physical processor (such as 1 and 5)."
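The numbering in that quote follows from the enumeration order: in a four-way system listed per the spec, logical CPUs 1 and 5 are the two threads of the same physical package. A hypothetical one-liner (1-based numbering, invented name) makes the mapping explicit:

```python
def physical_of(logical_index, physical_count):
    """Map a 1-based logical CPU number, in Intel-spec BIOS order, to
    its 1-based physical package: the first physical_count entries are
    the first logical processor on each package, and the next
    physical_count entries are the second."""
    return ((logical_index - 1) % physical_count) + 1
```

With four physical packages, logical CPUs 1 and 5 both map to package 1, which is why a pair of threads placed there shares one processor's execution resources instead of getting two whole processors.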

If it's simply a matter of bean-counting then it would make sense (to users, anyway) for a bean-counting patch for Win2k Server to be issued, thus bringing it up to snuff with .NET Server as far as Hyper-Threading is concerned. If it's technical, then this probably wouldn't get you very far, and Microsoft is as yet uninformative on the subject.

"At the time of this publication," says the document, "there is insufficient data to generalize the performance of Windows on systems enabled with Hyper-Threading Technology. However, it is safe to assume performance will vary depending on the application, system configuration, and version of Windows that is used." Well, yes, that would seem logical, wouldn't it?

It continues: "Although Windows 2000 is compatible with Hyper-Threading Technology, we expect customers will get the best performance from Hyper-Threading Technology using Windows .NET Server. This is because the Windows .NET Server Family is engineered to take full advantage of the logical processors created by Hyper-Threading Technology. Microsoft expects to see positive performance gains with Windows .NET Server and Hyper-Threading Technology, while Windows 2000 performance gains are expected to be more modest."

In what sense, and to what extent, it is so engineered is not made clear. And we can't help noticing Microsoft did just "generalize the performance of Windows on systems enabled with Hyper-Threading Technology" there. Ahem... ®

Related stories:
No prisoners: Intel unwraps SMT Xeon
