Will Whistler be a hardware hog?

Do DLLs overwrite one another in the woods?

Microsoft has released an interim build of Whistler to testers, having knocked back the actual release of beta 1 of the software by two weeks, to 25th October. According to Paul Thurrott of WinInfo, who's had a chance to look at a copy of the build (2267), there are a few small improvements over the previous build, but no major new features. Two things do however seem particularly worth noting - there's a new policy whereby testers have to download and install code live using Passport validation, and according to Thurrott the new Whistler UI imposes a severe hardware hit.

With beta 1 going out shortly to a wider audience, the extra control over distribution of 2267 really doesn't have a great deal of significance. Microsoft has suffered from a couple of escaped builds in the recent past, but it's future test code, not Whistler, that it'll be most concerned to control.

There is however another aspect to consider. Whistler is going to be the first .NET OS, says Microsoft, and with .NET the company is beginning the move away from shrinkwrap and toward a Web-based service model (Steve Ballmer told us, so it must be true). It's obviously not going to be possible to sell a whole operating system over the Web and get users to install it via Windows Update by the time Whistler ships next year, but it seems pretty clear that the Whistler testers are going to end up testing the distribution, validation and authentication services that will be used when .NET gets more mature.

And the performance hit? The new skinnable UI is obviously still under development, so in theory it could get faster during the beta process, but in practice Microsoft's new features have a tendency to get sped up by having more hardware thrown at them rather than via code optimisation. Thurrott says he's been told Microsoft is going to be more upfront than previously about the hardware requirements for Whistler, and says he wouldn't be surprised if these included a minimum of 128MB of RAM.

But we foresee a slight problemette here. The current version of the Wintel PC2001 roadmap specifies 128MB for Win2k machines, and 64MB otherwise. But if Whistler is shipping as the successor both to Win2k and to WinME, well, is the answer 64 or 128? And shouldn't Microsoft tell the OEMs so they can plan their RAM purchases? WinInfo has quite a bit more to say about 2267, and you can read that here. ®
