Windows Server 2012: Smarter, stronger, frustrating

Perfect upgrade for punters with a passion for the obscure

Review Microsoft has released Windows Server 2012, based on the same core code as Windows 8. Yes, it has the same Start screen in place of the Start menu, but that is of little importance, particularly since Microsoft is pushing the idea of installing the Server Core edition, which has no graphical user interface (GUI). Even if you do install the GUI, Server 2012 boots into the desktop by default.

This is a big release. The server team had no need to reimagine Windows, giving them a clear run to focus on product features, not least the attempt to catch up with VMware in the virtualisation stakes with a greatly updated Hyper-V. The list of what’s new is long and tedious, but what is most significant is the way Windows Server is evolving away from its origins as a server variant of a monolithic GUI operating system.

Two key features that underpin Server 2012 are modularity and automation. Neither is yet perfect, but this release is where they start to look convincing. Evidence of modularity is that you can now move between Server Core, which has only a command prompt, and the full GUI edition by adding and removing features, whereas before you would have to reinstall.
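To give an idea of how that works in practice, switching between the full GUI and Server Core is now a matter of removing or adding the GUI features from PowerShell and rebooting. A rough sketch, using the feature names as they appear in this release (check Get-WindowsFeature on your own server):

# Drop back from the full GUI to Server Core by removing the graphical shell and management tools
Uninstall-WindowsFeature Server-Gui-Shell, Server-Gui-Mgmt-Infra -Restart

# Reinstall the same features later to bring the full GUI back
Install-WindowsFeature Server-Gui-Shell, Server-Gui-Mgmt-Infra -Restart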

There are still some odd dependencies. If you add the Application Server role to a Core installation, it requires the GUI management tools to be installed, for example. Still, improved modularity is important since it means installing only what you need, which is good for both performance and security.

Progress in automation is even more noticeable. It may be significant that the lead architect for Windows Server is Jeffrey Snover, who is also the inventor of PowerShell, Microsoft’s scripting platform for Windows administration based on the .NET Framework. PowerShell has hundreds of new Cmdlets (installable PowerShell commands), is designed to run remotely, and has a new workflow engine. There is now a full set of Cmdlets for Hyper-V.
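To give a flavour of what that means, here is a minimal sketch of creating and starting a virtual machine entirely from PowerShell, without touching Hyper-V Manager. The VM name, path and sizes are invented for illustration:

# See what the new Hyper-V module offers
Get-Command -Module Hyper-V

# Create a VM with a new virtual hard disk, then start it
New-VM -Name "TestVM" -MemoryStartupBytes 2GB -NewVHDPath "D:\VHDs\TestVM.vhdx" -NewVHDSizeBytes 60GB
Start-VM -Name "TestVM"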

PowerShell History shows you scripts generated by actions in the GUI

The new Server Manager is in many cases a wrapper for PowerShell, something that will be familiar to Exchange 2010 administrators. Better still, the Active Directory Administrative Center has a PowerShell History pane that shows you the script generated by your actions in the GUI, so that you can copy and modify it for future use. The PowerShell editor, the Integrated Scripting Environment (ISE), now supports collapsible regions and IntelliSense code completion.
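As an example of the sort of thing the History pane surfaces, creating a user through the GUI produces a command along these lines (the name and organisational unit here are hypothetical):

# Roughly what the Administrative Center records when you add a user in the GUI
New-ADUser -Name "Jane Doe" -SamAccountName "jdoe" -Path "OU=Staff,DC=example,DC=com" -Enabled $false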

Server Manager itself is completely redone in this release. It is now a tool for managing multiple servers, and you can view your server infrastructure by role as well as by server. The idea of the Metro-inspired dashboard is that green means good, while red demands attention. From Server Manager you can easily view the event logs and performance data for each server, as well as access all the management and configuration tools: adding and removing features, services, Device Manager, storage management, a PowerShell prompt and, if you need it, remote desktop.
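Underneath, much of this multi-server view amounts to PowerShell remoting, and you can do the same sort of thing yourself from the console. A simple sketch, assuming remoting is enabled and with the server names as placeholders:

# Check which roles and features are installed across several servers at once
Invoke-Command -ComputerName SERVER1, SERVER2, SERVER3 -ScriptBlock {
    Get-WindowsFeature | Where-Object Installed
}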

Green is good, red means trouble: the Server Manager

This is great stuff, but in practice old Windows enemies can still haunt the administration experience. I set up three instances of Server 2012 in a domain for testing: one physical and two virtual. One of these servers gave an error when added to Server Manager, filling it with red blotches. The error was "Cannot get event data," and I wasted some time trying to track down the cause, which turned out to be related to a DCOM (Distributed COM) error 2147944122. The detail is supremely unimportant; the point is that Windows administrators spend too much time investigating obscurities like this when they would rather be using lovely GUI management tools.

That said, most of the operations I tried with the RTM (Release to Manufacturing) build of Server 2012 have worked exactly as advertised.

Storage Spaces is a new way to manage hard drives, aimed at smaller organisations that lack the luxury of a Storage Area Network (SAN). The feature lets you define a storage pool across several physical drives, and then create virtual disks within the pool. A virtual disk can be resilient, supporting either mirroring, where data is duplicated across drives, or parity striping, which is more space-efficient but requires three or more drives.
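Storage Spaces is also fully scriptable. A minimal sketch, with the pool and disk names invented, of pooling the eligible drives and carving a parity space out of them:

# Gather the physical disks that are eligible for pooling
$disks = Get-PhysicalDisk -CanPool $true

# Create a storage pool on the Storage Spaces subsystem
New-StoragePool -FriendlyName "Pool1" -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks

# Create a resilient virtual disk using parity, which needs three or more drives
New-VirtualDisk -StoragePoolFriendlyName "Pool1" -FriendlyName "Data" -ResiliencySettingName Parity -UseMaximumSize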
