Windows Server 2012: Smarter, stronger, frustrating

Perfect upgrade for punters with a passion for the obscure

Review Microsoft has released Windows Server 2012, based on the same core code as Windows 8. Yes, it has the same Start screen in place of the Start menu, but that is of little importance, particularly since Microsoft is pushing the idea of installing the Server Core edition, which has no graphical user interface at all. If you do install a GUI, Server 2012 even boots into the desktop by default.

This is a big release. The server team had no need to reimagine Windows, giving them a clear run to focus on product features, not least the attempt to catch up with VMware in the virtualisation stakes with a greatly updated Hyper-V. The list of what’s new is long and tedious, but what is most significant is the way Windows Server is evolving away from its origins as a server variant of a monolithic GUI operating system.

Two key features that underpin Server 2012 are modularity and automation. Neither is yet perfect, but this release is where they start to look convincing. Evidence of modularity is that you can now move between Server Core, which has only a command prompt, and the full GUI edition by adding and removing features, whereas before you would have to reinstall.
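In Server 2012 the graphical shell is itself just a pair of removable features, so the conversion is a one-liner in either direction. A minimal sketch, using the feature names as they appear in the RTM build:

```powershell
# Turn a full GUI install into Server Core: remove the shell and its
# management infrastructure, then reboot (-Restart) to complete the change
Uninstall-WindowsFeature Server-Gui-Shell, Server-Gui-Mgmt-Infra -Restart

# And back again: reinstate the graphical shell on a Core installation
Install-WindowsFeature Server-Gui-Mgmt-Infra, Server-Gui-Shell -Restart
```

Either direction requires a reboot, but crucially no reinstall and no loss of configured roles.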

There are still some odd dependencies. If you add the Application Server role to a Core installation, it requires the GUI management tools to be installed, for example. Still, improved modularity is important since it means installing only what you need, which is good for both performance and security.

Progress in automation is even more noticeable. It may be significant that the lead architect for Windows Server is Jeffrey Snover, who is also the inventor of PowerShell, Microsoft's scripting platform for Windows administration based on the .NET Framework. PowerShell has hundreds of new cmdlets (the self-contained commands from which PowerShell scripts are built), is designed to run remotely, and has a new workflow engine. There is now a full set of cmdlets for Hyper-V.
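The practical upshot is that a Hyper-V host can now be driven entirely from the prompt. A brief illustration using cmdlets from the shipping Hyper-V module (the VM name and paths here are invented for the example):

```powershell
# List everything the new Hyper-V module exposes
Get-Command -Module Hyper-V

# Create a VM with a fresh 40GB virtual disk, then start it
New-VM -Name "Test01" -MemoryStartupBytes 1GB `
    -NewVHDPath "C:\VMs\Test01.vhdx" -NewVHDSizeBytes 40GB
Start-VM -Name "Test01"
```

Previously this sort of thing required WMI scripting or third-party modules; now it is in the box.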

PowerShell History shows you scripts generated by actions in the GUI

The new Server Manager is in many cases a wrapper for PowerShell, something that will be familiar to Exchange 2010 administrators. Better still, the Active Directory Administrative Center has a PowerShell History pane that shows you the script generated by your actions in the GUI, so that you can copy and modify for future actions. The PowerShell editor, the Integrated Scripting Environment, now supports collapsible regions and IntelliSense code completion.
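The history pane matters because GUI actions become reusable scripts for free. As a hedged illustration, creating a user in the Administrative Center surfaces something like the following (the names and OU path are made up; `New-ADUser` and its parameters are part of the ActiveDirectory module):

```powershell
# The sort of one-liner the history pane captures when you create a user
# in the GUI - copy it out, parameterise it, and it becomes a bulk-import script
New-ADUser -Name "Jane Doe" -SamAccountName "jdoe" `
    -Path "OU=Staff,DC=example,DC=com" -Enabled $false
```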

Server Manager itself is completely redone in this release. It is now a tool for managing multiple servers, and you can view your server infrastructure by role as well as by server. The idea of the Metro-inspired dashboard is that green means good, while red demands attention. From Server Manager you can easily view the event logs and performance data for each server, as well as access all the management and configuration tools: adding and removing features, services, device manager, storage management, a PowerShell prompt and, if you need it, remote desktop.
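Most of what the dashboard shows has a command-line equivalent, which matters when you are managing servers that have no GUI at all. A sketch, assuming a remote machine named SERVER2 with remote management enabled:

```powershell
# Query which roles and features are installed on a remote server
Get-WindowsFeature -ComputerName SERVER2 | Where-Object Installed

# Read its System event log without leaving your desk
Get-WinEvent -ComputerName SERVER2 -LogName System -MaxEvents 10
```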

Green is good, red means trouble: the Server Manager

This is great stuff, but in practice old Windows enemies can still haunt the administration experience. I set up three instances of Server 2012 in a domain for testing: one physical and two virtual. One of these servers gives an error when added to Server Manager, filling it with red blotches. The error is “Cannot get event data,” and I wasted some time trying to find the reason for the problem. It is related to a DCOM (Distributed COM) error 2147944122. The detail of this is supremely unimportant; the point is that Windows administrators spend too much time investigating obscurities like this when they would rather be using lovely GUI management tools.

That said, most of the operations I tried with the RTM (Release to Manufacturing) build of Server 2012 have worked exactly as advertised.

Storage Spaces is a new way to manage hard drives, aimed at smaller organisations that lack the luxury of a Storage Area Network (SAN). The feature lets you define a storage pool across several physical drives, and then create virtual disks within the pool. A virtual disk can be resilient, supporting either mirroring – where data is duplicated across two or more drives – or parity striping, which uses space more efficiently but requires three or more drives.
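Storage Spaces, too, is fully scriptable via the new storage cmdlets. A minimal sketch of the pool-then-virtual-disk workflow described above (the pool and disk names are arbitrary, and the subsystem wildcard assumes the default "Storage Spaces" subsystem):

```powershell
# Gather every blank disk that is eligible for pooling
$disks = Get-PhysicalDisk -CanPool $true

# Create the pool across those physical drives
New-StoragePool -FriendlyName "Pool1" `
    -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $disks

# Carve out a resilient virtual disk: Mirror needs two drives, Parity three or more
New-VirtualDisk -StoragePoolFriendlyName "Pool1" -FriendlyName "Data" `
    -ResiliencySettingName Mirror -Size 100GB
```

The virtual disk then appears to Windows as an ordinary disk to be initialised, partitioned and formatted.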
