Windows NT: Remember Microsoft's almost perfect 20-year-old?

It all went a bit pear-shaped later... but it DID stop people switching to OS/2

Feature If you want to be reminded that you're getting old, ask a youngster what Windows NT is. Chances are, there'll be blank looks all round. Windows What? Is it, like, a codename for a new version?

You can't blame them. There hasn't actually been a proper "Windows NT" release since the late 1990s, so for almost anyone under 30 it's an anachronism. I've checked. For anyone old enough to remember the OS wars of 1990 to 1995, Windows 8, 7, Vista, XP are still "NT", no matter what the Microsoft marketing department calls it.

NT, first released in 1993, really has four phases in its history: the FUD phase, before it was launched; the brief few years when it was almost perfect - and nobody used it; then a long period of mismanagement and decline; and then, more recently, the WinMin and Metro era. I'd venture that the first two were the most important, and I got a closer look at it than most.

Two decades ago, in another life, I was beta-testing NT months before it was released, for the mighty Digital Equipment Corporation (DEC). Or I should say, the "still-quite-mighty-but-falling-fast" DEC. My employer, a scientific instruments supplier, had made DEC lots of money from its clever VAX-based kit, and so was treated to software and hardware long before it became public. This workstation had the new Alpha chip, which everybody knew was the dog's bollocks. Alpha and NT: this looked like the future. It was a shame that for weeks the lack of a keyboard driver meant it could only boot to a BSOD.

In 1992 NT was, for any professional developer, the mighty juggernaut that you couldn't avoid. You'd been hip to Unix-flavoured systems at college, or maybe even taught yourself how to use them in your own time. But NT threatened all that hard-won know-how.

In the unwritten taxonomy of technology companies, Microsoft was still firmly in the comedy category for many of us. Evidently driven by marketing rather than technical excellence, for years it had coat-tailed on much bigger outfits – first IBM and then others, via 1991's short-lived and speculative Advanced Computing Environment consortium, comprising Compaq, Microsoft, MIPS Computer Systems, DEC and the Santa Cruz Operation.

Windows

Microsoft's position owed everything to a ludicrous but ubiquitous business software product: MS-DOS. And yet Microsoft had contrived, by 1987, with Windows 2.0, to take this primitive OS – barely an OS, really – and make it even slower and buggier.

Microsoft had portfolio breadth but not quality. There was a clutch of so-so applications and so-so development tools. Enthusiasts preferred Borland's Turbo products while pros who wanted performance opted for Watcom's C compiler, the fastest out there. Microsoft was universally seen as holding the industry back. At least, that was the received wisdom among my *ix-savvy brethren.

But we all saw how the Great Industry Powers squabbled over the Unix world and created a great vacuum, and NT threatened to fill that gap. Unix was then (and still is) an idea almost anyone can implement. And lots of people do. We knew this first-hand, having spent considerable time ensuring our builds on HP-UX, Ultrix, OSF/1 and AIX all succeeded on various bits of hardware. Every developer knew their processor endians. (The Register's first slogan in 1994 was "the only good endian is a dead endian".)
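For younger readers who never had to care: byte order determines how a multi-byte integer is laid out in memory, and it differed across the hardware of the day (Intel was little-endian, while chips such as classic MIPS and PA-RISC ran big-endian). A quick, purely illustrative check in Python shows the difference:

```python
import struct
import sys

# How does this machine lay out a 32-bit integer in memory?
value = 0x01020304

big    = struct.pack(">I", value)   # big-endian: most significant byte first
little = struct.pack("<I", value)   # little-endian: least significant byte first
native = struct.pack("=I", value)   # whatever this CPU actually does

print(big)                 # b'\x01\x02\x03\x04'
print(little)              # b'\x04\x03\x02\x01'
print(sys.byteorder)       # 'little' on Intel-style hardware
print(native == little)    # True on a little-endian machine
```

Code that wrote raw structs to disk or the network on one architecture and read them back on another would silently see its bytes reversed, which is exactly the class of cross-platform bug the porting work above kept turning up.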

My, how this made tracking down bugs so much more fun. Honestly. But worst of all was the time required to produce something workable when a squabbling Unix industry couldn't. Developers today use Qt to create sophisticated cross-platform applications – such as Skype or Google Earth – that work nicely across Linux, Mac and Windows. That's because the target platforms are themselves rich, mature and (generally) stable. But back then, even attempting a basic GUI that worked "cross-Unix" was difficult, and the end result might as well have been modem noise piped into a frame-buffer.

Windows NT 4 login screen

Nostalgia ... Who else immediately played the boot-up jingle in their heads upon seeing this?

With warring factions unable to agree on standards and interfaces, the lowest common denominators were (no pun intended) primitive. The X Window system and the most basic X toolkits spawned thousands of pages of documentation – which incidentally gave "technology-transfer" kingpin Tim O'Reilly his big break – merely to create a simple widget, such as a tickbox. Each "cure" was a design-by-committee atrocity.

One API to rule them all

NT was built to be scalable, processor-independent and reasonably secure, with a rich GUI. And it had one API to rule them all, which meant everyone could see what NT could offer.

Every major industry vendor bar Sun promised a port; it would run on Intel, MIPS, PowerPC, PA-RISC and Alpha. Microsoft published the Win32 specs in the early summer of 1992, a kind of starting pistol. One senior Linux figure today told me that by 1993, every crack Unix dev he knew in the Bay Area was secretly cribbing up on the Win32 APIs in Windows NT. I have no trouble believing him; it was no different in the UK. And Windows NT promised to run everywhere.

Microsoft had never come up with anything "grown-up" before, and spent much of the time alluding to the fact it was "VMS improved" (referring to DEC's VMS operating system). They even invented some retrospective mythology - that if you added the next letter in the alphabet to each letter in the acronym VMS (Virtual Memory System), you would have WNT. In fact, NT owes its name to the code name for the chip it was designed on: "N-Ten", the nickname of the Intel i860 XR processor. The mythology was invented to impress journalists. (Nowadays, it means "New Technology".)
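The letter-shift in the myth is easy to verify: advance each letter of "VMS" one place along the alphabet (V→W, M→N, S→T) and out pops "WNT". A throwaway sketch, purely for amusement:

```python
# The retrospective "VMS + 1" myth: shift each letter of the
# acronym one place along the alphabet.
def shift_one(s: str) -> str:
    return "".join(chr(ord(c) + 1) for c in s)

print(shift_one("VMS"))  # WNT
```

The same trick famously "explains" HAL→IBM in 2001: A Space Odyssey, shifted the other way, which is perhaps where the myth-makers got the idea.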

Six of the original seven NT engineers were VMS architects, but most of NT had nothing to do with VMS daddy Dave Cutler's kernel team - and instead featured layers of code ripped wholesale from Windows and OS/2. Architectural compromises would take it a long way from VMS.

As I wrote here, recalling the OS wars, the main value of NT in its first few years was as a propaganda bunker-buster. From 1992 to 1994 it was used to stop people switching to Unix or OS/2. There were barely any applications. Performance on Intel chips was, to put it kindly, "stately".
