
Make way for the 64 bit revolution

Let the memory live again


Cast your mind back 15 years, if you can. Believe it or not, that was when the IT industry suddenly started to get excited about 64 bit computing.

IBM (in conjunction with Apple and Motorola) had already embarked on the design of the 64 bit PowerPC chip, and Digital had announced the 64 bit Alpha chip (you remember Digital, surely? It was eaten by Compaq, which was in turn eaten by HP). With the advent of the (now defunct) Alpha chip, commentators everywhere were trumpeting the dawn of 64 bit computing.

Well, it has been one of the longest dawns ever. At the time, Intel didn't care about 64 bit computing and neither did the PC industry. It took Intel about another 10 years to care, and even then it seemed to be pushed into it by competition from AMD. Nevertheless, it happened in the end, and now we're all living in a 64 bit world...or are we?

Where 64 bits mattered

We entered a 64 bit world 15 years ago, but the simple fact was that it made no difference to most applications. The 64 bits, if you didn't know, refers to the width of the memory addresses the processor uses. A 64 bit chip has 64 bit registers and a 64 bit data path, which means it can address a vast amount of memory.

How much memory?

In the region of 18 Exabytes. An Exabyte is a billion gigabytes, and 18 Exabytes is greater than the sum total of all the world's data (which is a mere eight Exabytes or so).

But so what? 32 bits was enough to address four gigabytes of data (in practice, two gigabytes per process on many operating systems, which reserve half the address space for the kernel), and very few applications needed to address that much. The other advantage of a 64 bit processor is that it does 64 bit arithmetic a great deal faster, which is useful in heavyweight scientific applications but of little use on the PC.

And if you don't actually need 64 bits, there's a penalty for using them. The vast majority of applications perform better compiled as 32 bit, because pointers and executable code are smaller, so more of both fit in the chip's cache. 64 bit performance can actually be poorer.

But 64 bits makes a big performance difference if you need to directly address more than two gigabytes of information, a need that is very common in data warehouse applications, where terabytes of data are under management. Indeed, IBM's iSeries and pSeries had a big advantage in this kind of application for a while because of their 64 bit capabilities. The major database products moved to 64 bit implementations very quickly once 64 bit servers were available, and scientific computing moved quickly in that direction too.

Today's 64 bit applications

The reason for writing this article is to point out that the IT industry is rapidly arriving at the point where 64 bits will make a difference to many things...enough to drag us all into a 64 bit world. First consider video. An hour of video at a reasonable resolution (for a PC or Mac) will usually occupy more than a gigabyte. If it is HD (high definition) then it will occupy about four times as much space. So, if you want to manipulate video in any way or just address part of a video file, 64 bits suddenly makes a big difference.

Now think of the PC or Mac you might buy soon. If it runs Windows Vista then think in terms of two gigabytes of memory, and if you're doing anything graphical at all on the Mac then two gigabytes is the base requirement (in my opinion). The personal computer has crossed the 64 bit line.

Now think of computer grids, especially loosely coupled ones which might be assembled on the fly. Such a grid will be managed much more effectively with 64 bit addressing. Now think of managing a large corporate network as though it were a single computer. The simple fact is that it will be far more effective with a single addressing scheme that can apply to the whole network.

A 64 bit revolution

There may well be a genuine 64 bit revolution ahead. Just combine 64 bit addressing with the fact that memory is gradually replacing disk as the natural place to store online information and you have an architectural revolution in the offing.

The truth is that such a revolution began quite a while ago with the idea of virtualised operating environments and the separation of disk resources as NAS or SANs. But it hasn't yet got to the point where the industry is thinking in terms of memory based architectures. This will happen, and it is likely to happen soon.

And it will be a good thing too, making streaming applications and database applications far more efficient than they currently are. It's odd to think about it this way, but nearly all the applications we run are built on the assumption that the primary copy of the data is held on a spinning disk. Pretty soon all such applications will be legacy applications.

Copyright © 2007, IT-Analysis.com
