Make way for the 64 bit revolution

Let the memory live again

Cast your mind back 15 years, if you can. Believe it or not, that was when the IT industry suddenly started to get excited about 64 bit computing.

IBM (in conjunction with Apple and Motorola) had already embarked on the design of the 64 bit PowerPC chip, and Digital had announced the 64 bit Alpha chip (you remember Digital, surely; it was eaten by Compaq, which was eaten in turn by HP). With the advent of the (now defunct) Alpha chip, commentators everywhere were trumpeting the dawn of 64 bit computing.

Well, it has been one of the longest dawns ever. At the time Intel didn't care about 64 bit computing, and neither did the PC industry. It took Intel about another 10 years to care about 64 bits, and even then it seemed to be pushed into it by competition from AMD. Nevertheless, it happened in the end and now we're all living in a 64 bit world...or are we?

Where 64 bits mattered

We entered a 64 bit world 15 years ago, but the simple fact was that it made no difference to most applications. The 64 bits, if you didn't know, refers to the width of the memory addresses that the chip's instructions use. A 64 bit chip has 64 bit registers and a 64 bit data path, which means it can address a vast amount of memory.

How much memory?

In the region of 18 Exabytes. An Exabyte is a billion gigabytes, and 18 Exabytes is greater than the sum total of all the world's data (which is a mere eight Exabytes or so).
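
If you fancy checking that arithmetic, the sketch below does the sums (assuming, as on ordinary hardware, that each address refers to one byte):

```python
# Back-of-the-envelope check of the address-space figures above.
# Assumes byte-addressable memory: one address per byte.

GIGABYTE = 10**9           # a billion bytes
EXABYTE = 10**18           # a billion gigabytes

addressable_32 = 2**32     # bytes reachable with a 32 bit address
addressable_64 = 2**64     # bytes reachable with a 64 bit address

print(f"32 bit: about {addressable_32 / GIGABYTE:.1f} gigabytes")   # ~4.3 GB
print(f"64 bit: about {addressable_64 / EXABYTE:.1f} Exabytes")     # ~18.4 EB
```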

But so what? 32 bits was enough to address four gigabytes of data (of which most operating systems let a single process use only two or three), and very few applications needed to address that much. The other advantage of a 64 bit processor is that it does 64 bit arithmetic a great deal faster, which is useful in heavyweight scientific applications but of little use on the PC.

And if you don't actually need 64 bits, there's a penalty for using it. The vast majority of applications perform better compiled as 32 bit because pointers and executable code are smaller, so more of both fit in the chip's caches. So 64 bit performance can actually be poorer.
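
You can see the underlying size difference from any Python prompt; this small sketch simply reports the pointer width of whichever interpreter build it runs under (64 bits on a 64 bit build, 32 on a 32 bit one):

```python
# Report the pointer width of the running interpreter. Pointers (and hence
# many in-memory data structures) are twice the size in a 64 bit build,
# which is the cache-pressure penalty described above.
import struct

pointer_bytes = struct.calcsize("P")   # size of a native pointer, in bytes
print(f"{pointer_bytes * 8} bit pointers ({pointer_bytes} bytes each)")
```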

But 64 bits makes a big performance difference if you need to directly address more than a few gigabytes of information, and that need is very common in data warehouse applications where terabytes of data are under management. Indeed, IBM's iSeries and pSeries had a big advantage in this kind of application for a while because of their 64 bit capabilities. The major database products moved to 64 bit implementations very quickly once 64 bit servers were available, and scientific computing moved quickly in that direction too.

Today's 64 bit applications

The reason for writing this article is to point out that the IT industry is rapidly arriving at the point where 64 bits will make a difference to many things...enough to drag us all into a 64 bit world. First consider video. An hour of video at a reasonable resolution (for a PC or Mac) will usually occupy more than a gigabyte. If it is HD (high definition) then it will occupy about four times as much space. So, if you want to manipulate video in any way or just address part of a video file, 64 bits suddenly makes a big difference.
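
To put rough numbers on that (the bitrates below are assumptions chosen purely for scale, not measurements), a couple of hours of HD already overflows what a 32 bit address can reach directly:

```python
# Rough illustration of video sizes versus the 32 bit addressing limit.
# The bitrates are assumed round numbers, used only to show the scale.

SD_BITRATE = 2_500_000      # bits per second: "reasonable resolution"
HD_BITRATE = 10_000_000     # bits per second: roughly four times the SD figure
SECONDS_PER_HOUR = 3600

def size_gb(bitrate, hours):
    """Recording size in (decimal) gigabytes."""
    return bitrate * SECONDS_PER_HOUR * hours / 8 / 10**9

limit_32bit_gb = 2**32 / 10**9   # ~4.3 GB directly addressable with 32 bits

for hours in (1, 2, 4):
    print(f"{hours}h video: SD ~{size_gb(SD_BITRATE, hours):.1f} GB, "
          f"HD ~{size_gb(HD_BITRATE, hours):.1f} GB "
          f"(32 bit limit ~{limit_32bit_gb:.1f} GB)")
```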

Now think of the PC or Mac you might buy soon. If it runs Windows Vista then think in terms of two gigabytes of memory, and if you're doing anything graphical at all on the Mac then two gigabytes is the base requirement (in my opinion). The personal computer has crossed the 64 bit line.

Now think of computer grids, especially loosely coupled ones which might be assembled on the fly. Such a grid will be managed much more effectively with 64 bit addressing. Now think of managing a large corporate network as though it were a single computer. The simple fact is that it will be far more effective with a single addressing scheme that can apply to the whole network.

A 64 bit revolution

A genuine 64 bit revolution may well be about to occur. Just combine 64 bit addressing with the fact that memory is gradually replacing disk as the natural place to store online information and you have an architectural revolution in the offing.

The truth is that such a revolution began quite a while ago with the idea of virtualised operating environments and the separation of disk resources into NAS and SANs. But it hasn't yet got to the point where the industry is thinking in terms of memory based architectures. This will happen, and it is likely to happen soon.

And it will be a good thing too, making streaming applications and database applications far more efficient than they currently are. It's odd to think about it this way, but nearly all the applications we run are built on the assumption that the primary copy of the data is held on a spinning disk. Pretty soon all such applications will be legacy applications.
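
For a concrete taste of what a memory-based application looks like, here is a minimal sketch (the file name and its size are hypothetical) that maps a large file straight into the process's virtual address space with mmap and then treats it as ordinary memory; mapping anything this big in one go is only practical with 64 bit addressing:

```python
# Minimal sketch: address a large file as memory instead of streaming it
# from disk. The file name and the 8 GB size are hypothetical.
import mmap
import os

PATH = "big_dataset.bin"
SIZE = 8 * 10**9            # 8 GB: far more than a 32 bit process could map

# Create a file of the required size (sparse on most filesystems,
# so no real disk blocks are written yet).
with open(PATH, "wb") as f:
    f.truncate(SIZE)

with open(PATH, "r+b") as f:
    # Map the whole file into the process's virtual address space.
    # On a 32 bit system this call would simply fail: there is no room
    # for a contiguous 8 GB mapping. With 64 bit addressing it is routine.
    view = mmap.mmap(f.fileno(), 0)

    # The file's contents can now be read and written like ordinary memory;
    # the operating system pages data between RAM and disk behind the scenes.
    view[0:5] = b"hello"
    print(view[0:5])        # b'hello'

    view.close()

os.remove(PATH)
```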

Copyright © 2007, IT-Analysis.com
