Make way for the 64 bit revolution

Let the memory live again

Cast your mind back 15 years, if you can. Believe it or not, that was when the IT industry suddenly started to get excited about 64 bit computing.

IBM (in conjunction with Apple and Motorola) had already embarked on the design of the 64 bit PowerPC chip and Digital announced the 64 bit Alpha chip (you remember Digital, surely. It was eaten by Compaq, which was eaten in turn by HP). With the advent of the (now defunct) Alpha chip, commentators everywhere were trumpeting the dawn of 64 bit computing.

Well, it has been one of the longest dawns ever. At the time Intel didn't care about 64 bit computing and the PC industry didn't either. It took Intel about another 10 years to care about 64 bits and even then it seemed to be pushed into it by competition from AMD. But, nevertheless, it happened in the end and now we're all living in a 64 bit world...or are we?

Where 64 bits mattered

We entered a 64 bit world 15 years ago, but the simple fact was that it made no difference to most applications. The 64 bits, if you didn't know, refers to the width of the memory addresses that the chip's instructions use. A 64 bit chip has 64 bit registers and a 64 bit data path, which means it can address a vast amount of memory.

How much memory?

In the region of 18 Exabytes. An Exabyte is a billion gigabytes and 18 Exabytes is greater than the sum total of all the world's data (which is a mere eight Exabytes or so).

But so what? 32 bits was enough to address four gigabytes of data (in practice, two gigabytes per process on most operating systems of the day) and very few applications needed to address that much data. The other advantage of a 64 bit processor is that it does 64 bit arithmetic a great deal faster, which is useful in heavyweight scientific applications, but of little use on the PC.

And if you don't actually need 64 bits, there's a penalty for using it. The vast majority of applications perform better compiled as 32 bit because the executable files and pointers are smaller, so more code and data fit in the chip cache. So 64 bit performance can actually be poorer.

But 64 bits makes a big performance difference if you need to directly address more than two gigabytes of information, which is very common in data warehouse applications where terabytes of data are under management. Indeed, IBM's iSeries and pSeries had a big advantage in this kind of application for a while because of their 64 bit capabilities. The major database products moved to having 64 bit implementations very quickly once 64 bit servers were available. Scientific computing moved quickly in that direction too.

Today's 64 bit applications

The reason for writing this article is to point out that the IT industry is rapidly arriving at the point where 64 bits will make a difference to many things...enough to drag us all into a 64 bit world. First consider video. An hour of video at a reasonable resolution (for a PC or Mac) will usually occupy more than a gigabyte. If it is HD (high definition) then it will occupy about four times as much space. So, if you want to manipulate video in any way or just address part of a video file, 64 bits suddenly makes a big difference.

Now think of the PC or Mac you might buy soon. If it runs Windows Vista then think in terms of two gigabytes of memory, and if you're doing anything graphical at all on the Mac then two gigabytes is the base requirement (in my opinion). The personal computer has crossed the 64 bit line.

Now think of computer grids, especially loosely coupled ones which might be assembled on the fly. Such a grid will be managed much more effectively with 64 bit addressing. Now think of managing a large corporate network as though it were a single computer. The simple fact is that it will be far more effective with a single addressing scheme that can apply to the whole network.

A 64 bit revolution

There is, quite possibly, a genuine 64 bit revolution that is likely to occur. Just combine 64 bit addressing with the fact that memory is gradually replacing disk as the natural place to store online information and you have an architectural revolution in the offing.

The truth is that such a revolution began quite a while ago with the idea of virtualised operating environments and the separation of disk resources as NAS or SANs. But it hasn't yet got to the point where the industry is thinking in terms of memory based architectures. This will happen, and it is likely to happen soon.

And it will be a good thing too, making streaming applications and database applications far more efficient than they currently are. It's odd to think about it this way, but nearly all the applications we run are built on the assumption that the primary copy of the data is held on a spinning disk. Pretty soon all such applications will be legacy applications.

Copyright © 2007, IT-Analysis.com
