How von Neumann still controls the desktop

Putting the business first

Workshop When John von Neumann first wrote up his notes on the logical design of the EDVAC computer during a train journey to Los Alamos in 1945, it is unlikely that he fully appreciated the impact they would have.

For all their complexity – cores and threads, caches and bus architectures – modern computers still follow what is known (no doubt to the consternation of his colleagues, whose names were excluded from the report) as the von Neumann architecture. Simply put, one element of the processor deals with arithmetic operations, another controls what happens, and some memory or storage capability is shared between the two.
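
To make the principle concrete, here is a minimal sketch of a von Neumann machine in Python – the three-instruction set is invented for illustration, but the essential point stands: a single shared memory holds both the program and its data, with the control logic fetching from it and the arithmetic logic operating on it.

```python
# A toy von Neumann machine: program and data share one memory.
# The LOAD/ADD/STORE/HALT instruction set is invented for illustration.
memory = [
    ("LOAD", 8),     # 0: accumulator <- memory[8]
    ("ADD", 9),      # 1: accumulator += memory[9]
    ("STORE", 10),   # 2: memory[10] <- accumulator
    ("HALT", None),  # 3: stop
    None, None, None, None,
    2,               # 8: first operand (data)
    3,               # 9: second operand (data)
    0,               # 10: result lands here
]

def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, addr = memory[pc]      # control unit: fetch and decode
        pc += 1
        if op == "LOAD":           # arithmetic unit: execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return

run(memory)
print(memory[10])  # 5 – the sum, written back into the same shared memory
```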

It’s an important point that, while desktop computers are small, sleek and designed for use by a single individual, and server computers tend to be rack-mounted and hidden away in equipment rooms, they both do pretty much the same thing. This principle is at the heart of a fundamental trade-off when it comes to designing IT hardware architectures – the processing and storage have to take place somewhere. If the workload is not going to run on the desktop, it is going to take up CPU cycles and disk capacity on the server, and the requisite information needs to get from one to the other in some way.
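
As a trivial illustration of that trade-off (the server endpoint below is hypothetical), the same piece of work can spend the client's cycles or the server's – but it has to spend someone's, and the data has to cross the wire in one direction or the other:

```python
# The same workload placed on either side of the wire.
# The endpoint URL is hypothetical, for illustration only.
import urllib.request

def word_count_locally(text: str) -> int:
    return len(text.split())       # spends client CPU cycles

def word_count_on_server(text: str) -> int:
    # Spends server CPU cycles instead; the text goes up, the answer comes back.
    req = urllib.request.Request("https://example.com/wordcount",
                                 data=text.encode("utf-8"))
    with urllib.request.urlopen(req) as resp:
        return int(resp.read())
```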

This may sound obvious, but it’s a principle that is frequently forgotten. In the heyday of 3D environments such as Second Life, for example, one anecdote doing the rounds was that a single avatar could consume up to a whole server’s worth of resources. While these discussions have gone by the wayside (as, for many, has Second Life), the fact that it caused a stir at all illustrates the ‘out of sight, out of mind’ attitude that can be associated with server-side processing.

Of course, when it comes to deciding whether something should run on a server or on a desktop/laptop client, processing power and data storage are not the only criteria to be balanced. Other factors include the following (a rough scoring sketch follows the list):

• The level of security required on the client, in that it is much easier to provide a locked-down environment if the majority of processing (and hence data storage) takes place server-side.

• Networking considerations, both in terms of available bandwidth and network reliability – if either is poor, it makes more sense to load up a more powerful client.

• Management considerations, in terms of both the centralised monitoring and control of applications being run, and the flexibility to be provided to the user for configuring their own desktop.

• Relative costs – for example, the cost of bandwidth can vary depending on where the client is located at any given time.
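
None of these factors carries a standard weighting, but a toy scoring sketch shows how they might be balanced per user group – the factor names and weights below are ours, invented purely for illustration:

```python
# A toy decision aid – the factors and weights are invented for illustration.
# Positive scores lean towards thin clients (server-side), negative towards thick.

def recommend(needs_lockdown: bool, bandwidth_mbps: float,
              network_reliable: bool, wants_user_configurability: bool) -> str:
    score = 0
    score += 2 if needs_lockdown else 0          # security favours server-side
    score += 1 if bandwidth_mbps >= 10 else -2   # poor links favour a fat client
    score += 1 if network_reliable else -2
    score += -1 if wants_user_configurability else 1
    return "thin client (server-side)" if score > 0 else "thick client (local)"

print(recommend(needs_lockdown=True, bandwidth_mbps=50,
                network_reliable=True, wants_user_configurability=False))
# -> thin client (server-side)
print(recommend(needs_lockdown=False, bandwidth_mbps=2,
                network_reliable=False, wants_user_configurability=True))
# -> thick client (local)
```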

Given such factors, it’s unlikely that any organisation can arrive at a single desktop configuration that suits all types of users at an appropriate cost. There’s a wealth of options available today, from various flavours of virtualisation (virtual desktop infrastructure, session virtualisation and application streaming, for example) to browser-based interfaces onto in-house and hosted applications. As smartphones get smarter, and new form factors such as netbooks and tablets start to emerge, the range of options increases still further. As a result, it can become quite bewildering to decide where workloads and data should actually reside, whether or not direct user intervention is required.

Faced with this broadening catalogue of possibilities, our advice is quite simple: start with business users, their needs and the constraints they face. Users tend to fall into a reasonably small set of categories depending on their jobs, their working practices and constraints – for example, whether they work from home or in the office, whether they handle sensitive data and so on. We’ve used such categorisations as the basis for data cuts in our research, and we know from experience just how useful they can be when it comes to identifying better-bounded groups whose needs can be dealt with specifically.
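
Such a categorisation need not be sophisticated. As a sketch (the attributes and category names here are invented, not a standard taxonomy), a few rules over basic user attributes go a long way:

```python
# A toy user categorisation – attributes and category names are invented.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    works_from_home: bool
    handles_sensitive_data: bool
    travels: bool

def categorise(user: User) -> str:
    if user.handles_sensitive_data:
        return "locked-down"   # favours server-side processing
    if user.travels:
        return "mobile"        # favours a capable, offline-friendly client
    if user.works_from_home:
        return "remote"        # bandwidth and VPN capacity matter most
    return "office"            # the standard desktop build

for u in [User("alice", False, True, False), User("bob", True, False, False)]:
    print(u.name, "->", categorise(u))
```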

While user categorisation offers a starting point, the second ‘gotcha’ concerns future-proofing. The law of unintended usage models comes into play here – from my days as an IT manager, I am quite familiar with thinking, “But that’s not how it is supposed to be used!” Sometimes this may be due to users doing things without thinking about the consequences – we heard one story, for example, about a user streaming catch-up TV shows over their VPN link onto their virtual desktop, clogging up both network bandwidth and server resources.
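
The arithmetic behind that anecdote is sobering. The figures below are entirely illustrative, but they show how few unplanned video streams it takes to saturate a link sized on modest per-user assumptions:

```python
# Back-of-envelope sizing – all figures are assumed for illustration.
vpn_link_mbps = 100          # shared VPN concentrator capacity
per_user_allowance_mbps = 1  # what the desktop sizing assumed per user
stream_mbps = 5              # one HD catch-up TV stream

users = vpn_link_mbps / per_user_allowance_mbps    # 100 users as planned
streams_to_saturate = vpn_link_mbps / stream_mbps  # just 20 streams fill it

print(f"Planned for {users:.0f} users; {streams_to_saturate:.0f} "
      f"video streams would saturate the same link.")
```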

It would be all too easy to size a desktop environment for a given set of users who appear to have relatively modest processing or networking requirements, only to find that it quickly becomes inadequate. On other occasions, it can be parallel initiatives – rolling out unified communications technologies, for example – that prove too much for the architecture as defined. In either case, it will be the help desk that suffers, so it is worth working through a few scenarios and keeping tabs on other projects to ensure that the potential for such risks is minimised.

We may still be reliant on the von Neumann architecture when it comes to computing. Whether or not this changes, and whatever compute models emerge in the future, the advice to put the business first will remain. ®
