How von Neumann still controls the desktop

Putting the business first


Workshop When John von Neumann first wrote up his notes on the logical design of the EDVAC computer during a train journey to Los Alamos in 1945, it is unlikely that he fully appreciated the impact they would have.

For all their complexity – cores and threads, caches and bus architectures – modern computers still follow what is known (no doubt to the consternation of his colleagues, whose names were excluded from the report) as the von Neumann architecture. Simply put, one element of the processor deals with arithmetic operations, another controls what happens, and some memory or storage capability is shared between the two.
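
To make the model concrete, here is a minimal sketch in Python of the fetch-decode-execute cycle that still underlies that architecture. The four-instruction machine is invented purely for illustration; the point to note is that instructions and data live in the same memory:

    # A toy von Neumann machine: one control element (the program
    # counter and dispatch loop), one arithmetic element (the
    # accumulator), and a single memory shared by code and data.
    memory = [
        ("LOAD", 6),   # copy memory[6] into the accumulator
        ("ADD", 7),    # add memory[7] to the accumulator
        ("STORE", 8),  # write the accumulator back to memory[8]
        ("HALT", 0),
        0, 0,          # unused padding
        2, 3, 0,       # data: operands at 6 and 7, result at 8
    ]

    acc = 0  # the arithmetic element's single register
    pc = 0   # the control element's program counter

    while True:
        op, addr = memory[pc]  # fetch from the same store that holds data
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            break

    print(memory[8])  # 2 + 3 -> prints 5

The loop fetches each instruction from the very memory it reads and writes data to – the shared store that defines the design.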

It’s an important point that, while desktop computers are small, sleek and designed for use by a single individual, and server computers tend to be rack-mounted and hidden away in equipment rooms, they both do pretty much the same thing. This principle is at the heart of a fundamental trade-off in designing IT hardware architectures: the processing and storage have to take place somewhere. If the workload is not on the desktop, it is going to take up CPU cycles and disk capacity on the server, and the requisite information needs to get from one to the other somehow.

This may sound obvious, but it’s a principle that is frequently forgotten. In the heyday of 3D environments such as Second Life, for example, one anecdote doing the rounds was that a single avatar could consume up to a whole server’s worth of resources. While these discussions have gone by the wayside (as, for many, has Second Life), the fact that it caused a stir at all illustrates the ‘out of sight, out of mind’ attitude that can surround server-side processing.

Of course, when deciding whether something should run on a server or on a desktop/laptop client, processing power and data storage are not the only criteria to balance. Other factors include the following (a rough worked example of weighing them follows the list):

• The level of security required on the client – it is much easier to provide a locked-down environment if the majority of processing (and hence data storage) takes place server-side.

• Networking considerations, both in terms of available bandwidth and network reliability – if either is poor, it makes more sense to load up a more powerful client.

• Management considerations – both the centralised monitoring and control of the applications being run, and the flexibility given to users to configure their own desktops.

• Relative costs – the cost of bandwidth, for example, can vary depending on where the client is located at any point in time.
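
As a hypothetical worked example, the trade-off can be framed as a weighted scoring exercise. The factors, weights and scores below are purely illustrative, not a recommendation:

    # Each factor carries a weight and a score out of 5 for how well
    # a client-heavy and a server-heavy design serve it. All figures
    # here are invented for illustration.
    factors = {
        # factor: (weight, client-side score, server-side score)
        "security lockdown":   (3, 2, 5),
        "available bandwidth": (2, 4, 2),
        "network reliability": (2, 5, 2),
        "central management":  (3, 2, 5),
        "user flexibility":    (1, 5, 3),
        "relative cost":       (2, 3, 4),
    }

    client = sum(w * c for w, c, _ in factors.values())
    server = sum(w * s for w, _, s in factors.values())
    print(f"client-heavy: {client}, server-heavy: {server}")

The numbers matter less than the exercise: changing the weights to match a given group of users can tip the answer either way.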

Given such factors, it’s unlikely that any organisation can arrive at a single desktop configuration that suits all types of users at an appropriate cost. There’s a wealth of options available today, from various flavours of virtualisation (virtual desktop infrastructure, session virtualisation and application streaming, for example) to browser-based interfaces onto in-house and hosted applications. As smartphones get smarter, and new form factors such as netbooks and tablets emerge, the range of options increases still further. As a result, deciding where workloads and data should actually live – with or without direct user intervention – can become quite bewildering.

Faced with this broadening catalogue of possibilities, our advice is quite simple: start with business users, their needs and the constraints they face. Users tend to fall into a reasonably small set of categories depending on their jobs, working practices and constraints – for example, whether they work from home or in the office, whether they handle sensitive data, and so on. We’ve used such categorisations as the basis for data cuts in our research, and we know from experience just how useful they can be when it comes to identifying better-bounded groups whose needs can be addressed specifically.
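
As a purely illustrative sketch – the categories, attributes and mappings below are hypothetical, not drawn from our research – such categorisations can be captured as simple rules mapping user attributes onto candidate desktop delivery models:

    from dataclasses import dataclass

    @dataclass
    class UserCategory:
        name: str
        works_offsite: bool
        handles_sensitive_data: bool

    def candidate_models(cat):
        # Sensitive data pushes processing and storage server-side,
        # where a locked-down environment is easier to provide.
        if cat.handles_sensitive_data:
            return ["virtual desktop infrastructure", "session virtualisation"]
        # Offsite users may face poor or unreliable bandwidth,
        # which favours a more powerful client.
        if cat.works_offsite:
            return ["full laptop build", "application streaming"]
        return ["browser-based applications", "session virtualisation"]

    print(candidate_models(UserCategory("field engineer", True, False)))

Even a crude rule set like this forces the question of which attributes actually drive the decision for each group.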

While user categorisation offers a starting point, the second ‘gotcha’ concerns future-proofing. The law of unintended usage models comes into play here – from my days as an IT manager, I am quite familiar with the thought, “But that’s not how it is supposed to be used!” Sometimes this is simply users acting without thinking through the consequences – we heard one story about a user streaming catch-up TV shows over their VPN link onto their virtual desktop, clogging up both network bandwidth and server resources.

It would be all too easy to size a desktop environment for a given set of users who appear to have relatively modest processing or networking requirements, only to find that it quickly becomes inadequate. On other occasions the culprit is a parallel initiative – rolling out unified communications technology, for example – that proves too much for the architecture as defined. In either case it is the help desk that suffers, so it is worth working through a few scenarios and keeping tabs on other projects to ensure such risks are minimised.

We may still be reliant on the von Neumann architecture when it comes to computing. Whether or not this changes, and whatever compute models emerge in the future, the advice to put the business first will remain. ®
