Facing up to parallelism

Multicore means today's HPC is tomorrow's general purpose

It is, perhaps, one of those forgotten facts that computing is still a relatively young technology, made all the more poignant by the realisation that many of the people driving the High Performance Computing (HPC) business, like Burton Smith, Microsoft's technical fellow in charge of advanced strategies and policies for the company, have not only been round the track several times, but are very much still at the bleeding edge of the technology.

Smith's track record includes many years as chief scientist at Cray, but his views now are hardly stuck in the past, for he believes that the parallel processing technologies that have been developed round HPC are where the mainstream of computing technology now has to head.

"We are now at the point where we are breaking the Von Neumann Assumption that there is only one program counter that allows the proper ordering and scheduling of variables," he said. "Parallel programming makes this hazardous, but we are also now at the point where serial programs are becoming slow programs."

Driving this is the arrival of multicore processor chips in the mainstream of computing. The only way to get more performance out of a single-threaded processor is to increase its clock speed, and the only way to do that is to accept higher power consumption and all the costs associated with it. Multicore chips offer a different, but inherently parallel, alternative for boosting performance, and performance has always been the chief characteristic of HPC systems. So the lessons learned there can now start to be applied in general purpose computing.
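
By way of illustration, here is a minimal sketch, in C with POSIX threads, of the kind of data-parallel decomposition that multicore rewards: the work is split into independent slices, one per core, and the results are combined afterwards. The thread count, array and slice structure are illustrative assumptions of ours.

```c
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4          /* assumed: one thread per core */
#define N        (1 << 20)

static double data[N];

struct slice { int begin, end; double sum; };

/* Each thread sums its own slice; no shared writes, so no races. */
static void *partial_sum(void *arg)
{
    struct slice *s = arg;
    double acc = 0.0;
    for (int i = s->begin; i < s->end; i++)
        acc += data[i];
    s->sum = acc;
    return NULL;
}

int main(void)
{
    for (int i = 0; i < N; i++)
        data[i] = 1.0;

    pthread_t tid[NTHREADS];
    struct slice sl[NTHREADS];
    int chunk = N / NTHREADS;

    for (int t = 0; t < NTHREADS; t++) {
        sl[t].begin = t * chunk;
        sl[t].end   = (t == NTHREADS - 1) ? N : (t + 1) * chunk;
        pthread_create(&tid[t], NULL, partial_sum, &sl[t]);
    }

    double total = 0.0;
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += sl[t].sum;  /* serial combine after the parallel phase */
    }
    printf("total = %f\n", total);
    return 0;
}
```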

"Computing must be reinvented, but many of those who invented computing are still alive. We did it once and we can do it again," he said.

"Reinvention" is, however, a potentially scary word, and Smith is aware of the dangers. This is particularly the case where there is such a long-standing installed base of applications, process, and operating methods as found in the mainstream business computing arena.

"Reinvention" could make all of that obsolete almost over night. It is not a route that he favours, however. "One option with the move to parallelisation is to simply wipe the slate clean and start again with something new," he said. "This is what I call the Apple approach, where a great new technology is introduced with not much thought given to the pain it might cause users of an earlier technology. But we have to take existing users with us."

Smith used his keynote presentation at the recent International Supercomputing Conference in Dresden to take a look at what is happening to computing as a whole. The fundamental problem, he suggested, is that uniprocessor performance is levelling off: instruction-level parallelism, power consumption and cache limitations are all "walls" that are now being hit. The arrival of multicore processors doesn't change this if the architecture hasn't changed; it just means they become difficult to program.

The Instruction Level Wall is built from the limits of the uniprocessor instruction architecture, which are now being reached. Issues such as control-dependent computation and data-dependent memory addressing restrict the level of concurrency possible in a system, and collectively they limit such architectures to a few instructions per clock cycle.
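
A hypothetical C fragment of ours makes both limits concrete. In the list walk below, each load address depends on the previous load, and the loop exit depends on the value just fetched, so even a wide-issue core executes it essentially serially.

```c
#include <stddef.h>

struct node { struct node *next; long value; };

/* Data-dependent memory addressing: the address of iteration i+1's
   load is the result of iteration i's load.
   Control-dependent computation: whether the loop continues depends
   on the pointer just fetched.
   Together these stop the hardware issuing more than a few useful
   instructions per clock, however wide the core is. */
long chase(const struct node *p)
{
    long sum = 0;
    while (p != NULL) {       /* control depends on fetched data */
        sum += p->value;
        p = p->next;          /* next address depends on this load */
    }
    return sum;
}
```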

The Power Wall is now coming into play more significantly. As an example, he noted that scaling the amount of hardware by a factor σ scales the power by σ as well. Scaling the clock frequency by σ is worse, for it scales the dynamic power by σ cubed.
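
The arithmetic behind those figures is the standard first-order CMOS dynamic power model; the derivation below is a sketch of that reasoning, assuming the supply voltage has to rise roughly in step with the clock frequency.

```latex
% First-order dynamic power of CMOS logic:
P_{\text{dyn}} = \alpha\, C\, V^2 f

% Replicating hardware by a factor \sigma multiplies the total
% switched capacitance C by \sigma, so power scales linearly:
P' = \alpha\, (\sigma C)\, V^2 f = \sigma\, P_{\text{dyn}}

% Raising the clock by \sigma typically requires raising the supply
% voltage by roughly \sigma as well, giving the cubic penalty:
P' = \alpha\, C\, (\sigma V)^2 (\sigma f) = \sigma^3\, P_{\text{dyn}}
```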

The Memory Wall demands not only bigger caches, but also the ability to cut the cache miss rate in half. How much the cache must grow to do that depends on the access pattern of the data being fetched and stored: the more complex the pattern, the bigger the cache needs to be. For dense matrix-matrix multiplication, halving the miss rate requires a cache four times bigger; for a Fast Fourier Transform, the cache has to grow to the square of its original size. So there are issues here not only in increasing cache size, but also in increasing the bandwidth and reducing the latency of the channel serving the cache.
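
Those two figures follow from the standard asymptotic miss counts for blocked matrix multiply and the FFT; the sketch below states the scaling relations they rest on, with Z standing for cache capacity.

```latex
% Blocked dense matrix-matrix multiply: misses scale as
M_{\text{mm}}(Z) \propto \frac{n^3}{\sqrt{Z}}
% so halving the miss rate needs \sqrt{Z} to double, i.e. Z -> 4Z:
M_{\text{mm}}(4Z) = \tfrac{1}{2}\, M_{\text{mm}}(Z)

% FFT: misses scale roughly as
M_{\text{fft}}(Z) \propto \frac{N \log N}{\log Z}
% so halving the miss rate needs \log Z to double, i.e. Z -> Z^2:
M_{\text{fft}}(Z^2) = \tfrac{1}{2}\, M_{\text{fft}}(Z)
```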

HPC technologies have, over the years, developed solutions to these problems. But they have also suffered from being caught in something of a self-serving spiral. As Smith put it: "HPC systems have been the ones that run HPC applications, while HPC applications are the ones that run on HPC systems."

So it might have remained had it not been for the arrival of dual-core, and now multicore, processors across the board. The same fundamental techniques of parallel processing now apply as much to mainstream business applications as to the most complex weather forecasting system.
