Facing up to parallelism

Multicore means today's HPC is tomorrow's general purpose

There is, of course, a great deal known about parallel programming, and there are already two promising programming approaches that Smith is pursuing. One is functional programming and the other is atomic memory transactions. Neither is a complete answer in itself, of course: functional programming, for example, does not allow mutable state, while atomic memory transactions implement dependence awkwardly. The use of such technologies in mainstream computing is new ground, and he acknowledged that atomic memory transaction technology already has critics claiming it is doomed to be too slow. He pointed out, however, that this has yet to be shown to be a permanent condition.
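
As a flavour of the two styles, here is a minimal sketch in Haskell, whose STM library provides atomic memory transactions (the function and account names are invented for illustration, not taken from Smith's talk):

    import Control.Concurrent.STM

    -- Functional style: a pure function has no mutable state, so
    -- evaluating it in parallel cannot interfere with anything else.
    square :: Int -> Int
    square x = x * x

    -- Transactional style: the read-modify-write below runs atomically,
    -- so concurrent callers can never observe a partial update.
    deposit :: TVar Int -> Int -> IO ()
    deposit account amount = atomically $ do
      balance <- readTVar account
      writeTVar account (balance + amount)

    main :: IO ()
    main = do
      account <- newTVarIO 0
      deposit account 100
      final <- readTVarIO account
      print (map square [1, 2, 3], final)  -- ([1,4,9],100)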

He did single out two functional programming languages, Sisal and NESL, for specific mention, however. "Critics say that functional languages are inefficient, but these two are excellent counter-examples. On Cray systems they could run as fast as Fortran."

One of the issues confronting programmers moving into the parallel processing world is the role of transactions in the management of invariants.

"Invariants are a program's conservation laws," he observed. "And there are rules of data structure, or state, integrity that need to be observed." These rules were developed in the paper Verifying properties of parallel programs: An axiomatic approach by Susan Owicki and David Gries, which sets out the following law:

If statements p and q preserve the invariant I and they do not "interfere", their parallel composition {p || q} also preserves I.

Transactions then play their part as set out by Leslie Lamport and Fred Schneider in their paper The Hoare Logic of CSP, And All That:

If p and q are performed atomically, i.e. as transactions, then they will not interfere.

As Smith observed: "Although operations seldom commute with respect to state, transactions give us commutativity with respect to the invariant, and it would be nice if the invariants were available to the compiler, if programmers could provide them readily."
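
Put together, the two laws are exactly what software transactional memory delivers. As a sketch (again in Haskell, with invented account names and amounts): the transfers p and q below do not commute with respect to state, but run as transactions they cannot interfere, so their parallel composition preserves the invariant that the two balances always sum to 100.

    import Control.Concurrent (forkIO)
    import Control.Concurrent.STM

    -- Invariant I: balance a + balance b == 100.
    -- Each transfer breaks I internally, but the atomic block means no
    -- other thread can ever observe a state in which I fails to hold.
    transfer :: TVar Int -> TVar Int -> Int -> IO ()
    transfer from to amount = atomically $ do
      modifyTVar' from (subtract amount)
      modifyTVar' to   (+ amount)

    main :: IO ()
    main = do
      a <- newTVarIO 50
      b <- newTVarIO 50
      done <- newTVarIO (0 :: Int)
      let child act = forkIO (act >> atomically (modifyTVar' done (+ 1)))
      _ <- child (transfer a b 10)                  -- p
      _ <- child (transfer b a 30)                  -- q
      atomically (readTVar done >>= check . (== 2)) -- wait for p and q
      total <- atomically ((+) <$> readTVar a <*> readTVar b)
      print total  -- always 100, whatever the interleaving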

For Microsoft programmers he pointed to the new C# and Visual Basic enhancements to be found in the LINQ (Language Integrated Query) project, a set of extensions to the .NET Framework that add such capabilities as language-integrated query, set, and transform operations, and can operate on data in memory or in an external database.
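
LINQ itself is a C# and Visual Basic feature, but the idea it embodies, queries written as ordinary declarative expressions in the host language, reads much like a list comprehension. A rough analogy in Haskell, with invented sample types:

    -- Not LINQ, just an analogy for the language-integrated query style:
    -- filter and projection expressed declaratively over in-memory data.
    data Order = Order { customer :: String, total :: Double }

    bigSpenders :: [Order] -> [String]
    bigSpenders orders = [ customer o | o <- orders, total o > 100 ]

    main :: IO ()
    main = print (bigSpenders [Order "Ann" 250.0, Order "Bob" 40.0])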

His view is that, for the immediate future, all the major styles of parallel programming should be supported. These include both functional and transactional styles, data parallel and task parallel, message passing and shared memory, declarative and imperative, and implicit and explicit styles.

"To cover all these will require more than one language, but then we use multiple languages today as it is. What is important is that the parallelism and locality are exposed to the compiler, so that the compiler can adapt them for the target system."

Here, he suggested that the ability to work with heterogeneous processors in any infrastructure would be an important capability. "We will need independence from the idiosyncrasies of the machine."

He also pointed delegates to the language interoperability available with .NET as a help. This would help provide automatic parallelisation, which he said many have already suggested is a demonstrated failure. "What failed is parallelism discovery, particularly in-the-large. There is now a need to package parallelism, which means not worrying about how many cores are available at any one time, or about whether you need to recompile the application for a different number of cores."
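
What packaged parallelism can look like in practice: with Haskell's parallel package, for instance (an illustration of the idea, not something Smith named), the program only declares which computations may run in parallel; the runtime maps them onto however many cores it is started with, via the +RTS -N flag, with no recompilation needed when the core count changes.

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- Stand-in for real work.
    expensive :: Int -> Int
    expensive n = sum [1 .. n]

    -- Declares the parallelism; says nothing about the number of cores.
    results :: [Int]
    results = parMap rdeepseq expensive [100000, 200000, 300000]

    main :: IO ()
    main = print results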

When it comes to debugging parallel applications, Smith suggested that setting conditional data breakpoints and ad-hoc data perusal are likely to be two important techniques for developers to learn. The former stops the application if an invariant fails to hold, while the latter is a form of data mining. Application tuning will also be important, particularly in identifying where there is insufficient parallelism available.
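
A conditional data breakpoint amounts to checking an invariant on every update and stopping the moment it fails; in code form the idea looks like this (a sketch with an invented no-overdraft invariant, using Haskell's assert):

    import Control.Exception (assert)

    -- Stop the program (in a build with assertions enabled) the moment
    -- the no-overdraft invariant is violated by a transfer.
    checkedTransfer :: Int -> (Int, Int) -> (Int, Int)
    checkedTransfer amount (from, to) =
      let result@(from', to') = (from - amount, to + amount)
      in assert (from' >= 0 && to' >= 0) result

    main :: IO ()
    main = print (checkedTransfer 10 (60, 40))  -- an amount of 200 would trip the assert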

As a call-to-arms in facing up to what he sees as inevitable, he told delegates: "We have to rethink the basics of computing, but thanks to HPC we have a good starting point. It does mean, however, that many applications will have to be re-modelled and re-engineered from the strategy downwards." ®
