
The 'third era' of app development will be fast, simple, and compact

Will Intel and Nvidia join the HSA party, or insist on going it alone?


At the annual Hot Chips symposium on high-performance chippery on Sunday, the assembled chipheads were led through a four-hour deep dive into the latest developments in marrying the power of CPUs, GPUs, DSPs, DMA engines, codecs, and other accelerators through the development of an open source programming model.

The tutorial was conducted by members of the HSA – heterogeneous system architecture – Foundation, a consortium of SoC vendors and IP designers, software companies, academics, and others including such heavyweights as ARM, AMD, and Samsung. The mission of the Foundation, founded last June, is "to make it dramatically easier to program heterogeneous parallel devices."

As the HSA Foundation explains on its website, "We are looking to bring about applications that blend scalar processing on the CPU, parallel processing on the GPU, and optimized processing of DSP via high bandwidth shared memory access with greater application performance at low power consumption."

Last Thursday, HSA Foundation president and AMD corporate fellow Phil Rogers provided reporters with a pre-briefing on the Hot Chips tutorial, and said the holy grail of transparent "write once, use everywhere" programming for shared-memory heterogeneous systems appears to be on the horizon.

According to Rogers, heterogeneous computing is nothing less than the third era of computing, the first two being the single-core era and the multi-core era. In each era of computing, he said, the first programming models were hard to use but were able to harness the full performance of the chips.

"In the case of single core," Rogers said, "we started with assembly code, then we went to much better abstractions: structured languages, object-oriented languages, managed languages. At each stage you give up a little bit of performance for massive improvements in productivity, and the platform volumes grow extremely fast as programmers can use the platforms much more efficiently."

The same thing happened in the multi-core era, he said, moving from direct-thread programming to directive programming to task-parallel runtimes. In heterogeneous programming, however, that progression is just beginning. "We've gone from people writing shaders directly," he said, to proprietary languages such as CUDA, to open-standard languages such as OpenCL and C++ AMP.

"But ultimately," he said, "where the platform is going with HSA is to full programming languages like C++ and Java and many others."

Slide from the HSA Foundation's Hot Chips pre-briefing: we're entering the third era of computing, the Heterogeneous Systems Era

Exactly how HSA will get there is not yet fully defined, but a number of high-level features have already been agreed upon. Unified memory addressing across all processor types, for example, is a key feature of HSA. "It's fundamental that we can allocate memory on one processor," Rogers said, "pass a pointer to another processor, and execute on that data – we move the compute rather than the data."

Full memory coherency, for another example, eliminates the need for software to manage caches. An architected queuing language, Rogers said, will allow an application or a library to dispatch packets to a GPU in what he called a "vendor-agnostic" manner. To enable preemption and context switching for a variety of applications and application types, HSA will support time-slicing throughout the entire collection of processor types.

Rogers took pains to emphasize that HSA is "defined from the outset" to be an open platform, with its specifications owned by the Foundation and delivered by means of a royalty-free standard. "It's designed from the ground up to be ISA-agnostic for both the CPU and the GPU – obviously that's very important," he said, a shared goal that's reflected in the range of hardware, operating system, tools, and middleware companies that have signed on as Foundation members.
