The long slog to multicore land

Chip future must wait

Multicore Expo Open standards and development tools are needed to accelerate the transition to a multicore future, according to minds at this week's Multicore Expo in Santa Clara, California.

As Markus Levy, president of the Multicore Association, put it: "Hardware's easy." Brian Carlson, Texas Instruments' Open Multimedia Application Platform (OMAP) marketing manager, agrees, saying, "The hardware's in place already to enable some really exciting applications that just aren't there yet.

"The challenge is closing that gap between hardware and software."

And when Carlson and others talk about multicore hardware, they're not simply talking about multicore processors such as the familiar CPUs from Intel and AMD. They're talking about multicore systems-on-chip (SoCs), digital signal processors (DSPs), graphic processing units (GPUs), and other specialized silicon.

Creating applications that run seamlessly on a diverse array of multicore silicon is no easy feat - especially when those elements are sharing processing duties by means of emerging industry standards such as OpenCL.

Levy says that what has yet to be solved is the methodology and not the technology. "Everybody knows these days," he says, "that the technology seems to be running at a much faster pace than the methodology."

Key to multicore methodology are parallel-programming approaches, "which everybody is going crazy about trying to figure out," says Levy. "How do you debug these things? What kind of new problems arise as you move into multicore land?"

To make matters even tougher, there's the problem of legacy-code conversion, which Levy described as "How do I take my quadrillion lines of code and convert it to a multicore platform?"
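At its smallest scale, Levy's legacy-code question is the job of turning a serial loop into a data-parallel one. Here's a minimal sketch in Python - illustrative only; the names `transform`, `run_serial`, and `run_parallel` are ours, not from any toolkit mentioned here - using a process pool to farm the same per-element work out across cores:

```python
# Hedged sketch: converting a "legacy" serial loop into a data-parallel
# version that spreads work across cores with a process pool.
from multiprocessing import Pool

def transform(x):
    # Stand-in for whatever per-element work the legacy loop performs.
    return x * x + 1

def run_serial(data):
    # The original single-core version: one element at a time.
    return [transform(x) for x in data]

def run_parallel(data, workers=4):
    # The multicore version: the same per-element function, but the
    # runtime farms chunks of the input out to separate processes.
    with Pool(processes=workers) as pool:
        return pool.map(transform, data)

if __name__ == "__main__":
    data = list(range(1000))
    # Sanity check: parallelizing must not change the answer.
    assert run_parallel(data) == run_serial(data)
```

Trivial by design - real conversions founder on shared state, ordering dependencies, and debugging races, which is exactly the methodology gap Levy is pointing at.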

What's missing is a widely accepted, standardized multicore programming model that goes beyond traditional symmetric multiprocessing (SMP). SMP is well-suited for a world in which identical processor cores sit in their own packages and communicate over external datapaths, but it's a limited model for cores of different types - CPUs, GPUs, DSPs, and so on - that share resources and tasks on a single die.
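The SMP model described above can be sketched in a few lines: identical, interchangeable workers draining one shared task queue. This is an illustrative Python sketch (the name `smp_run` is ours, not an API from the article); the key assumption - any worker can take any task - is precisely what breaks down when the cores are a mix of CPUs, GPUs, and DSPs:

```python
# Hedged sketch of the SMP model: identical workers pull tasks from one
# shared queue. Heterogeneous chips break this because tasks must be
# matched to the type of core that can actually run them.
import queue
import threading

def smp_run(tasks, workers=4):
    todo = queue.Queue()
    results = []
    lock = threading.Lock()

    for task in tasks:
        todo.put(task)

    def worker():
        # Every worker is identical, so any worker can take any task --
        # the core assumption SMP makes and heterogeneous silicon violates.
        while True:
            try:
                fn, arg = todo.get_nowait()
            except queue.Empty:
                return
            result = fn(arg)
            with lock:
                results.append(result)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    # Completion order is nondeterministic, so sort for a stable result.
    return sorted(results)
```

A heterogeneous scheduler can't use one anonymous queue like this; it has to route each task to a core type that supports it, which is the scheduling problem OpenCL and similar standards try to abstract away.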

Without open standards such as OpenCL, all developers will be forced to invent their own methodologies. Carlson was emphatic about standards, calling them "critical." Nokia's Kari Pulli agreed, saying that "We need to create something that protects developers from change - that's why standards are crucial."

Without an appropriate set of standards, many multicore system and software designers are building proprietary systems, which keeps software from being reused across platforms. If a developer creates a vertical product - whether hardware or software - it stays in that vertical space: the hardware runs only the software designed specifically for it, and that software can't run on other silicon.

That's a recipe for a fragmented market and for end-users getting locked into a particular vendor's "solution."

As Levy says, "There's a lack of a flexible general-purpose multicore programming model," a situation that "limits a programmer's ability to transition to multicore programming. Everything has to be hand-done and redone."

Help is on the way, however. For its part, the Multicore Association is currently working on a set of "cohesive" APIs (application programming interfaces - essentially agreed-upon sets of commands that software developers use to communicate with hardware) for multiple multicore platforms. The Association has completed a communications API and is working on one for application-level resource management. A task-management API is also in the works.

The Association also has a working group that's developing a "best practices guide" for multicore programming.

The development community is pitching in, as well. For example, PolyCore Software has been providing multicore tools since shortly after its founding in 2004, focusing primarily on the communications industry. At the Expo, the company announced Poly-Mapper, a new tool designed, as the company explains, "for rapid creation of validated multicore communications topologies."

PolyCore's contributions, however, are aimed deep inside multicore systems, focusing on such nitty-gritty as inter-core communications. At a slightly higher level comes help from CriticalBlue, which at the Expo introduced a new tool for embedded-system designers called Prism that lets developers test their existing linear code to see how it might benefit from a transition to a multicore SoC.

Tools are arriving, with standards and APIs close behind. But as Tony King-Smith, VP of marketing for chip designer Imagination Technologies, put it when referring to the OpenCL standard, "It's not a universal panacea - we have a couple years of pain ahead."

But the performance and power-saving promise of a multicore future is inarguable. It's just that the future's not here yet. ®
