The long slog to multicore land
Chip future must wait
Multicore Expo Open standards and development tools are needed to accelerate the transition to a multicore future, according to speakers at this week's Multicore Expo in Santa Clara, California.
As Markus Levy, president of the Multicore Association, put it: "Hardware's easy." Brian Carlson, Texas Instruments' Open Multimedia Application Platform (OMAP) marketing manager, agrees, saying, "The hardware's in place already to enable some really exciting applications that just aren't there yet.
"The challenge is closing that gap between hardware and software."
And when Carlson and others talk about multicore hardware, they're not simply talking about multicore processors such as the familiar CPUs from Intel and AMD. They're talking about multicore systems-on-chip (SoCs), digital signal processors (DSPs), graphic processing units (GPUs), and other specialized silicon.
Creating applications that run seamlessly on a diverse array of multicore silicon is no easy feat - especially when those elements are sharing processing duties by means of emerging industry standards such as OpenCL.
Levy says that what has yet to be solved is the methodology and not the technology. "Everybody knows these days," he says, "that the technology seems to be running at a much faster pace than the methodology."
Key to multicore methodology are parallel-programming approaches, "which everybody is going crazy about trying to figure out," says Levy. "How do you debug these things? What kind of new problems arise as you move into multicore land?"
To make matters even tougher, there's the problem of legacy-code conversion, which Levy described as "How do I take my quadrillion lines of code and convert it to a multicore platform?"
What's missing is a widely accepted, standardized multicore programming model that goes beyond traditional symmetric multiprocessing (SMP). SMP is well-suited for a world in which identical processor cores sit in their own packages and communicate over external datapaths, but it's a limited model for cores of different types - CPUs, GPUs, DSPs, and so on - that share resources and tasks on a single die.
Without open standards such as OpenCL, all developers will be forced to invent their own methodologies. Carlson was emphatic about standards, calling them "critical." Nokia's Kari Pulli agreed, saying that "We need to create something that protects developers from change - that's why standards are crucial."
Without an appropriate set of standards, many multicore system and software designers are building their own proprietary systems, which keeps software tied to a single platform. A vertical product - whether hardware or software - stays in that vertical space: it can run only the software designed specifically for it, and that software can't run on other silicon.
That's a recipe for a fragmented market and for end-users getting locked into a particular vendor's "solution."
As Levy says, "There's a lack of a flexible general-purpose multicore programming model," a situation that "limits a programmer's ability to transition to multicore programming. Everything has to be hand-done and redone."
Help is on the way, however. For its part, the Multicore Association is currently working on a set of "cohesive" APIs (application programming interfaces - essentially agreed-upon sets of commands that software developers use to communicate with hardware) for multiple multicore platforms. The Association has completed a communications API and is working on one for application-level resource management. A task-management API is also in the works.
The Association also has a working group that's developing a "best practices guide" for multicore programming.
The development community is pitching in, as well. For example, PolyCore Software has been providing multicore tools since shortly after its founding in 2004, focusing primarily on the communications industry. At the Expo, the company announced Poly-Mapper, a new tool designed, as the company explains, "for rapid creation of validated multicore communications topologies."
PolyCore's contributions, however, are aimed deep inside multicore systems, focusing on such nitty-gritty as inter-core communications. At a slightly higher level comes help from CriticalBlue, which at the Expo introduced a new tool for embedded-system designers called Prism that lets developers test their existing linear code to see how it might benefit from a transition to a multicore SoC.
Tools are coming, and standards are arriving along with APIs. But as Tony King-Smith, VP of marketing for chip designer Imagination Technologies, put it when referring to the OpenCL standard, "It's not a universal panacea - we have a couple years of pain ahead."
But the performance and power-saving promise of a multicore future is inarguable. It's just that the future's not here yet. ®
I feel myself already becoming exasperated and preparing to respond[*] to your up-the-garden-path-with-a-herring assertions.
[*] In pictures, natch.
Programming Languages Considered Harmful
Programming languages are the CLIs of development tools. They're inherently linear, because *languages* are inherently linear -- we don't know how to read any other way. This was fine as long as computers relied on single-thread, in-step CPUs, but it's increasingly untenable today.
I started programming in the days of the ZX81, when BASIC still had line numbers and assembly languages were real, furry assembly languages designed by people who had bothered to look up "mnemonic" in a dictionary first.
I've seen people hype procedural programming, modular programming, OOP, functional programming and more. Yet those programming 'paradigms' are just attempts at nailing structural and organisational UI features onto a written language without any thought for whether this is the right place for it. It's 2009 and we're still using software development tools designed in the days when hard drives were called "Winchester Disks", punched cards and paper tape were still in use, graphics were monochrome and blocky, and the Internet was still wearing nappies and crying for its DARPA.
Sure, those tools have gained WIMP GUIs to help us place buttons and list-boxes, but look under the hood and you see the same old dumb, flat, text files and archaic, linear programming languages.
Programming languages have had their day. They're not the solution. They are the *problem*.
Well, fancy. You again.
Anyone offered you the $10 million for your idea yet? What's that - they haven't?
Ever bothered to read and understand the paper on threads you keep recommending? It undermines your position - not that you'd know, since you haven't read it.
Ever got anything working?
Put up or shut up, Louis.
Aren't you that crackpot that thinks that motion in spacetime is impossible, and that the Bible explains how neural nets should be constructed?
Ah yes ...
How's COSA going? I noticed you've got a "non-algorithmic" implementation of the quicksort algorithm now. That must have taken some work!
BTW, did you ever get the timecube working or was that someone else?