
Stanford grabs $6m to shape the future of software

Virtual worlds, robots and huge databases covered

After looking at the list of supporting vendors, you might think Stanford is trying to serve too many masters. A company such as Intel wants to see people focus on the x86 instruction set, while Nvidia wants this type of research done via its CUDA development platform and Tesla GPGPU hardware. But Olukotun said Stanford hopes to create a software development environment that sits one level up from the silicon innards. He added that anyone is welcome to join the project, which will release its code under an open source license.

"You might have a very heterogeneous mix of hardware and software coming together for the same application," he said. "A video game can manage virtual objects through dynamic threading but also use streaming techniques for physical world modeling. Then, there might be a 3-D graphics component that requires its own form of parallelism."

Cray has articulated a similar vision for how it plans to handle software jobs in the future. Early in the next decade, Cray is set to release a line of systems dubbed "Cascade" in which x86 chips handle the brunt of the software work while more specialized vector and FPGA parts tackle specific operations and heavily multi-threaded code. The catch with such an approach is coming up with a management system that can feed the right bits of code to the right chips - and making that system easy enough for ordinary developers to handle.
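To make that routing problem concrete, here's a toy sketch - ours, emphatically not Cray's Cascade software - of what such a management layer boils down to: tag each chunk of work with the kind of parallelism it exhibits and let a placement heuristic decide whether it lands on plain x86, a vector unit, an FPGA or a heavily threaded part. The Task struct, the place() heuristic and the flags are all invented for illustration.

```cpp
// Toy sketch of a heterogeneous work-placement layer. Not a real runtime.
#include <functional>
#include <iostream>
#include <string>
#include <vector>

enum class Backend { Scalar, Vector, FPGA, Threaded };

struct Task {
    std::string name;
    bool data_parallel;   // long, regular loops -> vector unit
    bool bit_level;       // custom bit-level logic -> FPGA
    bool many_threads;    // irregular, latency-tolerant -> heavily threaded chip
    std::function<void()> body;
};

const char* backend_name(Backend b) {
    switch (b) {
        case Backend::Vector:   return "vector unit";
        case Backend::FPGA:     return "FPGA";
        case Backend::Threaded: return "multi-threaded chip";
        default:                return "x86 scalar";
    }
}

// The "management system" the article worries about, reduced to a heuristic.
Backend place(const Task& t) {
    if (t.bit_level)     return Backend::FPGA;
    if (t.data_parallel) return Backend::Vector;
    if (t.many_threads)  return Backend::Threaded;
    return Backend::Scalar;  // default: plain x86
}

void run(const std::vector<Task>& tasks) {
    for (const auto& t : tasks) {
        std::cout << t.name << " -> " << backend_name(place(t)) << "\n";
        t.body();  // a real runtime would launch this on the chosen device
    }
}

int main() {
    run({
        {"parse_input",   false, false, false, [] { /* scalar work */ }},
        {"matrix_kernel", true,  false, false, [] { /* vectorisable loop */ }},
        {"graph_walk",    false, false, true,  [] { /* pointer chasing */ }},
    });
}
```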

"That is why we think this type of research is required," Olukotun said. "I am always a strong believer in coming up with a clever idea. Things that might have seemed difficult at the first look can often be solved."

Stanford has dubbed its heterogeneous testing system FARM (Flexible Architecture Research Machine) and plans to get the system up and running by the end of the summer with a mix of FPGAs, GPGPUs and general-purpose chips.

About 11 professors will work for the Pervasive Parallelism Lab (PPL), including all-stars Bill Dally - a JASON who has done a lot of stream processor work - and Stanford Prez and RISC legend John Hennessy, who always refuses our humble requests for an interview. Tsk, tsk.

Rather ironically, PPL will be housed at the Gates Computer Science building. A few years back, Gates complained to Intel that there was no way his company could keep up with multi-core chips and advised the silicon folks to stick with their single-core efforts.

"No, Bill, it's not going to work that way," Intel replied.

There's an awful lot of cross-pollination between the Berkeley and Stanford camps, even though the two schools are rather vicious rivals. (It's said that Stanford makes companies, while Berkeley makes industries, which translates into "The Stanford guys are the rich ones.") Sun, for example, has long had its computing god Ivan Sutherland doing research at Berkeley and includes Berkeley legend Dave Patterson as an adviser. Meanwhile, Sun bought Olukotun's start-up Afara to create its Niagara line of chips and is now funding PPL. Intel is also working with just about anyone it can, including both schools and Cray, so that it can come up with software that makes people need to buy these damned multi-core chips.

We'll be sure to check back in three years and judge whether Berkeley or Stanford has made the most progress.

Also, in the interest of grandstanding, I'd like to issue a challenge to Mr. Patterson. Kunle said he's willing to have us record a chat between the two of you about your respective plans and any disagreements over the direction such parallelism work should take. Are you up to the challenge, Dave? ®
