Original URL: http://www.theregister.co.uk/2008/04/30/stanford_funding_ppl/

Stanford grabs $6m to shape the future of software

Virtual worlds, robots and huge databases covered

By Ashlee Vance

Posted in Servers, 30th April 2008 23:47 GMT

Stanford University has mounted some gun turrets on its Ivory Towers. Just a few weeks after rival UC Berkeley revealed a mega-funding engagement with Microsoft and Intel around multi-threaded software, Stanford has returned fire, grabbing money for similar research from just about every other vendor on the planet with an interest in improving code for multi-core chips.

In the coming days, Stanford will unveil the Pervasive Parallelism Lab on the back of $6m in funding over three years from Sun Microsystems, AMD, Nvidia, IBM, HP and even Intel. The lab will be headed by Kunle Olukotun, a Stanford professor who made a major name for himself in the multi-core world by helping originate the chips now sold by Sun as its Niagara, or UltraSPARC Tx, family. The goal of the lab is to make writing software for multi-core chips easier.

Last month, Microsoft and Intel announced a deal to throw $10m each at Berkeley and the University of Illinois for similar research into parallel programming techniques. Stanford had been a finalist for that grant but lost out and then oddly leaked word of the grant to the press as some sort of revenge act against Berkeley.

You can understand the hardware vendors' desire to pay for this type of research, since they're all pumping out products centered on multi-core processor designs. Such gear does not boost single-threaded software performance the way the single-core chips of yesteryear did, since the individual cores in multi-core chips tend to be slower. As a result, programmers need to learn more complex multi-threaded programming techniques to spread jobs across the numerous cores at once.

In the server world, developers have been dealing with these issues for years and are able to make use of hundreds of processor cores. Still, the server set could use additional help dealing not only with general purpose CPUs but also a flood of accelerators hitting the market.

On the desktop front, there's a very pressing need to come to terms with multi-core chips, since major players including, er, Microsoft have been slow to face the oncoming realities posed by today's silicon.

The Berkeley group will focus on crafting desktop and mobile device software that can handle complex functions such as speech and video well.

Stanford, meanwhile, plans to go after three specific areas: virtual worlds, robotics and massive data analysis jobs, said Olukotun. Researchers hope to develop an abstracted programming framework that will make it easier for developers to create code in these three areas.

"We need to raise the level of abstraction for parallel programming," Olukotun told us. "Rather than teaching threading and locks, which are difficult even for experts to get right, we want to let programmers express their interests at levels appropriate to the different domains."

With the virtual world push, Stanford looks to offer a way to create systems far more impressive than the rather hapless Second Life. Such virtual systems might resemble today's high-end video games.

Robotics could benefit through the creation of devices capable of much more sophisticated operations. Such ambitions play to Stanford's role as a robotics powerhouse, with the school's vehicle Stanley finishing among the top two teams in recent DARPA-funded robot races.

"Stanley is a long way from a human driver," Olukotun said. "There is a lot more learning that has to go on with making artificial intelligence effective, and we think parallelism is the way."

On the data analysis front, the Stanford researchers want to create systems capable of digesting huge volumes of financial and scientific data.

After looking at the list of supporting vendors, you might think Stanford is trying to serve too many masters. A company such as Intel wants to see people focus on the x86 instruction set, while Nvidia wants this type of research done via its CUDA development platform and Tesla GPGPU hardware. But Olukotun said Stanford hopes to create a software development environment that's one level up from the silicon innards and added that anyone is welcome to join the project, which will release its code under an open source license.

"You might have a very heterogeneous mix of hardware and software coming together for the same application," he said. "A video game can manage virtual objects through dynamic threading but also use streaming techniques for physical world modeling. Then, there might be a 3-D graphics component that requires its own form of parallelism."

Cray stands as a company that has articulated a similar vision for how it plans to handle software jobs in the future. Early in the next decade, Cray is set to release a line of systems dubbed "Cascade" that will have x86 chips handle the brunt of the software work while more specialized vector and FPGA chips tackle specific operations and heavily multi-threaded code. The problem with such an approach is coming up with a management system that can feed the right bits of code to the right chips, and making that system easy enough for other developers to handle.
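A toy sketch of that dispatch problem (our own illustration, not Cray or PPL software) looks something like this: each kind of work is routed to the processor type best suited to it, behind one uniform interface. All the backend names here are hypothetical stand-ins.

```python
# Hypothetical stand-ins for launching work on each processor type.
def run_on_cpu(task):
    return f"cpu:{task}"

def run_on_vector(task):
    return f"vector:{task}"

def run_on_fpga(task):
    return f"fpga:{task}"

# Map categories of work to the hardware best suited to them.
BACKENDS = {
    "scalar": run_on_cpu,     # general-purpose x86 work
    "stream": run_on_vector,  # heavily data-parallel loops
    "bitwise": run_on_fpga,   # fixed-function kernels
}

def dispatch(task, kind):
    """Route a task to the right backend for its kind of parallelism.

    The hard research problem is classifying `kind` automatically and
    hiding this table from the developer; here the caller supplies it.
    """
    return BACKENDS[kind](task)
```

The table itself is trivial; what Cray, and now Stanford, are after is the layer that fills in `kind` without the programmer having to think about it.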

"That is why we think this type of research is required," Olukotun said. "I am always a strong believer in coming up with a clever idea. Things that might have seemed difficult at the first look can often be solved."

Stanford has dubbed its heterogeneous testing system FARM (Flexible Architecture Research Machine) and plans to get the system up and running by the end of the summer with a mix of FPGAs, GPGPUs and general purpose chips.

About 11 professors will work for the Pervasive Parallelism Lab (PPL), including all-stars Bill Dally - a JASON who has done a lot of stream processor work - and Stanford Prez and RISC legend John Hennessy, who always refuses our humble requests for an interview. Tsk, tsk.

Rather ironically, PPL will be housed at the Gates Computer Science building. A few years back, Gates complained to Intel that there was no way the company could keep up with multi-core chips and advised the silicon folks to stick with their single core efforts.

"No, Bill, it's not going to work that way," Intel replied.

There's an awful lot of cross-pollination between the Berkeley and Stanford camps, even though the two schools are rather vicious rivals. (It's said that Stanford makes companies, while Berkeley makes industries, which translates into "The Stanford guys are the rich ones.") Sun, for example, has long had its computing god Ivan Sutherland doing research at Berkeley and includes Berkeley legend Dave Patterson as an adviser. Meanwhile, Sun bought Olukotun's start-up Afara to create its Niagara line of chips and is now funding PPL. Intel is also working with just about anyone it can, including both schools and Cray, so that it can come up with software that makes people need to buy these damned multi-core chips.

We'll be sure to judge who has made the most progress between Berkeley and Stanford in three years.

Also, in the interest of grand-standing, I'd like to issue a challenge to Mr. Patterson. Kunle said he's willing to let us record a chat between the two of you about your various plans and your possible disagreements over the direction such parallelism work should take. Are you up to the challenge, Dave? ®