
Stanford grabs $6m to shape the future of software

Virtual worlds, robots and huge databases covered

Stanford University has mounted some gun turrets on its Ivory Towers. Just a few weeks after rival UC Berkeley revealed a mega-funding engagement with Microsoft and Intel around multi-threaded software, Stanford has returned fire by grabbing money for similar research from just about every other vendor on the planet with an interest in improving code for multi-core chips.

In the coming days, Stanford will unveil the Pervasive Parallelism Lab on the back of $6m in funding over three years from Sun Microsystems, AMD, Nvidia, IBM, HP and even Intel. The lab will be headed by Stanford professor Kunle Olukotun, who made a major name for himself in the multi-core world by helping originate the chips now sold by Sun as its Niagara or UltraSPARC Tx family. The goal of the lab is to make writing software for multi-core chips easier.

Last month, Microsoft and Intel announced a deal to throw $10m each at Berkeley and the University of Illinois for similar research into parallel programming techniques. Stanford had been a finalist for that grant but lost out, and then oddly leaked word of the grant to the press as a revenge act of sorts against Berkeley.

You can understand the hardware vendors' desire to pay for this type of research, since they're all pumping out products centered on multi-core processor designs. Such gear does not boost single-threaded software performance the way the single-core chips of yesteryear did, since the individual cores in a multi-core chip tend to be slower. As a result, programmers need to learn more complex multi-threaded programming techniques to spread jobs across the numerous cores at the same time.
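For a concrete taste of what that means, here's a minimal C++ sketch (our own illustration, nothing to do with the Stanford lab's code) of spreading a job across cores by hand: even summing an array means the programmer partitions the data, launches the threads, and joins them, and any slip in those boundaries is a bug.

    // Hand-rolled multi-threading: sum an array by splitting it across cores.
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<double> data(1000000, 1.0);
        unsigned n = std::thread::hardware_concurrency();
        if (n == 0) n = 4;  // the core count may be unreported; assume a fallback
        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> workers;

        // Each thread sums its own slice; the slice boundaries, per-thread
        // buffers and the joins are all the programmer's problem.
        for (unsigned t = 0; t < n; ++t) {
            workers.emplace_back([&, t] {
                std::size_t chunk = data.size() / n;
                std::size_t begin = t * chunk;
                std::size_t end = (t == n - 1) ? data.size() : begin + chunk;
                partial[t] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0.0);
            });
        }
        for (auto& w : workers) w.join();

        std::cout << std::accumulate(partial.begin(), partial.end(), 0.0)
                  << "\n";
    }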

In the server world, developers have been dealing with these issues for years and are able to make use of hundreds of processor cores. Still, the server set could use additional help dealing not only with general purpose CPUs but also with the flood of accelerators hitting the market.

On the desktop front, there's a very pressing need to come to terms with multi-core chips, since major players including, er, Microsoft have been slow to face the oncoming realities posed by today's silicon.

The Berkeley group will focus on crafting desktop and mobile device software that can handle complex functions such as speech and video well.

Stanford, meanwhile, plans to go after three specific areas: virtual worlds, robotics and massive data analysis jobs, said Olukotun. Researchers hope to develop an abstracted programming framework that will make it easier for developers to create code in these three areas.

"We need to raise the level of abstraction for parallel programming," Olukotun told us. "Rather than teaching threading and locks, which are difficult even for experts to get right, we want to let programmers express their interests at levels appropriate to the different domains."

With the virtual world push, Stanford looks to offer a way to create systems far more impressive than the rather hapless Second Life. Such virtual systems might resemble today's high-end video games.

Robotics could benefit through the creation of devices capable of much more sophisticated operations. Such ambitions play into Stanford's role as a robotics powerhouse, with the school's vehicle Stanley finishing among the top two entrants in the most recent DARPA-funded robot races.

"Stanley is a long way from a human driver," Olukotun said. "There is a lot more learning that has to go on with making artificial intelligence effective, and we think parallelism is the way."

On the data analysis front, the Stanford researchers want to create systems capable of digesting huge volumes of financial and scientific data.
