Intel and Microsoft dump $20m on researchers to avert software crisis
To your Ivory Towers!
Microsoft and Intel have put their money where their fear is. The two companies have shelled out $10m each to the University of California, Berkeley and the University of Illinois to fund research around advanced software programming techniques for PCs and mobile devices.
The grants mark a significant effort on the part of Intel and Microsoft to develop software that can run well across processors with many cores. Both companies need such software to make their future products more attractive, since few PC and mobile applications today can take advantage of the horsepower presented by an oncoming onslaught of chips with 32, 64 and even 128 cores. In trying to fix this issue now, the researchers and vendors argue they'll eventually be able to provide people with a new class of sophisticated programs.
"I think (the companies) get some credit here for seeing they won't solve these problems by themselves," famed Berkeley computer scientist Dave Patterson  told us. "It is one thing to tell the National Science Foundation that it should be funding this work. It's another thing to fund it yourselves - although you can think of it as enlightened self-interest."
To date, most PC applications have benefited from ever faster processors. Companies such as Microsoft could release new applications and expect that the software would run better as companies such as Intel and AMD issued improved microprocessors.
Due to a variety of issues, however, processor makers have been forced to advance their products by adding multiple processor cores to each chip rather than simply increasing the speed of the processors. As a result, software makers now need to write far more intricate code that can divvy up tasks across all of the processor cores instead of pumping linear jobs through one core. The complexity of writing such software will only increase in the coming years, as developers must deal not with the four-core chips of today but with tens of cores per chip.
This so-called multi-threaded code is quite common in the server realm, but now companies along with computer scientists want to bring flashier programming techniques to the desktop and mobile markets. As a result, people may gain access to machines that can crank through tough jobs such as handling speech, video and 3-D objects with relative ease.
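For readers who haven't written this sort of code, here's a toy illustration of the shift described above - splitting one data-parallel job across several cores instead of pumping it through a single core serially. It's a minimal sketch in Python, not anything from the Berkeley or vendor efforts; the task and chunk count are arbitrary:

```python
from multiprocessing import Pool

def work(chunk):
    # A stand-in for a CPU-heavy task: sum of squares over one slice of the data.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))

    # The old way: one core pumps the whole linear job through by itself.
    serial = work(data)

    # The multi-core way: divvy the data into four chunks and hand them
    # to a pool of worker processes, one per core.
    chunks = [data[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        parallel = sum(pool.map(work, chunks))

    assert serial == parallel  # same answer, but the work ran in parallel
```

Even this trivial case shows why the problem is hard: the programmer, not the hardware, has to decide how to carve up the work, and tasks that aren't naturally independent are far messier to split.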
Microsoft, in particular, has been viewed as a laggard in the race toward multi-threaded applications. When told by Intel of the shift toward multi-core chips, Bill Gates remarked, "We can't write software to keep up with that." Gates then urged Intel to continue producing faster processors as it had always done. "No, Bill, it's not going to work that way," Intel vice president Pat Gelsinger responded.
Now it would seem that Microsoft realizes the severity of the software issues ahead, which must please Intel since it relies on improved software to create demand for its latest and greatest processors. To its credit, Intel has been funding a variety of programs around multi-threaded software and has issued some developer tools.
JASON and the Have-Nots
Intel and Microsoft opened up their funding competition to the top 25 computer science departments in the US. The two winners were joined by Stanford University and MIT in the last round of the contest. But in the end, the public universities won the hearts of the vendors over those wealthy elitists. (Stanford was apparently quite upset by this loss, as it seemed that someone from the university - Bill Dally, we're looking at you - leaked word of Berkeley's victory to a news outlet. An odd PR strategy to be sure, but Dally is a JASON, so he must know what he's doing.)
To help model future hardware, Berkeley will turn to its own RAMP systems, which combine hundreds of FPGAs (field programmable gate arrays) on a single board. Researchers can tweak the RAMP systems to mimic various processor architectures and test how their code will run across tons of cores. Ideally, the RAMP designs save on hardware costs and time by freeing researchers from pricey systems and by allowing them to speculate about future hardware rather than waiting for prototypes from hardware makers.
The funding will support eight full-time faculty and more than 30 PhD students researching the multi-threaded software problems at Berkeley.
"There is a real sense of urgency that we need to figure this stuff out," Patterson told us.
Many of the researchers have been looking into these types of problems for years, and lots of work is already underway. For example, the Berkeley staff are exploring personalized medicine applications where a doctor or even patient could use a handheld device to simulate how various treatments might affect the blood. They're also examining systems that would project a sound field into a room "that recreates a concert in your living room as if you were at the event," said Krste Asanović, another professor at Berkeley. And there's work underway on a parallel browser that could make it possible to display and use much richer applications on your cellphone.
All of the work done at Berkeley will be released under open source software licenses, meaning that Microsoft or rivals could, say, pick up the browser code.
While the need for this type of research appears obvious, some prominent technology players remain fond of yesteryear's methods.
'Putting the cart in front of the horse'
Linux figurehead Linus Torvalds, for example, recently attacked Berkeley's parallelism efforts. Torvalds relegated the concerns around multi-threaded software to the niche of researchers.
"Designing future hardware around the needs of scientific computing seems ass-backwards," he wrote on a message board. "It's putting the cart in front of the horse."
Torvalds' comments shocked some of the Berkeley group.
"I was kind of staggered by that comment - that one of the leaders of computing sees the future in linear time computing," said Kurt Keutzer, a professor at Berkeley.
As the Berkeley set views things, the most interesting future applications will require far more horsepower than Torvalds seems to imagine.
Quite frankly, it must be a bit disheartening for Linux developers to witness Torvalds speak in such terms. He seems married to a past that recedes a little further with every new processor release.
Other critics might argue that we're heading toward a world where software returns to the data center rather than running on personal clients. The so-called Cloud Computing scenario would have servers do the serious work for users who just need relatively dumb, low-powered devices.
Patterson countered this argument by saying that a lack of pervasive network connections along with improvements in things such as flash memory will continue to make powerful client devices very attractive.
The Microsoft and Intel funding should help the researchers tackle these kinds of questions for five years. ®