Who wants a 'robot companion'? Look no further than Intel Labs

Sci-fi predicts the future, says Chipzilla CTO

Interview Can you imagine IBM Research ever developing a social robot companion? Intel CTO Justin Rattner can’t, but he’s happy for his own researchers to build one – and for the technology to find its way into the market. Eventually.

Ask most techies of a certain age how a company should carry out research and development and they will likely cite Bell Labs, or one of its clones such as Xerox Parc or IBM Research.

The problem is, says Rattner, while this traditional model continues to shape perceptions of how to run research, it is pretty much dead in the water. Indeed, as Rattner pointed out in a speech at Intel’s Open Innovation 2.0 event in Dublin recently, it’s debatable whether it was ever really suited for taking “inventions” and turning them into “innovations” - or, put another way, into something people will buy and use.

“Bell Labs’ model was basically to do academic research without the burdens of having to teach or having to convince your government masters that you had worthy research to do,” Rattner told The Register in a conversation after his speech.

This was fine as far as it went, Rattner argues, but it also meant that while Bell Labs laid the groundwork for the point contact transistor, Ma Bell failed to do anything with it. And when William Shockley jumped ship to set up his own operation to further transistor research, he built it in the image of Bell Labs.

It took another set of defectors to set up Fairchild Semiconductor before the transistor started resembling anything that could form the building blocks of an integrated circuit. The rest is (Intel) history.

R&D: Evolve or die

Things started to change in the 90s, even as the rapidly fattening Microsoft was scaling up its own Microsoft Research operation, picking up refugees from academia as well as Xerox, DEC, et al. As Rattner says, boards were asking CEOs: “Are you getting value out of your research organisation, or would it make sense to just shut the thing down?”

“We had the same pressure at Intel. There’d been a research organisation separate from the labs associated with the chip technology for many years, but it wasn’t held in very high esteem and was generally thought of as an ivory tower.”

“It was the middle of the decade and we were thinking about how we should structure research at Intel so that … the lab's impact would be seen as the critical - if not the primary - engine for innovation in the company. And we think we succeeded in doing that.”

At IBM, says Rattner, Lou Gerstner handed the research purse strings to the product units, leaving researchers touting for budgets.

As for Intel, “The original metric that [former CEO Paul] Otellini established was ‘OK, how many technologies are moving out of the labs and moving into the product section’. It’s actually part of the executive incentive program. It had to be.”

“And then he [Otellini] said, ‘you can transfer these technologies, but I want to know that these technologies are going to be in those products.’”

“That was when he really raised the bar - but it was the right thing to do because it made everybody focus: 'this is not just about getting the technology from A to B, this is about getting the technology to the market.'”

This very process of drawing a direct line between pure research and products in the market could be seen as the fast route to the underinvestment in R&D which many US firms, and universities, are accused of these days. Scientists’ drive to do blue-sky research is supposedly being trumped by short-termism, and by shareholders with a time horizon of a quarter at the most.

Bringing research to market

Rattner, unsurprisingly, would say this is not the case at Intel, and points out that Labs works ahead of product group demands - to the extent of funding its own “ventures” to get technology off the starting blocks, where necessary. There are three of these Lab Ventures in progress at the moment, Rattner says, though the only one which he will discuss in public is Silicon Photonics.

“That’s a lab venture and it basically was created to take roughly a decade’s worth of research in silicon photonics and bring it to market.”

According to Rattner, when Labs went to Intel’s network business and started talking up the prospect of 100Gbps, the response was, “you know guys, that’s all great stuff, but there’s no need right now, we’re just trying to do 40Gbps...”

Rattner explains: “When it costs less than 4x the price of the current technology but gives you 10x the performance, the market shifts and this is what delayed 10Gbps. It took so long they just couldn’t get the cost to 4x what 1Gbps technology was at.”
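Rattner's rule of thumb amounts to a simple price threshold. As a rough sketch (the function name, port prices, and the 1Gbps-vs-10Gbps framing below are illustrative assumptions, not figures from Intel):

```python
# Hedged sketch of Rattner's adoption rule: a 10x-faster part tips the
# market once it costs less than roughly 4x the incumbent's price.
# All prices here are made up for illustration.

def market_ready(new_price, old_price, price_cap=4):
    """True if the faster technology is under the price threshold."""
    return new_price < price_cap * old_price

# Example: incumbent 1Gbps port at $10; candidate 10Gbps ports.
old_port = 10.0
print(market_ready(60.0, old_port))  # too expensive: 60 > 4 * 10
print(market_ready(35.0, old_port))  # under the cap: 35 < 4 * 10
```

The arithmetic behind the cap: at exactly 4x the price for 10x the throughput, cost per Gbps falls from $10 to $4, so below the cap the faster part is also the cheaper way to buy bandwidth. On Rattner's account, 10Gbps stalled precisely because vendors couldn't get under that 4x line.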

From a product point of view, the numbers didn’t add up for silicon photonics yet. While there was undoubtedly demand from HPC, the volumes would have been too low to justify a new business, says Rattner.

“But you know, as we were out talking to the big data centre customers, they were telling us they’re going to need this stuff much sooner and they were going to need it in very high volumes.”

So, Rattner’s team began showing potential customers 50Gbps laboratory technology.

“We said, if we could figure out how to manufacture it in higher volume, would you buy this stuff? And they were very encouraging. And in fact as part of getting funding [from Intel’s venture board] we had to pull in two MOUs to our venture board [saying] if you can do this, we’re ready to buy it. And that was the basis for starting the business.”

The photonics technology was demo’d at IDF Beijing recently, and Rattner says Intel has engineering samples, is building up yields, and “We’ll be in the market within the year.”

“From time to time a small number of technologies come out of the exploratory side of our research that either don’t have a home or it’s too early for them to have a home, and they’re obvious candidates for venturing.”

The drive to productise Silicon Photonics has thrown off other developments, he adds.

“We could spend the next hour just talking about testing,” says Rattner. We don’t, but he does raise a good question:

“I mean, how do you test an optical integrated circuit - it turns out you can’t buy standard testers so we had to invent our own.”

