
Intel's lab crew makes case for 80-core world

CSI: It's optical


True enough, video game developers, companies such as Adobe with Photoshop, and operating system makers can use multi-core chips for serious speedups. But why would the average consumer want to throw 80 cores at Word and Microsoft's excuse for a browser, Internet Explorer?

Intel's lab crew has developed a number of applications meant to demonstrate what a consumer could get out of an 80-core dynamo.

A well-cored computer could, for example, make those tedious home movies more tolerable both for the family producing the films and people subjected to screenings of "The Day Johnny Ate a Lollipop by Himself."

Intel and partners, for example, have created software that eliminates jitter from video recordings.

The wife's grand ski run has never looked better, her top form coming through clearly despite your shaky hands. Soon, a company such as YouTube could offer jitter removal as an option, and the home PC could do the dirty work, crunching the numbers to improve the clip.
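
Intel isn't handing out the source, but the usual recipe for this kind of stabilisation is no great mystery: track a few features from frame to frame, work out how the camera wobbled, smooth the wobble away and warp each frame to compensate. A rough sketch with OpenCV – our own toy stabilize function, not anything Intel showed us – looks like this:

# Minimal video-stabilisation sketch (not Intel's code): estimate per-frame
# motion with optical flow, smooth the camera trajectory, warp to compensate.
import cv2
import numpy as np

def stabilize(in_path, out_path, smooth_radius=15):
    cap = cv2.VideoCapture(in_path)
    ok, first = cap.read()
    if not ok:
        raise IOError("could not read " + in_path)
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
    h, w = prev_gray.shape
    transforms = []                          # (dx, dy, d_angle) per frame

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=30)
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good_old = pts[status.flatten() == 1]
        good_new = new_pts[status.flatten() == 1]
        m, _ = cv2.estimateAffinePartial2D(good_old, good_new)
        if m is None:                        # estimation can fail on featureless frames
            m = np.array([[1, 0, 0], [0, 1, 0]], dtype=np.float64)
        transforms.append((m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])))
        prev_gray = gray

    # Smooth the accumulated camera trajectory with a moving average.
    traj = np.cumsum(transforms, axis=0)
    kernel = np.ones(2 * smooth_radius + 1) / (2 * smooth_radius + 1)
    smoothed = np.vstack([np.convolve(traj[:, i], kernel, mode='same')
                          for i in range(3)]).T
    corrections = np.array(transforms) + (smoothed - traj)

    # Second pass: warp each frame by its corrected transform.
    cap = cv2.VideoCapture(in_path)
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*'mp4v'),
                          cap.get(cv2.CAP_PROP_FPS), (w, h))
    cap.read()                               # first frame is written unchanged
    out.write(first)
    for dx, dy, da in corrections:
        ok, frame = cap.read()
        if not ok:
            break
        M = np.array([[np.cos(da), -np.sin(da), dx],
                      [np.sin(da),  np.cos(da), dy]], dtype=np.float32)
        out.write(cv2.warpAffine(frame, M, (w, h)))
    out.release()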

In the same arena, Intel has mastered an application that can scan a lengthy home video of, say, your son's football match and pull out the highlights from when the youngling scored a goal or maimed that jerk kid from down the block. Intel's code searches the video for spikes in cheering or fierce on-field activity to locate the best bits of a game. It can also zero in on individual players and track them throughout the contest.

"Grandma is not going to sit there for an hour watching the game, but she will sit there and watch a five-minute highlight reel," one Intel lab staffer told us, during our recent visit to the company's Santa Clara nerdery.

Intel also showed off some software that lets users manipulate objects on a screen with the aid of a PC camera. You can see one example of this software in action below – vulture bubbles are always an excellent choice.
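
The clip tells the story better than words can, but for the curious, camera-driven control of this sort can be faked at home with nothing fancier than frame differencing: spot where the picture is changing most and treat that point as a cursor. A toy sketch – emphatically not Intel's software – follows:

# Toy camera-control sketch (illustration only): track where the picture is
# moving most and use that point to drag a "bubble" around the screen.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                    # default webcam
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
bubble = np.array([prev.shape[1] // 2, prev.shape[0] // 2], dtype=float)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Difference against the previous frame and keep only strong motion.
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    # The centroid of the motion blob steers the bubble.
    m = cv2.moments(mask)
    if m["m00"] > 1e4:                       # ignore sensor noise
        target = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
        bubble += 0.2 * (target - bubble)    # ease toward the moving hand

    cv2.circle(frame, (int(bubble[0]), int(bubble[1])), 30, (255, 0, 0), -1)
    cv2.imshow("bubble", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
    prev_gray = gray

cap.release()
cv2.destroyAllWindows()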

The Games People Play

While Intel hopes to gain the attention of customers outside of the gaming set, it can't help but show off what a terascale chip would do for a first-person shooter.

Intel has spent the last couple of years pumping the idea that ray tracing – a rendering technique that models the interplay of light and objects far more realistically than rasterisation does – will conquer raster graphics.
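
For the uninitiated, the heart of ray tracing is easy enough to sketch: fire a ray through every pixel, find what it hits, and shade the hit point. The toy single-sphere renderer below – an illustration of the technique, nothing to do with Intel's demos – shows the kind of independent per-pixel number crunching that spreads very happily across lots of cores:

# Toy ray tracer (illustration only): one sphere, one light, one ray per pixel.
import numpy as np

WIDTH, HEIGHT = 320, 240
SPHERE_CENTER = np.array([0.0, 0.0, 3.0])
SPHERE_RADIUS = 1.0
LIGHT_DIR = np.array([1.0, 1.0, -1.0]) / np.sqrt(3)

def trace(x, y):
    """Shade the pixel at (x, y): lit grey where the ray hits the sphere."""
    # Ray from the eye at the origin through the pixel on a virtual screen.
    direction = np.array([(x - WIDTH / 2) / HEIGHT, (y - HEIGHT / 2) / HEIGHT, 1.0])
    direction /= np.linalg.norm(direction)

    # Ray-sphere intersection: solve |t*d - c|^2 = r^2 for t.
    oc = -SPHERE_CENTER
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return 0.0                           # miss: background stays black
    t = (-b - np.sqrt(disc)) / 2.0
    if t < 0:
        return 0.0

    # Lambertian shading from a single directional light.
    hit = t * direction
    normal = (hit - SPHERE_CENTER) / SPHERE_RADIUS
    return max(np.dot(normal, LIGHT_DIR), 0.0)

# Every pixel is independent, which is why the work parallelises so well.
image = np.array([[trace(x, y) for x in range(WIDTH)] for y in range(HEIGHT)])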

Despite pursuing its own graphics chip agenda, Intel contends that mainstream, multi-core x86 chips will bring video games to life in ways that the GPU crowd cannot match. The physics calculations demanded by techniques such as ray tracing make a “general purpose” x86 chip ideal for producing the graphics-rich games of tomorrow.

“GPUs are not set up for the general purpose types of workloads that we're talking about,” noted one of Intel's researchers. “Physics is one of the most general workloads, and those types of calculations will fly on (the multi-core) chips.”

Besides being better suited to certain workloads, general purpose chips from Intel boast large on-chip memory stores. The limited local memory of GPUs keeps the products from “rendering complex scenes” in games, according to Intel.

Damn you developers, developers, developers

Even if Intel can convince consumers, game developers and the like of the terascale chips' merits, the company faces a huge coding conundrum.

Relatively few coders know how to take advantage of multi-core chips, and most of the folks who do know the way of multi-core are spending their time pushing out server code. So, it may be the case that Intel dumps a fantastic, super-powerful chip on the market and then has to wait years for Microsoft and crew to write software that flies on the silicon. To help fix this situation, Intel has put research and development dollars in the hands of tens of universities. It hopes to encourage the schools to set up multi-threaded coding courses.
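
The gap Intel frets about isn't exotic, either. Even an embarrassingly parallel job has to be carved up explicitly before the extra cores earn their keep – something along the lines of the generic sketch below, which is the sort of thing those university courses would teach, not anything lifted from Intel's curriculum:

# Generic data-parallel sketch: fan per-chunk work out across every core.
from multiprocessing import Pool, cpu_count

def analyse_chunk(chunk_id):
    """Stand-in for real work, e.g. scoring one minute of video for highlights."""
    total = 0
    for i in range(1_000_000):
        total += (i * chunk_id) % 7
    return chunk_id, total

if __name__ == "__main__":
    chunks = range(64)
    # Serial version: [analyse_chunk(c) for c in chunks]
    # Parallel version: the same work spread across all available cores.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(analyse_chunk, chunks)
    print(f"processed {len(results)} chunks on {cpu_count()} cores")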

Beyond software, there are more problems – this time for the hardware crew. (And with that, I give you a forbidden journalistic technique known as “burying the lede.”)


