Intel's lab crew makes case for 80-core world

Exclusive I have seen the future. It's full of Agilent testing equipment, clunky Nvidia drivers and enthusiastic, well-educated men.

The future didn't always look this way - at least not at Intel, where "GHz=God" wallpaper used to cover cubicle walls. The chip maker once indoctrinated workers with the religion of speed and did everything possible to convince consumers that a 2.0GHz chip made life so much more bearable than a 1.8GHz part. Intel relied on GHz tweaks to feel good about itself and thought about speed, speed, speed all the time.

Under these conditions, Intel's future appeared white hot. Well, actually, it was more rocket-nozzle, surface-of-the-sun hot. You all remember the slide Intel's then-VP Pat Gelsinger would toss out to show just how hot Intel could make a chip in two, five and ten years' time.

But then the industry shifted to multi-core chips where the so-called "platform" matters more than GHz. Getting consumers and software makers to embrace the "platform" mentality takes some serious work. At Intel, it's the labs teams that have accepted the brain-bending challenge.

Terable PCs

Where the Intel of today has four-core processors, the Intel of tomorrow will have 80-core, 100-core and 120-core chips.

Explaining the need for so many cores proves easy enough for the server set. Customers at research labs and giant companies always want more horsepower.

These clients are already being fed a decent helping of multi-threaded software that can fly across the complex, multi-core processors. Myriad benchmarks exist that show the software performance improvements derived from multi-core chips. Real world customer code often tells the same story. So, while some still long for relentless single thread boosts, the corporate majority has accepted the multi-core future.
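The kind of chunkable workload that flies across multi-core chips is easy to sketch. This illustrative Python snippet (not Intel code, just a stand-in for any divisible job) splits a summation across worker processes, one per core:

```python
from multiprocessing import Pool


def partial_sum(bounds):
    # Sum the integers in [start, stop) -- a stand-in for any chunk of work.
    start, stop = bounds
    return sum(range(start, stop))


def parallel_sum(n, workers=4):
    # Carve [0, n) into one chunk per worker, farm the chunks out,
    # then combine the partial results.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    print(parallel_sum(1_000_000))
```

Each extra core lets another chunk run at the same time, which is why benchmarks on code like this scale with core count while a single-threaded program gains nothing.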

The case for so-called terascale processors on the PC seems tougher to make.
