Intel Tera – not firma, but coming

Is parallel processing the step beyond?

Here is a long-term question: is Intel starting the long walk away from the x86 architecture and towards what it sees as the "next big thing"?

There is certainly no immediate need to panic. Nothing is going to change for a good few years yet, but there has to come a time when the aged and venerable x86 fundamentals are put out to pasture to make way for something better.

That something is most likely to lie in the realm of parallel processing, if recent developments from Intel are any guide, and that is a world radically different from anything developers with x86 experience (or experience of most other architectures, for that matter) will have encountered before.

Two pointers have emerged from Intel over the last year. Last February the company bought Conformative Systems, a US start-up that had not got very far commercially but had done plenty of interesting development work on an XML accelerator server built round a specially designed 16-core parallel processor running as a "grid on a chip".

The company recently announced that it has "developed the world's first programmable processor that delivers supercomputer-like performance from a single, 80-core chip not much larger than the size of a fingernail, while using less electricity than most of today's home appliances".

"This is the result of the company's innovative 'Tera-scale computing' research aimed at delivering Teraflops - or trillions of calculations per second - of performance for future PCs and servers."

OK, so doing this research work is one helluva long step away from having a product, but to shout about it publicly means Intel wants to start some sort of ball rolling. And with AMD breathing down its neck and eating its breakfast, lunch and dinner in the x86 market, one of the best ways of fighting back is to move the goalposts and start a different fight. To which thoughts must be added the notion that the x86 architecture is long in the tooth, is contributing to the production of increasingly obese systems, and can't go on forever.

What's more, other vendors are starting to appear on the parallel processing horizon and are already addressing one of the big issues with such new devices – how the hell you program them.

Parallel processing means, or should mean, that the rules developers work to will be significantly different from those in common use today. That may not be the case, however. Stream Processors Inc, a fabless semiconductor start-up spun out of Stanford and MIT in the US, has come up with a parallel processing architecture that can be programmed in C. Its target market will be applications involving large-scale image manipulation – which increasingly could mean just about anything.
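
It is worth pausing on what "programmed in C" is likely to mean in practice. The sketch below is purely illustrative – a generic stream-processing style, not Stream Processors Inc's actual toolchain – but it shows the appeal: the developer writes ordinary scalar C for one pixel, and the compiler or hardware fans that kernel out across the cores.

    #include <stddef.h>
    #include <stdint.h>

    /* A per-pixel "kernel": plain scalar C. In a stream-processing
       model the programmer writes this once and the tools apply it
       to every element of the input stream in parallel. */
    static uint8_t brighten(uint8_t pixel)
    {
        unsigned v = pixel + 40;   /* simple brightness lift */
        return (uint8_t)(v > 255 ? 255 : v);
    }

    /* On a conventional x86 CPU this loop runs one pixel at a time;
       a stream processor would hand each core its own slice of the
       image and run the kernel on all of them at once. */
    void brighten_image(const uint8_t *in, uint8_t *out, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = brighten(in[i]);
    }

Note that nothing in the source code says "parallel": the data-parallel shape of the loop is what the tools exploit, which is exactly why a familiar language like C can front such unfamiliar hardware.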

What this also points to, however, is a possible trend towards more application-specific (or function/process-specific) servers and devices. Though Intel has undoubtedly found the parallel processing design skills of Conformative useful, one of the other stated reasons for buying the business was the XML acceleration capability – a function that fits the Intel business model well. This is where it takes a function – graphics, for example – that was a separate, application-specific card, first integrates it into the motherboard chipset, and then integrates it into the processor itself. With parallel processing, it may well be simpler, and architecturally more elegant, to keep application-specific servers at least logically separate.

Either way, the issue for developers wondering where their futures lie is that Intel has now made it very public that it thinks parallel is a strong contender. How strong may become clearer at the upcoming Solid State Circuits Conference, where the Tera-scale technology will be discussed for the first time.

A measure of the company's seriousness – and progress – will be whether it also mentions, even in passing, such words as "compiler". If it does, that parallel future could be a good few years nearer than we currently imagine. ®
