ARM vet: The CPU's future is threatened
Moore's Law must be repealed
Hot Chips: ARM's employee number 16 has witnessed a steady stream of technological advances since he joined that chip-design company in 1991, but he now sees major turbulence on the horizon.
"I don't think the future is going to be quite like the past," Simon Segars, EVP and head of ARM's Physical IP Division, told his keynote audience on Thursday at the Hot Chips conference at Stanford University, just north of Silicon Valley.
"There may be trouble ahead."
The microprocessor industry has enjoyed an almost unbroken streak of improvements, Segars said, citing advances in silicon manufacturing techniques, power reduction, and gadget-size and gadget-cost shrinkage – he brought along a 1983, $3,995 Motorola DynaTAC as a prop.
But the landscape is changing. The low-hanging fruit has been picked, and a new way of thinking will be needed to provide the world with the squillions of low-cost, low-power microprocessors that the increasingly mobile computing ecosystem requires – not to mention the everything-connected world described by the current buzz-phrase: "The internet of things".
Harkening back to when he joined ARM, Segars said: "2G, back in the early 90s, was a hard problem. It was solved with a general-purpose processor, DSP, and a bit of control logic, but essentially it was a programmable thing. It was hard then – but by today's standards that was a complete walk in the park."
He wasn't merely indulging in "Hey you kids, get off my lawn!" old-guy nostalgia. He had a point to make about increasing silicon complexity – and he had figures to back it up: "A 4G modem," he said, "which is going to deliver about 100X the bandwidth ... is going to be about 500 times more complex than a 2G solution."
[Chart: As speeds increase, the chip complexity needed to achieve them skyrockets]
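To put those figures in perspective, here's a quick back-of-the-envelope sketch: the 100x and 500x multipliers are Segars' own, while the per-bit reading and the scaling exponent are our extrapolation from them.

```python
import math

bandwidth_gain = 100    # 4G vs 2G bandwidth, per Segars
complexity_gain = 500   # 4G vs 2G complexity, per Segars

# Complexity per delivered unit of bandwidth: five times worse than 2G
per_bit = complexity_gain / bandwidth_gain
print(f"Complexity per unit of bandwidth: {per_bit:.0f}x the 2G figure")

# If complexity scales as bandwidth**k, then k = log(500)/log(100) ~ 1.35,
# i.e. comfortably super-linear
k = math.log(complexity_gain) / math.log(bandwidth_gain)
print(f"Implied scaling exponent: {k:.2f}")
```

In other words, each extra bit per second costs disproportionately more gates, which is the skyrocketing the chart above describes.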
The way that the 4G-modem problem will be solved, Segars said, will be by throwing a ton of dedicated DSP processing engines at it – which will, of course, require a lot of silicon real estate.
"But that's not so bad," he said, "because silicon's being scaled the whole time. But it's going to eat a lot of power, and power is the real problem."
ARM is a mobile-processor company, and mobile processors run on batteries – and Segars said that the power required to juice increasingly complex silicon is a system-level challenge. "The reason for that," he said, "is because batteries are pretty rubbish, really."
As silicon technologies have improved in comparative leaps and bounds, batteries haven't. "Historically," Segars said, "battery power has grown about 10 or 11 per cent per year, which unfortunately is not very well-matched with Moore's law."
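To see how badly matched those two curves are, here's a rough sketch assuming the 10 per cent annual battery improvement quoted above and a Moore's-law doubling of transistor budget every two years; the ten-year window is ours, purely for illustration.

```python
years = 10
battery_rate = 0.10           # per Segars: 10-11 per cent per year
moore_rate = 2 ** 0.5 - 1     # doubling every two years, roughly 41 per cent per year

battery_gain = (1 + battery_rate) ** years      # about 2.6x
transistor_gain = (1 + moore_rate) ** years     # 2 ** (years / 2) = 32x

print(f"Battery capacity after {years} years:  {battery_gain:.1f}x")
print(f"Transistor budget after {years} years: {transistor_gain:.1f}x")
print(f"Mismatch: {transistor_gain / battery_gain:.0f}x")
```

Even over a single decade the transistor budget pulls more than an order of magnitude ahead of the battery, and that is the gap that has to be closed with cleverer silicon rather than bigger batteries.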
1x Dairy Milk Bar
1x fishing rod
1x treadmill with dynamo
1x 30-something single woman
Not all that portable admittedly, but I've got a patent pending on a nationwide network of charging stations :)
What on earth has happened here?
A thoughtful, intelligent, fascinating and well written article from which I learnt rather a lot. Without any jokes, satire or the faintest smell of clickbait in it. Have I logged on to the wrong site?
Is there actually a continuing market for slightly faster kit at higher cost in the current climate? IMHO most kit has been running fast enough for the last couple of years, despite constant efforts to force us to buy more CPU to support the same functionality.
Extreme gamers can link a few GPUs together, data warehousers can add terabytes of SSD, and the rest of us can upgrade to Linux or Windows XP running LibreOffice ;-)
This article suggests it's time for software to catch up with the hardware.
Back to the 70's then?
Maybe the way to make these devices run faster is to tighten up the code. After all, we've been getting rather a lot of bloat whilst Moore's Law has applied. In the 70's, when processor time cost money, shaving time off your code gave a distinct advantage, and they didn't have cut'n'paste coders in that era.
I'd predict a trimming back of all those functions that only get used on the 5th Tuesday in February, to make what does get used rather a lot quicker.
Tux - possibly the home of better software.
Dedicated hardware best suited?
Isn't this rather obvious? The microprocessor exemplifies the concept of jack of all trades, master of none. Frankly, the only reason my netbook is capable of showing me animé is that there is enough grunt power to decode the video data in real time. But then my PVR with a very slow ARM processor can do much the same, as it pushes the difficult stuff to the on-chip DSP.
Likewise the older generation of MP3 players were essentially a Z80 core hooked to a small DSP, all capable of extracting ten hours out of a single AAA cell.
Go back even further: the Psion 3a was practically built upon this concept. A bloody great ASIC held an x86 clone (V30) plus sound, display, interfacing, etc. Things were only powered up as they were actually required. In this way, a handheld device not unlike an original XT in spec could run for ages on a pair of double-As.
As the guy said, batteries are crap. Anybody who uses their smartphone like it's their new best friend will know that daily recharging is the norm, plus a car charger if using sat-nav. So with this in mind, it makes sense to have the main processor "capable" without being stunning, and push off complicated stuff to dedicated hardware better suited for the task, that can be turned off when not needed. Well, at least until we can run our shiny goodness on chocolatey goodness!
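For what it's worth, the "power it up only when you need it" approach described above boils down to something like the following sketch; the block names, power figures, and calls are entirely made up for illustration, not any real device's driver interface.

```python
from contextlib import contextmanager

# Made-up power figures, purely for illustration
POWER_MW = {"cpu": 500, "video_dsp": 80}

@contextmanager
def powered(block):
    """Power a dedicated block up for the duration of a task, then gate it off."""
    print(f"power up {block} ({POWER_MW[block]} mW)")
    try:
        yield
    finally:
        print(f"power down {block}")

def play_video(frames):
    # Push the heavy decoding to the dedicated block; the CPU just shuffles buffers
    with powered("video_dsp"):
        for frame in frames:
            pass  # hand each frame to the DSP and display the result

play_video(range(3))
```

The Psion and PVR examples above are essentially this pattern in hardware: the expensive general-purpose core idles while a cheap dedicated block does the repetitive work.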