Exploding core counts: Heading for the buffers

Ferrari engine, meet go-kart

Gartner's analysis does, of course, leave out one important issue. The main bottleneck on system performance is arguably - and man, do people argue about this - the limit on main memory capacity and bandwidth inside systems. In many cases, customers upgrade server platforms not because they need more CPU cores, but because they want both more memory and more bandwidth into and out of the CPUs.

Moreover, for some workloads - this is particularly true of online transaction processing - the amount of work a machine can do is more affected by the number of disk drive arms and the bandwidth in the disk subsystems than by other factors, like the number of processor cores. In benchmark tests, server makers can get their server processors running at 95 per cent or higher utilization, but only a very well run big iron box running Unix can consistently stay at even a 60 to 70 per cent utilization rate on OLTP workloads.
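A back-of-envelope sketch of that disk-arm ceiling, with every figure assumed for illustration (drive count, IOPS per spindle, and I/Os per transaction are all hypothetical, not from any benchmark):

```python
# Rough sketch: OLTP throughput capped by disk arms, not cores.
# All numbers below are illustrative assumptions.
iops_per_drive = 180       # ballpark random IOPS for a 15K RPM spindle
drives = 24                # assumed number of drive arms in the array
ios_per_transaction = 10   # assumed random I/Os per OLTP transaction

# No matter how many cores sit idle, the transaction rate cannot exceed
# what the spindles can seek and serve.
max_tps = drives * iops_per_drive / ios_per_transaction
print(f"disk-bound ceiling: {max_tps:.0f} transactions/s")
```

With those assumed numbers the ceiling lands around 432 transactions per second, which is why adding drive arms (or, later, flash) moves OLTP numbers more than adding cores does.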

I/O and memory bandwidth issues keep the processors tapping their feet, waiting for data. IBM's mainframe operating systems and middleware, as well as end user applications, have been tuned and tweaked over decades to wring every ounce of performance out of the box and run at 90 per cent or higher utilization rates in production environments. But if you paid five or ten times what it costs to buy a RISC or x64 server, you would spend a lot of dough on tuning, too. And having done all that work, you would sure as hell think twice before moving those applications off the mainframe. Which is why mainframes persist.

The biggest issue, it seems, is that memory speeds have not even come close to keeping pace with processor speeds - a gap that has been mitigated to a certain extent by the thermal wall that processors have hit. This is giving memory speeds a chance to catch up, perhaps. But the fastest DDR3 memory on the market still tops out at 1.3 GHz, and that is still less than half the speed of, say, a Nehalem Xeon processor that will hit the streets later this quarter. And even if you could get the speeds of CPU cores and memory in line, that doesn't solve the capacity issue.
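The mismatch is easy to see with some quick arithmetic. This sketch assumes a DDR3-1333 channel (64 bits wide) and a hypothetical 2.93 GHz core issuing one 8-byte load per cycle - both are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope: one memory channel vs one core's appetite for data.
# All figures are assumptions for illustration.
ddr3_transfers_per_sec = 1333e6   # DDR3-1333: 1333 million transfers/s
channel_width_bytes = 8           # standard 64-bit memory channel
peak_bw_per_channel = ddr3_transfers_per_sec * channel_width_bytes  # ~10.7 GB/s

core_hz = 2.93e9                  # assumed clock for a Nehalem-class core
bytes_per_load = 8
naive_demand = core_hz * bytes_per_load  # ~23.4 GB/s if it loaded every cycle

print(f"channel peak:        {peak_bw_per_channel / 1e9:.1f} GB/s")
print(f"one core's demand:   {naive_demand / 1e9:.1f} GB/s")
```

Under these assumptions a single streaming core could outrun a memory channel by more than two to one, before you even add the other cores on the die - hence the caches, prefetchers, and multiple channels that modern chips lean on.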

Memory DIMMs can only get so dense at a given price per capacity, and motherboard makers can only put so many wires on the board for memory at a reasonable cost. The memory issue is not going away. But solving this will perhaps be easier than coping with software stacks that don't understand how to make use of so many threads. ®
