Hey, software snobs: Hardware love can set your code free

The two go hand in hand when scaling data-crunching systems

Comment In computing there are many, many different ways to run down other people’s work, not the least of which is: “OK, so they removed the bottleneck, but only by throwing faster hardware at it.”

The implication is that tackling an issue just with software is intrinsically better. When did you ever hear anyone say: “OK, so they removed the bottleneck, but only by better coding?”

The truth is that solving computing problems always involves both hardware and software; the trick is not to look for specific kinds of solution, but to find the most effective one. Of course, “effective” in analytical terms can be defined in a host of different ways – cost, speed of implementation, reliability and so on.

And to make matters more complex, the most effective solution to a given problem will vary with time. For example, when Relational Online Analytical Processing (ROLAP) was all the rage, star schemas were often woefully under-indexed. So the software fix of applying sensible indices was very effective – and much better than merely throwing hardware at the problem. These days, a more effective solution might be to chuck in a solid-state drive (SSD).
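
By way of a minimal sketch of the indexing fix – the table and index names here are invented, and SQLite stands in for whatever engine you actually run – this is the sort of thing we mean:

import sqlite3

# A toy star-schema fact table; names are made up for illustration only
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (date_key INTEGER, product_key INTEGER, amount REAL)")

# Without an index, filtering on product_key means a full table scan.
# A sensible index on the dimension key lets the engine seek instead.
conn.execute("CREATE INDEX ix_fact_product ON fact_sales (product_key)")

# EXPLAIN QUERY PLAN confirms whether the index is actually being used
for row in conn.execute("EXPLAIN QUERY PLAN "
                        "SELECT SUM(amount) FROM fact_sales WHERE product_key = 42"):
    print(row)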

There can be no doubt that SSD technology will speed up the I/O of a system, many times over in some cases, and can be spectacularly better than spinning disk in terms of random reads and writes. And as SSDs continue to plummet in price and get faster, we expect to see them used more and more to replace delicate software hand-tuning such as indices, calculated redundant columns, horizontal and/or vertical splitting of tables and so on.
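
If you want to see the random-read difference for yourself, a crude micro-benchmark will do it. This is a sketch only: the file path is an assumption (drop a large file onto the device under test), and the operating system's page cache will flatter any result unless you account for it:

import os
import random
import time

PATH = "testfile.bin"   # hypothetical large file on the drive under test
BLOCK = 4096            # 4KB reads, roughly what an index lookup costs
READS = 1000

size = os.path.getsize(PATH)
fd = os.open(PATH, os.O_RDONLY)
start = time.perf_counter()
for _ in range(READS):
    # Seek to a random offset and read one block, defeating sequential prefetch
    os.lseek(fd, random.randrange(0, size - BLOCK), os.SEEK_SET)
    os.read(fd, BLOCK)
elapsed = time.perf_counter() - start
os.close(fd)
print(f"{READS / elapsed:,.0f} random {BLOCK}-byte reads per second")

Run it against a spinning disk and then an SSD and the gap speaks for itself.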

And, to switch to another valid measure of efficiency, we would argue that SSDs generally require less maintenance and fewer design tweaks than software fixes like these, and are therefore better on that score too.

This is not to suggest you should abandon software/design changes and use SSDs in all analytical systems, but it does illustrate the point that “throwing hardware” at a problem can be the correct solution.

Quick wins for accelerating the performance of software can be found by increasing the speed and capacity of the system memory. True, it’s volatile, so if the plug is pulled you lose all that lovely data, but analytical systems work almost exclusively on copies of the data. Memory capacity continues to rise as prices continue to drop and systems with many gigabytes of main memory are now common.
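
That “copies of the data” point is worth dwelling on. Sketched with SQLite – the warehouse.db filename is hypothetical – the pattern is simply to bulk-copy the working set into RAM and query it there:

import sqlite3

disk = sqlite3.connect("warehouse.db")   # hypothetical on-disk source
ram = sqlite3.connect(":memory:")
disk.backup(ram)                         # copy the whole database into RAM
disk.close()

# Queries now hit memory; if the plug is pulled we lose only a copy,
# because the authoritative data is still safe on disk
print(ram.execute("SELECT count(*) FROM sqlite_master").fetchone())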

On the other hand, the volume of data continues to grow exponentially and we know that a great deal of data is accessed only rarely, so memory, while very useful, is never going to be the only answer.

For years we have got away with relying on CPUs acquiring ever-increasing numbers of transistors (just ask Gordon Moore) and becoming faster and faster. Recently, however, we have hit a hard limit in the speed of CPU cores and are resorting to using more of them to allow parallel processing. Resourceful types are also finding good use cases for GPUs – chips originally architected for graphics processing – as these are designed to perform mathematical operations at high speed across many cores.
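
The practical upshot is that analytical code increasingly has to be written to spread work across cores. A minimal sketch – the squaring workload here is just a stand-in for real number-crunching:

from multiprocessing import Pool
import os

def partial_sum(chunk):
    # Stand-in for a genuinely CPU-bound aggregation
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = range(10_000_000)
    n = os.cpu_count() or 1
    # Slice the data into one chunk per core, then reduce the partial results
    chunks = [data[i::n] for i in range(n)]
    with Pool(n) as pool:
        print(sum(pool.map(partial_sum, chunks)))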
