Planning to throw capacity at an IT problem? Read this first

Commoditisation is changing the way we work

Not so long ago the axiom “you get what you pay for” held true in IT. More expensive hardware and software generally produced better results. With the rise of open source and open standards, this is no longer the case.

In today's world, if you'll pardon the mangled aphorism, IT isn't about how much you spend, but how you set about using it.

For the most part, computers are "good enough", and have been for some time. Business advantage today comes from knowing how to use technology to either accomplish a tangible task, or rummage around through data to deliver meaningful, actionable insights. Throwing more hardware or software at the problem rarely gives an organisation an advantage any more.

There absolutely are problems out there where more is better; however, these tend to be fairly niche. More to the point, the use cases where raw grunt solves the problem are vastly outnumbered by those where simply using what is to hand effectively is what's called for.

Copy Data Management (CDM) is a fairly straightforward example of this. Very often companies make more copies of data than they need and fail to delete those copies when they are done with them. This results in inefficient use of storage resources, which ultimately has real-world consequences in terms of having to buy, manage, power and cool more gear.
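To make that concrete, here's a minimal sketch of the visibility problem CDM tooling addresses – not any vendor's product, just a walk over a hypothetical shared directory that hashes file contents and reports how many redundant copies are sitting about. The path and everything else here is illustrative:

# Minimal sketch: find duplicate copies of files by content hash.
# The path "/data/shared" and the reporting are illustrative only.
import hashlib
import os
from collections import defaultdict

def file_digest(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Group files under `root` by content hash; keep groups with more than one copy."""
    groups = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                groups[file_digest(path)].append(path)
            except OSError:
                continue  # unreadable files are skipped, not fatal
    return {digest: paths for digest, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_duplicates("/data/shared").items():
        wasted = (len(paths) - 1) * os.path.getsize(paths[0])
        print(f"{len(paths)} copies ({wasted} bytes redundant): {paths}")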

A novel example

CDM, however, is the easy, street-level example in this discussion. Let's ponder the novel The Dispossessed by Ursula Le Guin. Frequently billed as an "anarchist utopia" novel, it depicts an anarcho-communist society that is a study in all things IT.

The society in question is pathologically allergic to centralised planning or anything that smells like government. They have institutions of higher learning, but appear to be allergic to actually using them.

The society is facing a planetary drought. The rational approach would be doing a little legwork to find out how many desalination plants they need to survive, then building the parts in assembly-line fashion and having its precious few "desalination experts" train some unskilled labour in assembly, testing, maintenance and so forth. Instead, the society sends its experts to where a desalination plant is to be built and builds them one at a time.

Essentially the entire society in the book very nearly stupids itself into starvation, and indeed "solves" the problem by having a significant number of its own citizens agree to move to factory towns and die of starvation while making tools for agriculture.

Where The Dispossessed fits in with discussions about ineffectively used IT is that all the information needed to prevent the drought from becoming a societal disaster existed; the society simply chose not to use it. Computers weren't even needed. The society had a surplus of unskilled labour. It could have manually tracked expertise (or the lack thereof) against requirements on paper, used that data to encourage people to train in the sectors that needed them, and produced and installed the relevant equipment at the first sign of trouble. It didn't.

Basically, The Dispossessed is a novel about what happens to a society that refuses to use big data analytics for even the most critical problems. It's probably the most obvious use case ever, written decades before the buzzword was coined.

Back down to Earth

It's easy to see how CDM helps solve inefficiency. Infrastructure is tangible. You can wrap your arms around it. If a little bit of software can make you need fewer physical goods, that's really straightforward.

Similarly, the societal-level issues in The Dispossessed are exaggerated enough that the need for some basic data gathering and centralised planning slaps the reader in the face repeatedly. But what about subtler scenarios?

Marketing and sales types often run surveys of customers to extract information. How many of those really need to be run? How much of that information is buried in the data that the company already collects?

Engineering/production (hopefully) does lots of quality assurance testing on whatever hardware, software, services or physical goods are produced by an organisation. How efficiently is that data analysed? Is data from production equipment gathered as well? How are variables compared? How much efficiency exists to be wrung out of the testing regime by simply analysing the existing data appropriately?
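As a sketch of what "simply analysing the existing data" might look like, here are a few lines of Python that chew through a hypothetical QA results export and flag the production stations whose failure rates stand out. The file name and column names are assumptions for illustration, not anyone's real schema:

# Minimal sketch: mine QA results a company already collects, rather than
# commissioning new tests. The CSV file and its columns are hypothetical.
import csv
import statistics
from collections import defaultdict

def failure_rates(path):
    """Return failure rate per production station from a QA results CSV
    with (hypothetical) columns: station, result ('pass'/'fail')."""
    totals, failures = defaultdict(int), defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            station = row["station"]
            totals[station] += 1
            if row["result"].strip().lower() == "fail":
                failures[station] += 1
    return {s: failures[s] / totals[s] for s in totals}

def outliers(rates, sigmas=2.0):
    """Flag stations whose failure rate sits more than `sigmas` standard
    deviations above the mean -- a crude pointer at where to look first."""
    mean = statistics.mean(rates.values())
    stdev = statistics.pstdev(rates.values())
    return {s: r for s, r in rates.items() if r > mean + sigmas * stdev}

if __name__ == "__main__":
    rates = failure_rates("qa_results.csv")  # hypothetical export
    for station, rate in outliers(rates).items():
        print(f"Station {station}: failure rate {rate:.1%} is unusually high")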

What about gathering new data? I've seen numerous instances where the ability to throw a few sensors at a problem allows something on a production line to be tweaked, which ultimately reduces waste and failed units enough to avoid having to buy new (and expensive) equipment. Not needing new equipment means delaying a move to a larger facility, all for the cost of a handful of sensors and some developer time to interpret the data.
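And a similarly hedged sketch of the sensor idea: watch a stream of readings for drift away from a rolling baseline, so the line can be tweaked before the failed units pile up. The window, tolerance and simulated readings below are all made up for illustration:

# Minimal sketch: watch a stream of sensor readings for drift before it
# turns into failed units. The readings, window and tolerance are made up.
from collections import deque

def drift_alerts(readings, window=50, tolerance=0.05):
    """Yield (index, value, baseline) whenever a reading drifts more than
    `tolerance` (fractional) away from the rolling mean of the last `window`."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if baseline and abs(value - baseline) / abs(baseline) > tolerance:
                yield i, value, baseline
        recent.append(value)

if __name__ == "__main__":
    # Simulated temperature readings that creep upward, then jump.
    simulated = [100.0 + 0.001 * i for i in range(200)] + [108.0, 109.5]
    for i, value, baseline in drift_alerts(simulated):
        print(f"Reading {i}: {value:.2f} vs rolling baseline {baseline:.2f}")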

For decades, we've been trained to solve IT problems by throwing capacity at them. We have become a society so addicted to digital excess that hundreds of billions of dollars are spent every decade on equipment and software that promises nothing more than to be ever so slightly more efficient than what we had been buying before.

Big data analytics, automation, orchestration...all of it starts with simply knowing what you are using, how and why. Perhaps – just perhaps – it's time to move out of the comfort zone of the capacity and performance-based refresh cycle. It may well be worth investing the time, money and (oftentimes considerable) effort to document what exists and how it is used, and then to work with experts to find ways to do better. ®
