What kind of Big Data is yours? Is it data bauxite, data aluminium ... or data Dreamliner?

And can it beat a psychic octopus

Analysis Data is valuable. There, we’ve said it, do you feel better? The question is, has data as an information currency - and an entity in and of itself - become inherently more valuable?

Now that we have real time transactional big data analytics to enrich our lives, does this mean that the 1s and 0s inside every binary now somehow equate to a higher cost per pound, dollar and kilo than before? Is one type of data more valuable than another?

Yes, obviously, you could say that unstructured data is worth less than structured data, because some taxonomy and categorisation has already been applied to the latter, but is this wholly fair and correct?

Structured data is post-production processed data most of the time. Unstructured data is (potentially) still full of hidden treasure and just as valuable; we just haven’t mined, brined and refined it yet.

Those pushing the Big Data agenda are implicitly arguing that unstructured data is more valuable - and that they are the ones who can help you unlock its value.

“They” being pushers of analytics tools, MapReduce and NoSQL-databases, plus, increasingly, relational technologies and platforms that bridge into NoSQL and MapReduce.

Surely there must be some way to push this madness back into its box. Some kind of Economist-style Big Mac Index; the alpha nerds' idea of a “fun” exercise to ascribe universal value to currencies, with the US dollar as the yardstick.

Alas, there is no “international data value scale”, but surely there needs to be one. It’s something Gartner periodically speculates about, while "data evangelists" at big systems companies bang on about the “value” of data.

What we really need is a gauge and hierarchy of some kind. God forbid, even a Magic Quadrant would have been a start. (OK, maybe that's going too far.)

It's all the same when you squint at it

The trouble is, any two datasets are essentially heterogeneous. This core truism means that market valuation of data is not always appropriate. Is there some way of assessing the inner worth of data so that we can treat it with appropriate gravitas - and potentially even trade it?

Business analytics company SAS UK & Ireland may have come closest to the mark by commissioning its Data On The Balance Sheet study, carried out by the Centre for Economics and Business Research (Cebr) in June last year. The report discussed the economic arguments for placing “data as an asset” on the company balance sheet and in the wider financial reporting framework.

“Data does not have a physical presence and therefore may be considered to have an infinite life when compared alongside physical assets. However, data can depreciate quickly if it is readily outdated. Nevertheless, data must be accounted for," says the report.

As SAS reminds us, data is an “intangible private asset” in terms of the modern balance sheet. Or, in longer form, “data as an asset is defined as any resource controlled by a company which generates future economic benefits and has an associated cost or value which can be reliably measured.”

So much for economics, what does the industry think?

“Data itself has some value, but it's like an uncut gem. Should there be a scale? Certainly yes for things that are time critical. We should also be able to measure how trustworthy data is (for things like online reviews). But I've not seen any customer place a physical dollar value on data,” said Matt Smith, chief technology officer for the Northern EMEA region at Software AG.

Somebody who has actually put a value on data is Nigel Beighton, UK chief technology officer of Rackspace - and he disagrees with Smith.

“If data is lost or stolen it has a direct value in terms of governmental fines and the cost of regenerating it. Secondly, I don’t think any retailer today would NOT know the incremental value of greater data accuracy down to the penny and the pound,” he said.

“When I worked at lastminute.com we knew the exact cash value of search returns for any given type of journey request on the site. The trick is knowing what you want to do with any piece of data; then the value of it (or some notional meaningful significance) is easy to understand. If data cannot be ascribed to a specific task or process, then yes, essentially it remains valueless until that point,” added Beighton.

Beighton wouldn’t be pushed on what actual values lastminute.com put on its data.

But the realm we seem to be moving into is one where value attaches not to the generic zeros and ones themselves, but to the information those zeros and ones carry - based on where they hail from, the subject matter they cover, or the insights they reveal.

Adding value to processed data = £££

Matt Ballantine, founder of Stamp London, a digital consultancy, goes along with this analysis. According to Ballantine, the value of the data comes from how it’s used.

“Bauxite, for example, sells at around $45/ton and aluminium for around $2,000 a ton. In the form of a Boeing 787 Dreamliner - OK, I'm taking a few liberties to make the point - it’s about $1.8m per ton. Where's the value? Processing, design and manufacture,” Ballantine says.
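
Ballantine's back-of-the-envelope arithmetic can be sketched in a few lines. The figures below are the illustrative ones from his quote, not market quotes, and the point is the multiplier each processing stage adds:

```python
# Rough value per ton at each stage of processing, per Ballantine's
# illustrative figures (liberties taken, as he admits).
stages = {
    "bauxite (raw ore)": 45,             # ~$45/ton
    "aluminium (refined)": 2_000,        # ~$2,000/ton
    "Boeing 787 Dreamliner": 1_800_000,  # ~$1.8m/ton
}

raw = stages["bauxite (raw ore)"]
for stage, per_ton in stages.items():
    # Show each stage's price and its multiple over the raw ore
    print(f"{stage}: ${per_ton:,}/ton ({per_ton / raw:,.0f}x raw ore)")
```

On those numbers, the finished aircraft is worth about 40,000 times the raw ore per ton - which is the whole argument for where the value of data sits.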

He adds a reality check for all those big-data zealots and tech companies who think they’ve got analysis cracked using their clustered servers and machine learning.

“There was lots of PR buzz this week surrounding the fact that Microsoft Cortana accurately predicted the results of 15 out of 16 World Cup matches. That's great. But Paul the Octopus managed 12 out of 14, which statistically, ain't that far off. Perhaps the real value of big data analytics is only marginally better than a psychic octopus," he notes.
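
How far apart are 15 out of 16 and 12 out of 14, really? A quick sketch comparing the raw hit rates, plus the probability a pure coin-flipper matches or beats each record (a binomial tail), under the simplifying assumption that every match is a 50/50 call:

```python
from math import comb

def hit_rate(hits, games):
    # Straightforward proportion of correct predictions
    return hits / games

def coin_flip_tail(hits, games, p=0.5):
    # P(X >= hits) for X ~ Binomial(games, p):
    # the chance a coin-flipper does at least this well
    return sum(comb(games, k) * p**k * (1 - p)**(games - k)
               for k in range(hits, games + 1))

for name, hits, games in [("Cortana", 15, 16), ("Paul the Octopus", 12, 14)]:
    print(f"{name}: {hit_rate(hits, games):.1%} "
          f"(coin-flip chance: {coin_flip_tail(hits, games):.4f})")
```

Cortana's 93.8 per cent against Paul's 85.7 per cent: both records would be extraordinary for a coin-flipper, and the gap between them is, as Ballantine says, not that far off.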

Do we need a data value scale then? Or should we rest easy in the knowledge that data needs a purpose before it can be measured against any value scale?

Let’s resurrect Paul the Octopus from the dead, point him at a pile of binary numbers and work from there. ®
