Tell me, professor, what is big data?

A whole different dimension, says data scientist Mark Whitehorn

Big data may be misunderstood and overhyped, but the promise of data growth enabling a goldmine of insight is compelling. Professor Mark Whitehorn, the eminent data scientist, author and occasional Register columnist, explains what big data is and why it is important.

Sometimes life is generous and hands you an unexpected gift on a plate. Our esteemed editor asked me to write about how big data is misunderstood.

On the very day I was writing, the Grauniad reported that the Oxford English Dictionary has just defined big data as “data of a very large size, typically to the extent that its manipulation and management present significant logistical challenges”.

Nooooo!!!!!!!! No. No. No. Really, no.

And I certainly don’t think that defining big data by the three 'V's (velocity, volume and variety) helps to add veracity, validation or value either. So, what is big data?

Little and large

In the beginning there was data. Data is just data.

Data is not large and it is not small
It does not live and it does not die
It does not offer truth and neither does it lie
(with apologies to Michael Moorcock)

In other words, data has always existed as both big data and small data (OK, so “small data” is not a real term, but it is useful here as a distinguisher).

But an often overlooked fact is that all data is very difficult to handle properly. We have used databases since the 1960s but it wasn’t until 1993 that we even understood transactions properly.

I interviewed the late, great, Jim Gray, who said: “I spent decades working on the problem of getting transaction integrity to work at all and on ACID [atomicity, consistency, isolation, durability] properties and how they can be implemented efficiently.”

Jim was awarded a Turing Award (the computing equivalent of a Nobel) for this work. The bottom line is that storing, editing and querying data properly is very, very difficult.
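
To make “transaction integrity” concrete, here is a minimal sketch of the kind of operation Gray’s work underpins, using an invented Account table (the names are hypothetical, and the exact BEGIN/COMMIT syntax varies slightly between database products):

-- Move 100 from one hypothetical account to another. Either both
-- updates persist or neither does (atomicity), and other sessions
-- never see the money in both places at once (isolation).
BEGIN TRANSACTION;
UPDATE Account SET Balance = Balance - 100 WHERE AccountID = 1;
UPDATE Account SET Balance = Balance + 100 WHERE AccountID = 2;
COMMIT;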

So we may have always had both big and small data but in the early days we focused on the data that was easiest to manipulate. This turned out to be the data which is simple (atomic) and fits neatly into columns and rows. In other words, we focused on data that fits neatly into tables. This is small data.

Another feature of small data is that we typically want to query it by simple sub-setting.

Think about a SQL query:

SELECT Name, DateOfBirth
FROM Employee
WHERE Salary > 40000;

The FROM chooses the table, the SELECT subsets the columns and the WHERE subsets the rows.

SQL is exceptionally good at sub-setting and wildly incompetent at comparing sequential rows. This is because SQL is for set manipulation and sets are by definition unordered; there is no concept of sequential rows in a table.
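
To see the contrast, try asking how each value in a table compares with the one before it. Pure set-based SQL has no notion of “the previous row”; you have to impose an order explicitly, for example with a window function (a much later addition to the language). A sketch, assuming a hypothetical Reading table with ReadingTime and ReadingValue columns:

-- The OVER clause imposes an explicit ordering; without it there is
-- no “previous row” for LAG to refer to.
SELECT ReadingTime,
       ReadingValue,
       ReadingValue - LAG(ReadingValue) OVER (ORDER BY ReadingTime) AS Change
FROM Reading;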

Leave the table

So, for the past 40 years we have focused on the easy stuff – tabular data that can be manipulated by sub-setting. In other words, we have focused on small data.

One characteristic of big data is that it does not fit easily into tables – good examples are image and audio files.

Another is that we don’t want to query it by sub-setting; the queries are more complex than that. Some data has both of these characteristics, some just one.

As an example of the first characteristic, image files can be broken down into individual pixels and each pixel stored as a row in a table, but you probably don’t want to do that. Even if you did, a query looking for all the pictures that contain a vintage Bentley is not simply sub-setting by rows and columns.
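
If you were determined to tabularise an image anyway, the table might look something like this (a purely hypothetical layout):

CREATE TABLE Pixel (
    ImageID INT,
    X       INT,
    Y       INT,
    Red     SMALLINT,  -- colour channels, 0 to 255
    Green   SMALLINT,
    Blue    SMALLINT
);

A single 12-megapixel photograph becomes 12 million rows, and “contains a vintage Bentley” still cannot be written as a predicate over any of those columns.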

As an example of the second, sensor data can be tabularised quite easily (although you will end up with a very narrow, mind-bogglingly deep table), but the queries we run against this kind of data typically look for patterns formed across sequential rows rather than sub-setting. So SQL is not the query language of choice here.
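
The narrow, deep table itself is easy enough to sketch (the names here are invented for illustration):

CREATE TABLE SensorReading (
    SensorID     INT,
    ReadingTime  TIMESTAMP,
    ReadingValue FLOAT
);

But a question such as “find every run of five consecutive readings in which the value climbs each time” is about the shape of an ordered sequence, not about picking rows and columns, and it sits awkwardly in a language built for set manipulation.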

OK, so big data doesn’t fit well into tables and we need to be able to run queries that are more complex than sub-setting.

Why is big data worth pursuing?

At this point I usually start to describe social data analysis or scanning aerial photographs for hidden aeroplanes (both excellent examples). But I also love machinery and am very taken with the melding of big data and combine harvesters. So we will look at that.
