Big Data: Why it's not always that big nor even that clever

And as for data scientists being sexy, well...

Can you imagine your bank calculating your current account using tweets and Facebook posts?

The other change contributing to the purported Big Data paradigm-shift is an explosion in the variety of data. As mentioned above, companies like Google and Facebook have to juggle and create “graphs” for profiles or demographic information from a very large number of sources in as many formats. But this certainly isn’t true of everyone. When people talk about these new, messy collections of unstructured data, they’re almost always referring to data sourced from social networks and blogs.

Will core systems used at banks (to do transaction-processing, say, an area still completely dominated by old-school relational databases) really need to use social-media data? What about inventory systems, or digital catalogs, or systems used by cancer researchers? And what about data that, for whatever reason, can’t be processed in a distributed, stateless way?

Highly unstructured data still occupies a specialized and relatively tiny niche, though it’s a very flashy one. And it’s not as if no system in the world has had to do parsing, translating, or pre-processing to merge data from multiple sources before.

If a company that’s been around for years suddenly argues that it needs Big Data techniques to run its business, it must mean that either the nature of that business has changed fundamentally overnight or it's been hobbling along forever with systems that don’t quite work. Either of those claims would be hard to believe.

Make your system scalable and, hey presto, you can do Big Data

The term Big Data is often used sloppily, if not downright incorrectly. The mere increase in the amount of data being made available for a given application, no matter how big that increase may be, doesn’t automatically make it a Big Data application. The system will need to handle more capacity, but that might require nothing more than a few design tweaks, and maybe not even that if the system was designed to be scalable.

One article I read recently on the subject of textual analysis (looking for patterns across all books by a given author, for example) gave the impression that this was something that had never been done before, because it couldn't be.

But computer-based textual analysis has been going on forever. I remember being fascinated by studies of the frequencies and patterns of words in Shakespeare’s plays in the 1970s. What is new, if anything, is the number of texts available in digital form, which may itself have spurred interest in super-large-scale book-crunching.

If these texts had been available 20 years ago, and there had been interest then in looking for these kinds of patterns, it's almost inconceivable that scientists would have thrown up their hands and said, "we just can't do it with the technology we have." And even without knowing the details of how these kinds of analyses are done today, I find it very unlikely that there aren't traditional sorting and searching algorithms used somewhere in the code. There's nothing inherently Big Data about this.
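To see how little exotic machinery this kind of analysis needs, here is a minimal sketch of word-frequency counting over a text, using nothing but a hash-based counter and a regular expression, the sort of traditional tools that have been standard for decades. (The sample text and function name are illustrative, not taken from any particular study.)

```python
import re
from collections import Counter

def word_frequencies(text: str) -> Counter:
    """Count word occurrences with a plain hash-based counter --
    no distributed framework required."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

sample = "To be, or not to be, that is the question."
freq = word_frequencies(sample)
print(freq.most_common(3))
```

Scaling this to a whole digitised corpus is a matter of streaming files through the same counter, or splitting the work across machines, which is a capacity question, not a conceptual one.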

As for the gesture-level information being collected by Netflix for marketing or advertising purposes (say, what parts of a movie people were likely to skip over, where they paused, or what scenes they watched repeatedly), that data is being analyzed and possibly used for nefarious purposes simply because it’s available.

If DVD-rental companies could somehow have seized that kind of low-level information in the 1990s they probably would have been capable of analyzing it, if they’d chosen to. What’s responsible for this disturbing trend is the ability of movie-rental companies to capture gesture information, their relatively new interest in using it, and the collaboration between companies to fit their individual pieces of these social puzzles together.

“Data” hasn’t become the root of all evil overnight, any more than it’s become the only thing that matters. And blaming Big Data for everything that’s wrong with the world is no better than scrapping all your existing, non-Big Data technology because it’s suddenly “obsolete.” ®
