
Big Data: Why it's not always that big nor even that clever

And as for data scientists being sexy, well...


Can you imagine your bank calculating your current account using tweets and Facebook posts?

The other change contributing to the purported Big Data paradigm shift is an explosion in the variety of data. As mentioned above, companies like Google and Facebook have to juggle profile and demographic information from a very large number of sources, in as many formats, and stitch it into "graphs". But this certainly isn't true of everyone. When people talk about these new, messy collections of unstructured data, they're almost always referring to data sourced from social networks and blogs.

Will core systems used at banks (to do transaction-processing, say, an area still completely dominated by old-school relational databases) really need to use social-media data? What about inventory systems, or digital catalogs, or systems used by cancer researchers? And what about data that, for whatever reason, can’t be processed in a distributed, stateless way?

Highly unstructured data still occupies a specialized and relatively tiny niche, though it’s a very flashy one. And it’s not as if no system in the world has had to do parsing, translating, or pre-processing to merge data from multiple sources before.

If a company that’s been around for years suddenly argues that it needs Big Data techniques to run its business, it must mean that either the nature of that business has changed fundamentally overnight or it's been hobbling along forever with systems that don’t quite work. Either of those claims would be hard to believe.

Make your system scalable and, hey presto, you can do Big Data

The term Big Data is often used sloppily, if not downright incorrectly. The mere increase in the amount of data being made available for a given application, no matter how big that increase may be, doesn't automatically make it a Big Data application. The system will need more capacity, but that might require nothing more than a few design tweaks, and perhaps not even that if the system was designed to be scalable in the first place.

One article I read recently on the subject of textual analysis (looking for patterns across all books by a given author, for example) gave the impression that this was something that had never been done before, because it couldn't be.

But computer-based textual analysis has been going on forever. I remember being fascinated by studies of the frequencies and patterns of words in Shakespeare’s plays in the 1970s. What is new, if anything, is the number of texts available in digital form, which may itself have spurred interest in super-large-scale book-crunching.

If these texts had been available 20 years ago, and there had been interest then in looking for these kinds of patterns, it's almost inconceivable that scientists would have thrown up their hands and said, "we just can't do it with the technology we have." And even without knowing the details of how this kind of analysis is being done today, I find it very unlikely that there aren't traditional sorting and searching algorithms used somewhere in the code. There's nothing inherently Big Data about this.
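To be clear about the sort of thing I mean (this is my own minimal sketch, not the pipeline of any particular study, and the file names are purely illustrative): counting word frequencies across a set of texts needs nothing more exotic than a hash table and a sort, the kind of job that ran perfectly well decades ago.

  # A minimal word-frequency sketch: hash-table counting plus a sort.
  # File names below are hypothetical examples, not real data sets.
  import re
  from collections import Counter
  from pathlib import Path

  def word_frequencies(paths):
      """Count word occurrences across all the given plain-text files."""
      counts = Counter()
      for path in paths:
          text = Path(path).read_text(encoding="utf-8", errors="ignore")
          counts.update(re.findall(r"[a-z']+", text.lower()))
      return counts

  if __name__ == "__main__":
      freqs = word_frequencies(["hamlet.txt", "macbeth.txt"])
      for word, n in freqs.most_common(20):   # plain old sorting
          print(f"{word:15} {n}")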

As for the gesture-level information being collected by Netflix for marketing or advertising purposes (say, what parts of a movie people were likely to skip over, where they paused, or what scenes they watched repeatedly), that data is being analyzed and possibly used for nefarious purposes simply because it’s available.

If DVD-rental companies could somehow have captured that kind of low-level information in the 1990s, they probably would have been capable of analyzing it, had they chosen to. What's responsible for this disturbing trend is the ability of movie-rental companies to capture gesture information, their relatively new interest in using it, and the collaboration between companies to fit their individual pieces of these social puzzles together.

“Data” hasn’t become the root of all evil overnight, any more than it’s become the only thing that matters. And blaming Big Data for everything that’s wrong with the world is no better than scrapping all your existing, non-Big Data technology because it’s suddenly “obsolete.” ®


