Big Data about to bottom out, says Gartner
Users can't get no 'satisfiction', but they try, but they try, but they try ...
Big Data hype has peaked and adopters are about to enter Gartner’s dread trough of disillusionment, says one of the firm’s analysts, Svetlana Sicular.
Hype about Big Data is certainly prevalent: here at Vulture South the term is often thrown around by vendors who in past years were content to describe their data-crunching products as offering ‘business intelligence’ or ‘business analytics’ tools. Big Data has even been suggested to us as applicable to small business, despite such organisations seldom possessing the infrastructure or expertise to put it to work.
Sicular says enthusiasm is waning even for Hadoop, the tool most often associated with Big Data: adopters who are making good progress “… do not realize that they are ahead of others and think that someone else is successful while they are struggling.”
Lots are struggling, she says, because “they are disappointed with a difficulty of figuring out reliable solutions.” Sentiment analysis, a customer-mood-detecting technique often touted as a way to monetise user-generated content, is proving tough, as vendors have yet to meet users' needs.
“Difficulties are also abundant when organizations work on new ideas,” Sicular writes, especially when organisations try to link unstructured data sources.
Sicular offers the following as one case study of a Big Data user beginning to feel disillusioned:
“Several days ago, a financial industry client told me that framing a right question to express a game-changing idea is extremely challenging: first, selecting a question from multiple candidates; second, breaking it down to many sub-questions; and, third, answering even one of them reliably.”
“Formulating a right question is always hard, but with big data, it is an order of magnitude harder, because you are blazing the trail (not grazing on the green field),” she adds.
Another reason for the loss of confidence in Big Data is that it does not deal in absolutes: instead it produces what Sicular describes as “a proof of your hypothesis with a certain degree of confidence” rather than a concrete answer. That kind of result means users need to be satisfied with what she calls “satisficing” solutions, “the first solution that appears good enough”.
Sicular therefore expects Big Data will get bad press for a while. Starting … now! ®
I believe it
Only yesterday morning, the Today programme on Radio 4 ran a 'big data – the next big thing!' style piece. If it's finally made it onto the radar screens of the BBC's fuckwitted technology correspondents, it's not merely peaked, it's dead and buried.
Searching for diamonds
If you don't know what a diamond even looks like to begin with, sifting through a bigger bucket of shit won't help you find one. And if you do know, you also know that's not where you should be looking anyway.
We don't need no f*g schema!
... that is all
@Anon 16 Re: Searching for diamonds
Clearly you don't grok Hadoop.
If you did you'd understand that your M/R is Java code. Or it could be streaming C/C++ code.
There are plenty of use cases that prove you wrong.
Hadoop is a parallel framework. Pretty basic in concept.
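To spell out what "streaming" code means here: with Hadoop Streaming the mapper and reducer are just ordinary programs talking over stdin/stdout, so they can be Python as easily as Java or C/C++. A minimal word-count sketch (illustrative only; the local "shuffle" below stands in for Hadoop's sort phase):

```python
#!/usr/bin/env python3
"""Hadoop Streaming-style word count, simulated locally (sketch)."""
from itertools import groupby


def mapper(lines):
    """Emit one tab-separated 'word\t1' pair per word, as Streaming expects."""
    for line in lines:
        for word in line.split():
            yield f"{word.lower()}\t1"


def reducer(lines):
    """Sum counts per word. Input must arrive sorted by key,
    which Hadoop's shuffle phase guarantees between map and reduce."""
    parsed = (line.split("\t") for line in lines)
    for word, group in groupby(parsed, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"


if __name__ == "__main__":
    # Local simulation of map -> sort (shuffle) -> reduce.
    text = ["big data big hype", "big trough"]
    for out in reducer(sorted(mapper(text))):
        print(out)
```

In a real cluster you would wire the same two programs together with the hadoop-streaming jar's `-mapper` and `-reducer` options; the point is simply that the framework parallelises the plumbing, not your logic.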
It's a wonder that any recruiter ever called on you.
The fail is for you.
Another reason for a loss of confidence in Big Data is that it does not deal in absolute but instead produces what Sicular describes as “a proof of your hypothesis with a certain degree of confidence” rather than a concrete answer.
Obviously, since that's what every discovery process does. Even tautological ones (those that only involve the manipulation of formal abstractions, i.e. mathematics) are only "proofs" under axiomatic assumptions about such things as the proper functioning of the reasoning mind.
So, welcome to Bayesian reasoning, Gartner. Glad to see you could make it.
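The point is easy to make concrete with Bayes' theorem: evidence turns a prior into a posterior degree of confidence, never a flat yes/no. A small Python sketch, with entirely made-up numbers:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """P(H | E) via Bayes' theorem: how much confidence one positive
    piece of evidence buys you, given your prior."""
    numerator = p_evidence_given_h * prior
    marginal = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / marginal


# Hypothetical figures: a 1% base rate, and a signal that fires 90% of
# the time when the hypothesis is true but 5% of the time when false.
p = posterior(prior=0.01, p_evidence_given_h=0.9, p_evidence_given_not_h=0.05)
print(f"{p:.3f}")  # roughly 0.15: confidence, not a concrete answer
```

Even a strong-looking signal over a rare hypothesis leaves you at about 15 per cent, which is exactly Sicular's "proof … with a certain degree of confidence".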
This is Descartes' "evil genius" argument: you can't prove that there isn't some "evil genius" with the capacity to force you to believe erroneously that some construction is logically valid. These days, neurobiologists are pretty close to constructing real tools to achieve that, between pharmacological agents and EM manipulation of CNS processes, coupled with the use of functional MRI to determine when and where to apply the tech. Experimenters have already shown they can erase (or render inaccessible) a subject's memory of a specific event, for example.