Some first-wave big data projects 'written down' says Deloitte

Not enough data a problem for some, while Hadoop integration has proved tricky

Consultancy outfit Deloitte reckons early big data projects have had to be written down because they failed, thanks in part to a “buy it and the benefits will come” mentality.

The source of failure was sometimes difficulty making open source software work and/or integrate with other systems, Deloitte Australia's technology consulting partner Tim Nugent told The Reg. Such failures weren't because the software was of poor quality. Instead, organisations weren't able to make it do meaningful work because they lacked the skills to do so. Integrating big data tools with other systems also proved difficult.

The attempt to develop those skills while also staying abreast of the many changes in the field of big data proved hard for some, Nugent said. Happily, vendors and services providers have since come up to speed and are making it easier for organisations to adopt the likes of Hadoop so they can get big data's enablers working.

Regulation has also made big data projects tough, with Nugent saying “organisations that have pushed the boundaries on personal data use and retention have experienced difficulties in responding to regulatory forces and government scrutiny.”

The business end of big data has also struggled, Nugent said, because those who hoped to consume insights from big data didn't know what they were looking for, or how to turn output into action. Many organisations also assumed that the data they had collected would be sufficient to produce the promised searing insights.

“The orientation now is about bringing data into the businesses,” Nugent said. “In the past it was focussed on integration and consolidation.”

Nugent's opinions are based on observations gleaned in Australia, so perhaps the experience is not universal. If it is, or is in the same postcode as universal, it's bad news for big data advocates, because what Nugent has described runs counter to most big data hype, namely that insights are just waiting to be disinterred from the data you already have and will quickly prove impactful.

Another of Nugent's insights offers happier news: IT teams, he says, have stood up to be counted and are working happily alongside the business people who want to wield big data. And big data-centric tools like scale-out storage have worked as advertised. ®
