It’s not the data you have, it’s what you do with it
Computacenter’s Bill McGloin looks ahead
Companies may “miss out on gaining business advantage because of failure to optimise stored data”, according to a survey conducted on behalf of Computacenter in June 2013.
About 60 per cent of companies find it hard to analyse the information stored on their systems. While they may be able to cope with the volume and velocity of their data, their infrastructure is not optimised to help them find and use information effectively.
Computacenter’s practice leader for data optimisation, Bill McGloin, points to one reason for this difficulty.
“One of the challenges is that customers still have some legacy infrastructure, and they are still holding their data and information in several different places,” he says.
“So what they are doing is almost pilot analytics. They want to see what value they can get out of the data they have. But when people run pilots based on just pockets of the data they hold, they can be disappointed with the outcome.
“The customers driving most value from their data are those who have made the effort to consolidate it and are analysing it from several streams.”
However, change is afoot. “We have spent a lot of time over the last few years ensuring that any solution we offer to customers is flexible and scalable to meet both the known future business requirements and the unknown ones,” he says.
“Time was when the unknown would have been acquisitions, but now we are finding that the unknown is how much value customers want to get out of their data.
“We are starting to see a trend for customers to consolidate all their data. Rather than project by project, the corporate decision is to hold data in a central location so the company can start to drive value from it.”
McGloin warns that organisations that simply store data will find it harder to thrive. He also counsels against keeping the existing infrastructure and bolting on a separate analytics environment into which companies copy all their data for analysis.
“This is an expensive approach because you are doubling up. That is why we are starting to see vendors bringing technology to market that negates that,” he says.
McGloin sees changes in how Computacenter customers want to architect their environments and consolidate their data, while “vendors are starting to bring new solutions to market to enable companies to access their data in any fashion or format”.
He says: “There has been a rise in data warehouse solutions like Oracle Exadata and IBM Netezza that consolidate structured data. We are also seeing a rise in object-based storage, and a rise in vendor activity to develop that type of solution.
“For example, there was the recent release of EMC ViPR software-defined storage, which means the underlying data can be consolidated in a single storage pool and presented in whichever fashion the user wants.”
Computacenter provides an analytics-ready infrastructure for its customers and has partner organisations that will plug in to provide value, but it does not provide analytics expertise and methodology.
Hoarder next door
“We are working with a large number of customers to get their environment ready and we are offering solutions that provide much higher levels of consolidation and optimisation than we have seen before,” says McGloin.
In spite of this consolidation, companies are hanging on to all their data, a trend that McGloin thinks is set to continue.
“In 2010 there was a wave of real data growth, at which point people started to adopt deletion policies,” he says.
“That has now come full circle and people are adopting non-deletion policies. This is partly for compliance reasons but also because companies think they might be able to get value from this data.
“In fact people are not getting a huge amount of value from their historic legacy information, but they are driving more value from more recent data.
“We are seeing a move towards a two-tier model of data: a performance tier for active data which can be analysed in real time and a capacity tier for longer-term, less critical data.
“The performance tier still comes at a premium, although prices are falling rapidly and capacities are increasing. The capacity tier is more affordable.
“We are also seeing an uptake in disk-archiving solutions, whether on-premise or in the cloud, so we are taking a cloud-based archiving service to market.”
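The two-tier model McGloin describes can be sketched as a simple placement policy driven by how recently data was touched. This is an illustrative sketch only: the tier names, the 90-day threshold, and the function itself are assumptions for the example, not Computacenter's or any vendor's implementation.

```python
from datetime import datetime, timedelta

# Assumed threshold for the example -- a real policy would be tuned per workload.
HOT_WINDOW = timedelta(days=90)

def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Place active data on the premium performance tier (real-time analytics),
    and older, less critical data on the cheaper capacity tier
    (e.g. a disk archive on-premise or in the cloud)."""
    return "performance" if now - last_accessed <= HOT_WINDOW else "capacity"
```

For instance, data last touched a month ago would land on the performance tier, while data untouched since 2010 would fall to the capacity tier.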
McGloin says there are plenty of business analytics success stories among the enterprises and blue chips that form much of Computacenter’s customer base, though he is reluctant to name names.
“The customers that are successful do fairly wide-scale analytics. We have customers with online mail-order businesses who are running analytics that can track which products are likely to be popular so they can be promoted on the home page or targeted at relevant customers. We also have customers in finance and regional banking who are able to make decisions in real time or close to real time.
“They are getting to market faster with new products, as they can see what people are interested in quicker.”
Bring in the experts
McGloin notes that the skill set is evolving. “At larger customers we will provide the infrastructure and partner with someone who provides the software and sets up the reports, but there will be expertise in-house at our customers,” he says.
“No one has large teams yet, but certainly some of our customers are putting analytics teams in place.”
He expects strong data growth to continue. “Forty per cent growth in data per annum is a reasonable rate to expect to continue,” he says.
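At that rate the arithmetic compounds quickly. A back-of-envelope sketch, taking the 40 per cent figure from the quote above and assuming an example starting capacity of 100TB:

```python
def project_growth(start_tb: float, annual_rate: float, years: int) -> float:
    """Compound growth: capacity * (1 + rate) ** years."""
    return start_tb * (1 + annual_rate) ** years

# At 40 per cent a year, 100TB roughly doubles in two years
# and grows more than five-fold in five.
print(round(project_growth(100, 0.40, 2), 1))  # 196.0 TB
print(round(project_growth(100, 0.40, 5), 1))  # 537.8 TB
```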
“I think over the next couple of years we will see some changes in technology that will start to make life easier for our customers, like consolidation of data and data sets that are easier to manipulate, but growth in data is not slowing down.
“I also think the current trend to keep everything won’t change, but where people keep it might change.
“As we start to understand the types and age of data that customers are getting the most value from, the legacy data won’t necessarily be used to provide analytic value, but it will continue to be retained.” ®