Information is power, right? Well, just how powerful must most organisations feel today, given the amount of information they are packing? The immediate irony is that the opposite is generally true – we create data with gay abandon, but our ability to keep tabs on everything that we create is showing itself to be increasingly inadequate. So what’s the answer?

We shouldn’t be too harsh on ourselves of course, as many of the problems around data are beyond anyone’s capabilities to have predicted or indeed solved in advance. For a wide variety of reasons lost in the mists of time, many organisations have highly fragmented data stores containing overlapping and sometimes contradictory data.

We could ‘blame’ the siloed nature of IT systems, but silos often exist for good reason, not least that it is actually possible to define a set of requirements for a single system. We might have complained about functional specifications being far too complicated and detailed, but at least they will have been simpler than the (theoretical) specs that tried to do everything for everyone.

Traditional procurement and economic models of IT have also played their part, along with stakeholder management, aka: “Who’s going to pay for the thing?” Again, it is far simpler to get funding and agreement for something that does a reduced set of things well, rather than trying to deliver the ultimate answer to everybody’s needs.

As a result we end up with the kinds of IT environments we see today, with lots of systems, each having their own little pockets of data. Packaged applications were supposed to change all that, of course, but IT rarely keeps up as departments re-organise, and companies merge, divest and change. And so we arrive at situations such as one organisation having eight ERP instances, or as one person told me yesterday, 49 separate directories and other stores of identity information.

The challenges are legion. Data quality remains an issue for many organisations, as information stored months or even years ago contains errors of varying forms, from spelling mistakes to storing important data items in the wrong fields because that was the only option at the time. Accessibility and interoperability are burdensome, pushing the Shangri-La single view of the customer forever just out of reach. Compliance with regulations remains nothing short of onerous, and of course, we have the operational overheads of managing, supporting and backing up the stuff.

Even addressing these issues assumes that you know what data exists in your organisation, and who’s using it for what purpose. After all, if you don’t know what you have, how on earth can you do anything to improve its lot? In today’s increasingly distributed environments, fragmentation is only going to get worse, raising the risk of incidents we’ve seen hit the press such as data leakage.

Vendors would no doubt say that they have solutions to all of these problems, from data discovery and cleansing, through high-end extract-transform-load (ETL) and more specific data leakage prevention tools. But such offerings may only help you tread water, improving on the past and getting things a bit more ship-shape. They don’t fix the underlying lack of control, structure and discipline.

The alternative starting point is to worry first about what information you need, before paying too much attention to what you should be doing with it. Information modelling exercises that start with business activities and their supporting information needs do give a better handle on what should be seen as high-priority information – which then can inform exercises to identify what state that information is in, for example.

All kinds of methodology and tooling exist around such identification and mapping exercises. Enterprise architecture frameworks such as TOGAF expand on the different kinds of information, and technologies exist to discover, map and maintain information models that result. Meanwhile, master data management, for example, is about identifying the key information assets of the organisation, and linking the ‘master’ view of such assets to the different places where they are instantiated – again, various tools exist to support such efforts.
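The master data idea described above – one ‘master’ view of an asset, cross-referenced to the places where copies live – can be sketched in a few lines. All of the names, systems and records below are hypothetical, purely to illustrate the linking-and-reconciliation pattern, not any particular MDM product:

```python
# Illustrative sketch: a master customer record cross-referenced to the
# identifiers used by each system that holds its own copy of the customer.
# Every name and value here is invented for the example.

MASTER_CUSTOMERS = {
    "CUST-0001": {
        "name": "Acme Ltd",
        # Where this customer is instantiated across the estate
        "instances": {
            "erp_eu": "40012345",
            "erp_us": "A-98-3321",
            "crm": "acme-ltd",
        },
    },
}

# Each system's own (possibly inconsistent) copy, keyed by (system, local id)
SYSTEM_RECORDS = {
    ("erp_eu", "40012345"): {"name": "Acme Ltd."},
    ("erp_us", "A-98-3321"): {"name": "ACME Limited"},
    ("crm", "acme-ltd"): {"name": "Acme Ltd"},
}

def local_views(master_id):
    """Return every system's copy of a master record, keyed by system."""
    master = MASTER_CUSTOMERS[master_id]
    return {
        system: SYSTEM_RECORDS[(system, local_id)]
        for system, local_id in master["instances"].items()
    }

def find_conflicts(master_id, field):
    """List systems whose value for `field` disagrees with the master view."""
    master_value = MASTER_CUSTOMERS[master_id][field]
    return [
        system for system, record in local_views(master_id).items()
        if record.get(field) != master_value
    ]
```

Even this toy version shows where the real effort goes: `find_conflicts("CUST-0001", "name")` flags the two ERP instances whose spelling of the customer name drifts from the master record – and deciding which spelling wins is an organisational question, not a technical one.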

The problem with areas like this is that they can all too easily become part of the problem too. In this industry we are very good at building Towers of Babel: huge edifices that started out trying to solve a problem that was simple to describe, but easy to over-complicate. Data asset mapping, discovery and classification exercises can end up with overblown descriptions of every single entity used in the business, whether it’s important or not. Much time can be spent trying to optimise such things, slowed down by ‘committee debates’ as different parts of the organisation put their oar in. As one enterprise architect at an oil company once said to me: “We can’t even agree on what a ‘reserve’ is.”

The way we build and manage IT systems is becoming ever more complex, but the idea of solving complexity with even more complexity does not bode well. We will have to look for simpler ways of accessing, treating and managing information.

Part of the answer involves remembering the key question: “What problem are we trying to solve here?” If holding back the ocean (which makes a change from trying to boil it) is not a long-term strategy, a more important target is what constitutes a good-enough set of solutions to meet the higher priority needs of business users.

Or just maybe methodologies, tools and technologies really will one day cut a swathe through all those pesky repositories, and build a comprehensive view of everything that matters to your business, past and present. While this scenario is unlikely, if everything really could be automated to such an extent, there would be little to differentiate organisations that were good at managing information from those that weren’t. To extrapolate further, IT that ‘just worked’ might very quickly become no more than a utility, offering little in the way of competitive advantage. While today’s environments might be complex and fragmented, as the old adage goes, we should be careful what we wish for. ®
