Mixed messages for master data management

IBM conference breakdown

Opinion IBM's first European user conference on MDM (master data management) in Barcelona was certainly interesting, but I felt that the results were mixed.

For example, some users clearly felt the conference was useful, while others were disappointed. One investment bank I talked to felt the content was insufficiently technical and, in particular, that it did not address either scalability issues or the complex, heterogeneous environments which most companies have to manage. Still, I suppose you can't please everybody all of the time.

Another criticism I heard from several attendees was that the Gartner analyst who was supposed to set a framework for the conference was speaking at the end of the first day rather than the beginning.

I had rather more serious reservations about this particular presentation: I disagreed with his segmentation of the market, and did not feel that a sufficient distinction was made between MDM on the one hand and instances of MDM such as CDI (customer data integration), GSM (global supplier management) and PIM (product information management) on the other.

I will start with this second issue. To me, CDI, PIM and the like are subsets of master data management (MDM). MDM is, essentially, not domain-specific. There are two reasons why this is important. The first is that if vendors, users, pundits et al. continue to treat CDI, GSM and similar applications as separate and distinct, then we will end up with exactly the same siloed applications that have been the bane of IT for the last two decades. The second, illustrated by the example that follows, is that real applications frequently need to span more than one master data domain.

Suppose, for example, that you want to implement a master data management solution for SLAs (service level agreements). That would involve a customer, a supplier, various services, a contract and metrics. In other words, such an application would span multiple master data domains, some of which are about people (customers, suppliers) and some of which are about things (services, the contract itself, metrics).

Even if we assume that existing CDI solutions can easily be extended to support suppliers, and that PIM solutions can encompass contracts and services, there seems to be a consensus that PIM and CDI require significantly different capabilities, so they will remain separate systems. What we don't want is to have to build an application that has to span two different systems, which would put us straight back into the EAI (enterprise application integration) territory that we'd rather avoid.

So, given that "people" and "things" are distinct, what is needed is a common set of services that supports as much of both as possible in a single platform, with only a specific, minimised set of differentiated capabilities relating to CDI, PIM or whatever sitting on top of that platform. This platform then becomes the MDM base upon which application instances are implemented.
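To make that layering concrete, here is a rough sketch, in Python, of what a common platform with thin domain-specific capabilities on top might look like. Every name in it is invented for illustration; it does not describe any vendor's actual product.

    # A minimal, hypothetical sketch of the layering described above:
    # generic master data services in one platform, with thin
    # domain-specific capabilities (CDI, PIM, ...) layered on top.
    from dataclasses import dataclass, field
    from typing import Dict, List


    @dataclass
    class MasterRecord:
        """Domain-neutral master record: works for people and for things alike."""
        domain: str                                             # e.g. "customer", "supplier", "product", "contract"
        attributes: Dict[str, str] = field(default_factory=dict)
        source_keys: List[str] = field(default_factory=list)    # keys in the contributing source systems


    class MDMPlatform:
        """Common services shared by every domain: store and find records
        (a real product would add matching, survivorship, governance and so on)."""

        def __init__(self) -> None:
            self.records: List[MasterRecord] = []

        def register(self, record: MasterRecord) -> None:
            self.records.append(record)

        def find(self, domain: str, **criteria: str) -> List[MasterRecord]:
            return [r for r in self.records
                    if r.domain == domain
                    and all(r.attributes.get(k) == v for k, v in criteria.items())]


    class CustomerServices(MDMPlatform):
        """Thin CDI-specific layer: only what is genuinely customer-specific."""

        def households(self) -> Dict[str, List[MasterRecord]]:
            # Group customer records that share a postal address.
            grouped: Dict[str, List[MasterRecord]] = {}
            for r in self.find("customer"):
                grouped.setdefault(r.attributes.get("address", ""), []).append(r)
            return grouped

An SLA application of the kind described above could then work with customers, suppliers, services, contracts and metrics through one platform, rather than stitching a CDI product and a PIM product together with EAI.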

As far as how you break down the market is concerned, I felt that the model presented was too simplistic. In our view, there are three approaches that you might take: a registry, where you hold a reference mapping to the various participating applications; a repository, where you extend the registry by including a mapping to the elements that make up the "best record"; and a hub, where the best record is actually held within the hub. This contrasts with the model that was presented, which only differentiated between hubs (albeit with sub-classes) and registries. A more detailed discussion of our approach is included in Harriet Fryman's recently published MDM Report.
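As a rough illustration of the three approaches (again with structures invented for this article, not taken from any product), the essential difference is what each one actually holds:

    # Hypothetical structures illustrating the registry / repository / hub distinction.
    from dataclasses import dataclass, field
    from typing import Dict


    @dataclass
    class RegistryEntry:
        """Registry: only a cross-reference to the participating applications."""
        master_id: str
        source_keys: Dict[str, str] = field(default_factory=dict)          # system name -> local key


    @dataclass
    class RepositoryEntry(RegistryEntry):
        """Repository: the registry plus a mapping that says which source supplies
        each element of the best record (the data itself stays in the sources)."""
        best_record_sources: Dict[str, str] = field(default_factory=dict)  # attribute -> system name


    @dataclass
    class HubEntry(RepositoryEntry):
        """Hub: the best record is actually held, and mastered, here."""
        best_record: Dict[str, str] = field(default_factory=dict)          # attribute -> value

Because each structure is a strict extension of the previous one, this also hints at the migration path discussed below: a registry entry can be enriched into a repository entry, and ultimately into a hub-held best record, without discarding the original cross-references.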

The speaker at the conference spent very little time discussing registries, but the importance of registries and repositories is that they are much less disruptive (and less expensive) to implement. For example, Purisma claims a typical implementation time of just four to six weeks compared to the months, or even years, that you might take over a hub. The scalability issues referred to above also go away.

It is our belief that what users would really like is a way to start with a registry (or repository) and the ability to migrate, progressively, to more complex implementations.

Copyright © 2006, IT-Analysis.com
