
It is now a decade since a nascent technology market called master data management (MDM) emerged, aiming to tackle the inconsistency that dogs most large organisations in handling key shared data such as customer, product and asset data.

A survey run by market research firm the Information Difference back in 2008 showed that, on average, a large company had six different systems claiming to be the one and only source of customer data, and nine claiming to be the trusted source of product data.

This lack of consistency in shared data is more than an inconvenience: it makes it very difficult for an organisation to understand its business performance.

Answering basic questions like “what is my most profitable product line?” or “which assets are the costliest to maintain?” proves challenging if there are multiple definitions and codes for products and assets, and if different parts of the business allocate costs differently.
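To see why, consider a deliberately simplified sketch in Python (all product codes and figures here are invented): when the same product is recorded under different codes in different systems, a straightforward revenue roll-up silently splits it across several lines.

```python
from collections import defaultdict

# Toy sales records drawn from two systems that code the same
# product differently; every identifier and figure is invented.
sales = [
    {"product_code": "WIDGET-01", "revenue": 120_000},  # from the ERP system
    {"product_code": "WGT001",    "revenue": 95_000},   # from the web shop
    {"product_code": "GADGET-7",  "revenue": 80_000},
]

revenue_by_product = defaultdict(float)
for row in sales:
    revenue_by_product[row["product_code"]] += row["revenue"]

# Without an agreed master product code, the same widget appears
# twice, and neither line shows its true total revenue.
for code, revenue in sorted(revenue_by_product.items()):
    print(f"{code}: {revenue:,.0f}")
```

Until someone decides that WIDGET-01 and WGT001 are the same product, no report built on this data can answer the profitability question correctly.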

Of course, stamping out such issues was what ERP was supposed to do, but quite evidently it did not.

One major company I work with admits to having 650 separate major applications, just one of which is its ERP system.

Very few global companies have a single ERP instance: dozens of separate instances within a company are common, hundreds not unheard of.


Given this diversity of operational transaction systems, the idea of MDM is to establish a dedicated hub that can act as a trusted single source for master data.

There are different approaches, but the goal is always to end up with a single authoritative source of data about customers, products, locations, people, assets and the like.

This source can be used to populate other applications, such as a corporate data warehouse, which can in turn rely on this data to be of high quality and to be the definitive version.
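A minimal sketch of that hub idea, assuming a simple cross-reference table (all system names, keys and fields below are invented), might look like this:

```python
# Golden records: the single trusted version of each customer,
# keyed by a hub-assigned identifier. All data here is invented.
golden_customers = {
    "C-1001": {"name": "Acme Holdings Ltd", "country": "GB"},
}

# Cross-reference table: (source system, local key) -> golden key.
# This is how the hub knows that three different local identifiers
# all describe the same customer.
xref = {
    ("erp", "40017"):   "C-1001",
    ("crm", "ACME-UK"): "C-1001",
    ("web", "u82911"):  "C-1001",
}

def resolve(system, local_key):
    """Return the trusted golden record for a source system's key."""
    golden_key = xref.get((system, local_key))
    return golden_customers.get(golden_key)

# Each system's local key resolves to the same trusted record,
# which downstream consumers such as a warehouse can rely on.
print(resolve("erp", "40017"))
print(resolve("crm", "ACME-UK"))
```

In practice the hard work lies in building and maintaining that cross-reference, through matching rules and data stewardship, but the pattern of many local keys resolving to one golden record is the essence of the hub.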

A host of products sprang up over the past ten years to address this need, some wider in scope than others, but all with the core aim of fixing inconsistent, poor-quality master data.

The same market research firm conducted a survey in the summer of 2012 to update this picture: how MDM has evolved, how successful implementation projects have been, and what they cost overall.

This survey followed on from a similar one in 2008, the idea being to see what differences, if any, had emerged.

One immediate difference was the longevity of the MDM applications: the average application had been live for four years, whereas in 2008 the average was barely over a year.