Master data management became widely noticed around 2004, though there were a few early adopters prior to that. So, almost a decade on, how has the implementation of MDM progressed?
In 2008 The Information Difference, which in full disclosure is my organisation, ran a survey on MDM adoption, and has recently completed a repeat of that survey, bringing the picture up to date. Because the survey deliberately asked the same questions, it is an opportunity to see how things have progressed in the last five years, in which time MDM has become mainstream.
The first intriguing thing is that the core reason why MDM got started in the first place, the multiplicity of competing sources of master data, has not changed. In the 2008 survey the median respondent had 15 competing sources of master data; in the 2013 survey the figure was also 15. This suggests that the underlying transaction system landscape is as complex as it ever was. Indeed, the larger companies in the survey had hundreds of sources of master data. The most common reason given for initiating an MDM project was the same as in 2008: “To be able to consolidate master data from multiple disparate systems.” The need is demonstrated in the answer to a survey question about data warehousing: in the 2013 survey just 18% reckoned that they had a single data warehouse for their enterprise, actually down from 23% in 2008, so it is clear that there is still a very diverse data landscape out there.
In terms of how mainstream MDM is, in the new survey 23% of respondents indicated that MDM was now a well-established and on-going activity in their organisation, almost double the 13% of 2008, though this suggests there is still plenty of scope for expansion. Also, just 5% of respondents admitted to abandoning their MDM initiative, better than the 8% in 2008, presumably reflecting greater experience and improved technology in the intervening period. Indeed, 60% of those responding rated their MDM project as “successful” or better, with just 9% “unsuccessful” or worse.
The average cost of an MDM project was $3 million in 2013, down from $5 million in 2008, suggesting that companies are getting better at implementing such projects, or possibly reducing their scope. However, 56% wished for a unified master data platform (a touch down from 59% in 2008), which suggests that there is still a desire for a single over-arching approach to master data rather than separate solutions optimised for different data domains such as product or customer. In this survey not one of the 108 respondents claimed to have a single source of customer data: the median number of competing sources was 10, actually up from six in the 2008 survey. The picture was similarly diverse for product data, with a median of six competing sources, down a little from nine in 2008.
One trend that I have observed recently is the increasing acceptance that data quality is an inherent part of an MDM project. In the 2013 survey, 21% of organisations rated their data quality as “high” or better, up from 15% in 2008; most rated it “fair”. Also, the proportion of organisations with no data quality technology at all has fallen from a third to 15%. This suggests that some progress at least is being made in the uphill battle with data quality in large organisations. The proportion recognising that poor data quality is costing them at least $1 million annually has also doubled in the five-year period. However, even today fully one third of the survey respondents are not measuring the cost of poor master data. Some 14% of respondents believed that bad data was costing them more than $10 million annually, up from 10% in 2008.
One clear trend in the survey is the greater acceptance of packaged MDM solutions rather than building in-house ones. In 2008 as many as 18% were building custom MDM applications in-house, a figure that dropped to 10% in 2013.
The overall picture painted by the two surveys is of an industry that is maturing, but one that still has a long way to go. Too few companies are measuring the costs of poor data quality, and so are missing a trick in justifying investment in measures to improve it. Still, fewer than a quarter regard MDM as “well established” within their organisations. However, it is encouraging that the failure rate has dropped, and that most companies now have at least some data quality technology. The MDM market is still growing rapidly, a racy 24% in 2012 according to The Information Difference figures. Even in 2013 I have observed a couple of brand new entrants to the MDM software space, a decade after the discipline started to become widely recognised. It will be interesting to see what the results of the 2018 MDM survey bring.