by Angela Guess
Andy Hayler has written an article for Computer Weekly on the importance of solid data quality to any Master Data Management effort. He writes, "We are all familiar with examples of the poor data quality that pervades large organisations. How many misspelt versions of your name appear on letters and bills sent to you, for example? There are several underlying reasons for such issues. Firstly, there are basic issues around the quality of data captured by companies. Human beings respond to incentives and the ones doing data entry are not the highest paid people in an organisation. If they are in sales, they care about getting your credit card details right – because otherwise they won't get paid commission – but other information about you may be less carefully attended to."
He continues, "Once data is captured, though, a new set of issues starts to creep in. Data gets out of date quite quickly: in the US, 15% of people move address annually according to the US Census Bureau (in the UK it is about 11%). How confident are you that all the companies and government departments that you interact with are racing to update that personal data of yours? However, at this stage we have been talking pure data quality – is that address record right or wrong? There is a more insidious problem in large companies and that is data mastering. According to a 2008 survey by my firm, Information Difference, the average large company has six different systems holding supposedly 'master' data about customers, nine in the case of product data, and 13% of survey respondents had over 100 such sources."
photo credit: AMagill