It always astounds me that, in this digital age, far too many organisations still do not understand or appreciate the importance of data quality. For a manufacturer, it would be inconceivable to release a product to market without extensive quality control. Yet this is, in effect, exactly what many organisations do with the data they rely on.
Duplication is a major issue, arising from factors such as customers re-registering because they have forgotten their passwords. The risk here is that you end up marketing to a customer who has already purchased that exact product - not only a waste of your time and money, but also likely to convince customers that you are incompetent and deter them from purchasing again.
It is likely that they will question the quality of your products. They are also highly unlikely to recommend your organisation to their peers. And don’t get me started on the likelihood of mailing to countless Mickey Mouses and other bogus names!
The importance of an ongoing, continually maintained data quality process cannot be overstated. It is not enough to build or purchase a dataset and expect to expand and utilise that data ad infinitum without investing in its maintenance. After all, if a vehicle is not regularly maintained, it will eventually break down and leave you stranded.
Data works in a similar way - it all appears brand new and sparkly at first, but can quickly get dirty if not looked after. Data quality involves ensuring the accuracy, timeliness, completeness and consistency of data used by an organisation, while also making sure that everyone who uses the data has a common understanding of what it represents.
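Checks on those dimensions can be automated. The following is a minimal sketch only - the field names, rules and thresholds are all illustrative assumptions, not taken from any particular organisation's data:

```python
# Illustrative checks for three of the quality dimensions named above:
# completeness, consistency and timeliness. All field names and
# thresholds here are assumptions for the sake of the example.
from datetime import date

def check_record(record):
    """Return a list of quality issues found in one customer record."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for field in ("name", "email", "postcode"):
        if not record.get(field):
            issues.append(f"missing {field}")
    # Consistency: a valid email contains exactly one '@'.
    email = record.get("email", "")
    if email and email.count("@") != 1:
        issues.append("malformed email")
    # Timeliness: flag records not verified within (an assumed) two years.
    verified = record.get("last_verified")
    if verified and (date.today() - verified).days > 730:
        issues.append("stale record")
    return issues

record = {"name": "A. Smith", "email": "a.smith@example.com",
          "postcode": "", "last_verified": date(2020, 1, 1)}
print(check_record(record))  # → ['missing postcode', 'stale record']
```

Run routinely rather than once, checks like these turn data quality from a one-off clean into the ongoing maintenance the vehicle analogy calls for.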
It should not be a case of firefighting, either. Sadly, many organisations do not even realise that they have poor data because they have not invested in that area. Only when a problem arises do they find out. They may receive a complaint from a customer who has been sent the same marketing correspondence several times, each addressed to a different spelling of their name. Obviously, this is a bad experience for the customer and clear evidence of duplication in the data.
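The spelling-variant duplicates described above can often be caught by normalising records before comparison. This is a simplified sketch under assumed field names - real matching would also handle nicknames, transposed fields and address variants:

```python
# A minimal duplicate-detection sketch: collapse case, punctuation and
# spacing so that spelling variants of the same customer share one key.
# The record fields ("id", "name", "postcode") are assumptions.
import re
from collections import defaultdict

def normalise(name):
    """Lower-case, strip punctuation and collapse whitespace."""
    name = re.sub(r"[^a-z\s]", "", name.lower())
    return " ".join(name.split())

def find_duplicates(customers):
    """Group records sharing a normalised (name, postcode) key."""
    groups = defaultdict(list)
    for c in customers:
        key = (normalise(c["name"]), c["postcode"].replace(" ", "").upper())
        groups[key].append(c["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

customers = [
    {"id": 1, "name": "John O'Brien", "postcode": "SW1A 1AA"},
    {"id": 2, "name": "JOHN OBRIEN",  "postcode": "sw1a1aa"},
    {"id": 3, "name": "Jane Doe",     "postcode": "EC1A 1BB"},
]
print(find_duplicates(customers))  # → [[1, 2]]
```

Even this crude key would have merged the differently spelled records behind the complaint above before the mailing went out.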
It may be just the tip of the iceberg in terms of the overall data quality. But by then it could be too late, with irreparable damage already done to finances and brand image. Getting back on track would then require a significant amount of work and money, all of which could have been avoided by fixing the problem at source - that is, proper cleansing and maintenance of the data from the outset.
Whether it is a case of insufficient financial commitment, a lack of understanding, or sheer apathy, I am not too sure. But there is really no excuse. There are now many resources that can be utilised, such as commercial software that can cleanse, maintain and analyse your data, or consultants, such as myself, who are passionate about data and keen to evangelise its importance.
Wasted manpower and postage, incorrect invoicing, misplaced shipments, customer dissatisfaction, compliance failures, flawed strategy, security exposure. Poor data can have a negative impact in so many areas, not least the bottom line - the very area the powers-that-be think they are protecting by economising on the analysis and maintenance of their data. After all, data is a valuable business asset and should be treated as such.
(David Harvey was formerly a data quality analyst at Microsoft)