Truth and reconciliation: harder in data even than in politics?

David Reed, director of research and editor-in-chief, DataIQ

Countries that have undergone internal conflict and civil war often find that a process of truth and reconciliation is valuable. By acknowledging the faults and failings of the past in a fully transparent way, society and government can be restored to a more trusted relationship. Often, this requires a neutral party to step in and mediate between the conflicting positions. From South Africa to Northern Ireland, the impact of this approach is clearly visible.

Data may not suffer from the same degree of bloodshed and fighting. But entrenched positions, blame and finger-pointing are no less evident when it comes to establishing the underlying truth of any given data point. How often have you wasted time in meetings trying to reconcile different sets of numbers, only for the discussion to break down into highly charged infighting?

It is no surprise, therefore, that the datification of many companies undergoing a digital transformation is often led by a focus on business intelligence. Creating a “ministry of truth” that ensures business-critical numbers are consistent and aligned is a key first step. Investing in data to remove disruptive discrepancies is essential for any organisation that wants to unlock the value of its data asset.

The way in which this gets done is evolving, not least with the emergence of sophisticated new tools and a diminishing of the “keep everything now, figure it out later” approach to big data. If you have heard mocking references to “data swamps” recently, it is because of a growing understanding that the integrity and credibility of data cannot be assumed unless there have been strong governance and quality checks first.

Yet, as with political settlements, it can take decades before this need for truth and reconciliation gets confronted. Take the world of supply chain management: electronic data interchange (EDI) between manufacturers and retailers predates the internet. There is even a trusted third-party standard, in the form of GS1 UK, which provides the barcodes you find on everything you purchase and which should, in theory at least, mean all product data is aligned up and down the supply chain.
 
Given that framework, how is it that data analysts in retail organisations still have to spend hours reconciling stock data? Poor governance is part of the answer. If you don’t enforce accuracy on the way in, it will be absent from eventual outputs. When a logistics manager enters “1” in the quantity field for a newly delivered product, that could mean a single pack, a multi-pack or a pallet, unless either the metadata is well-defined or there are clearly differentiated codes for each variant.
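That kind of entry-time check can be sketched in a few lines. The example below is a minimal, hypothetical illustration: the record format, the `validate_delivery` function and the unit codes (EA/PK/PL) are all invented for the sketch, not drawn from any real GS1 or EDI schema.

```python
# Hypothetical sketch: enforce a unit-of-measure code at data entry,
# so a quantity of "1" is never ambiguous between a single unit,
# a multi-pack or a pallet. All names here are illustrative assumptions.

UNIT_CODES = {
    "EA": "each (single unit)",
    "PK": "multi-pack",
    "PL": "pallet",
}

def validate_delivery(record: dict) -> dict:
    """Reject a delivery line unless quantity is paired with a known unit code."""
    qty = record.get("quantity")
    unit = record.get("unit_code")
    if not isinstance(qty, int) or qty < 1:
        raise ValueError(f"quantity must be a positive integer, got {qty!r}")
    if unit not in UNIT_CODES:
        raise ValueError(f"unit_code must be one of {sorted(UNIT_CODES)}, got {unit!r}")
    return record

# A bare "1" with no unit code is rejected at the point of entry...
try:
    validate_delivery({"quantity": 1})
except ValueError as err:
    print(err)

# ...while an explicit unit code passes through unambiguously.
print(validate_delivery({"quantity": 1, "unit_code": "PL"}))
```

The design point is simply that the check happens on the way in: a record that would be ambiguous downstream never reaches the reports the BI team later has to reconcile.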
 
That is why business intelligence managers are often the shock troops of any data transformation. Sending in the BI enforcers to identify pain points, set standards, police adherence and deliver better reports as a result reveals the benefits of good-quality data to line-of-business customers.
 
From there, an organisation can progress towards value-driving activities, such as predictive analytics, just as newly harmonised countries get to develop their economies (take South African wine as just one example).
 
That kind of positive outcome is what every business leader, like every peace-making politician, wants to achieve. And it only comes on the other side of a painful process of admitting past mistakes. Which may be why so many organisations struggle on with the misery of unreconciled data…
 

