Data owners often struggle to ensure their data is not just accurate and secure, but also attracting the right royalty rate from its users. David Reed looks at a new service that aims to provide an end-to-end process covering all of the bases.
If your business captures data through five activities and uses five channels to market, there are twenty-five data streams that need to be managed and co-ordinated. Double that number if you then distribute the data through a couple of strategic partnerships.
Now consider the role of external data sets in validating and enhancing your data at point of entry or cleaning and scrubbing at the point of distribution. Those fifty feeds could easily become one hundred key points at which data enters and exits the business.
With the exception of rare single product and even rarer single channel operations, it is obvious that data management rapidly escalates into a complex process. With a long supply chain across and beyond the enterprise, the challenge of maintaining the completeness, integrity and availability of data is clearly growing.
And it doesn’t end there. Data governance structures and the need to demonstrate compliance mean each of those points of data entry, usage and transfer need to be closely controlled, monitored and audited. Little wonder that the majority of companies have yet to create a closed loop environment that provides them with proper management at every stage.
It is precisely because of this gap that DQM Group has developed its Secure Distribution Hub, a suite of services and solutions that link up these multiple data management stages, apply the appropriate inputs and provide management information at the back end. For any business wanting to gain full control and a better picture of how their data is flowing through the supply chain, it looks like a godsend.
“It is an amalgam of services, parts of which are relevant to almost every business and for many, the whole of it will be appropriate,” says Peter Galdies, director of DQM Group. “The sum is greater than the parts because of the synergy between them.”
Senior management is fond of talking about the need for a 360-degree view of the business, but is less keen on undertaking the work required to achieve it. In data management, most of the processes found in the Hub are likely to be in use at various stages or by parts of the business. Having a pre-configured, joined-up solution removes much of the pain of trying to gain this holistic insight.
Galdies says that usage of the Hub is often driven by adversity. “A typical scenario is that something is happening, for example the business has commercialised its data and something has gone wrong. That might be that forecast returns are not being realised,” he says. To understand what is happening, the data owner needs to be able to track how its information has been used, compare that usage against the contract and review pricing and revenues achieved.
Problems do not have to be the only reason why a business might sign up for the Hub. “It could be that the company is taking its data responsibilities seriously. It is in their interests to ensure the supply chain is compliant,” says Galdies.
By providing an end-to-end solution (including database hosting if required), DQM Group is supporting both data governance and business goals. Right from the first step, the Secure Distribution Hub puts best practice and monitoring into place. In the load module, where data enters for the first time, records are compared against the expected structure and description; any that do not fit the expected parameters are returned.
“It is a quality filter because if the data doesn’t match what we are expecting, it won’t match what the end user is expecting,” says Galdies. Multiple templates can be set up to reflect different types of data or agreements about how it will be used. The matching is done on the fly to avoid slowing down the rest of the chain.
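The load module's quality filter can be pictured as a simple split of an incoming feed into records that match a pre-agreed template and records that are returned. The sketch below assumes a template of per-field patterns; the field names, patterns and template format are illustrative assumptions, not DQM's actual schema.

```python
import re

# Illustrative template: each expected field mapped to a validation pattern.
# Multiple such templates could be held, one per data type or agreement.
TEMPLATE = {
    "postcode": re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$"),
    "telephone": re.compile(r"^0\d{9,10}$"),
    "surname": re.compile(r"^[A-Za-z'\- ]+$"),
}

def validate(record: dict) -> bool:
    """A record passes only if every templated field is present and well-formed."""
    return all(
        field in record and pattern.match(record[field])
        for field, pattern in TEMPLATE.items()
    )

def load(records):
    """Split an incoming feed into loadable records and returns."""
    accepted, returned = [], []
    for record in records:
        (accepted if validate(record) else returned).append(record)
    return accepted, returned
```

Because each record is checked independently as it arrives, this kind of filter can run on the fly without holding up the rest of the chain.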
In the next step, the preference module can apply pre-loaded house “do not contact” lists to filter out non-permissioned records. “This should be the last port of call for internal stop lists to double check records haven’t crept back in,” he notes. Commercial and statutory screening files are then applied, such as the Telephone Preference Service (if records are going to be called), or optional files such as the Mailing Preference Service and deceased and goneaway registers.
Galdies notes that screening during the Hub process maximises the time during which data is available. So where TPS screening is used, the required 28-day cycle can be used to its optimum by taking out registered numbers using the Hub, rather than during an earlier, upstream data consolidation process. Clients can choose whether records are suppressed completely, have elements blanked out (such as the telephone number) or a flag applied.
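The three client choices described above (full suppression, blanking the screened element, or flagging) can be sketched as a single screening pass. The field names and the idea of matching on telephone number alone are illustrative assumptions.

```python
def screen(records, stop_numbers, mode="suppress"):
    """Apply a stop list (e.g. TPS-registered numbers) to a record set.

    mode: "suppress" drops matching records entirely,
          "blank" keeps them but removes the telephone number,
          "flag" keeps everything and marks the match.
    """
    out = []
    for record in records:
        if record.get("telephone") not in stop_numbers:
            out.append(record)
            continue
        if mode == "suppress":
            continue                        # drop the record entirely
        record = dict(record)               # copy: don't mutate caller's data
        if mode == "blank":
            record["telephone"] = ""        # keep the record, remove the number
        elif mode == "flag":
            record["tps_registered"] = True # keep everything, mark the match
        out.append(record)
    return out
```

Running this step late in the chain, as the article describes, means the 28-day TPS window starts as close to the point of use as possible.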
The distribution module comes into play if a data set is being licensed to multiple different end users, each taking a specific cut or profile of the records. The configuration for their version of the file is pre-set in this module to ensure they receive data exactly as they are expecting it.
Seeding is then applied. “If the data is going to 50 different licensees, they get 50 different sets of seeds,” points out Galdies. The templates created for the load module play an important role here because they ensure that seed records appear in the file looking exactly like the original records. This also applies to any specific selections, so if a licensee or user is only taking data for a set geography, only seeds from within that region will appear. The number of seeds added at this stage is also proportionate to the file size.
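The two seeding rules above, matching the licensee's selection and scaling with file size, might look like this. The region field and the one-seed-per-thousand-records ratio are illustrative assumptions.

```python
def seed_file(records, seed_pool, licensee_region, per_thousand=1):
    """Append a licensee-specific set of seeds to an outgoing file.

    Only seeds matching the licensee's selection (here, a region) are
    eligible, and the count scales with file size, with a minimum of one.
    """
    eligible = [s for s in seed_pool if s["region"] == licensee_region]
    n = max(1, len(records) * per_thousand // 1000)
    return records + eligible[:n]
```

Because seeds are built from the same load-module templates as real records, a licensee cannot distinguish them from the surrounding data, which is what makes the later usage audit possible.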
Secure file transfer protocols are then used to send data out to its licensees. (This part of the service can also be used as a standalone system for transmitting valuable content, such as plans, reports and the like, through a secure channel.)
Third parties may then be reselling data to further users, which is where the back-end auditing services offered by DQM come into play. Through seeding reports, the company is able to compare actual usage of data against the licence. This audit trail can be used to generate a royalty and usage report, allowing the data owner to check they are receiving all the income they are entitled to, or to trigger warnings about mis-use.
A growing number of commercial data owners, such as Royal Mail, have adopted this system to ensure their business is operating and benefiting as planned. “We can even manage the royalty process by taking data, getting the terms and conditions from the contract, and creating a royalty statement for both parties,” says Galdies.
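A royalty statement of the kind Galdies describes boils down to applying contract terms to observed usage. The sketch below assumes a flat rate in pence per record used and an optional per-licensee usage cap that triggers a mis-use warning; real contract terms will vary.

```python
def royalty_statement(usage_by_licensee, rate_pence_per_record, licence_caps=None):
    """Turn audited usage counts into statement lines for both parties.

    usage_by_licensee: {licensee name: records used}, taken from seed reports.
    licence_caps: optional {licensee name: maximum permitted usage}.
    """
    licence_caps = licence_caps or {}
    lines = []
    for licensee, used in sorted(usage_by_licensee.items()):
        cap = licence_caps.get(licensee)
        lines.append({
            "licensee": licensee,
            "records_used": used,
            "royalty_due_pence": used * rate_pence_per_record,
            "over_licence": cap is not None and used > cap,  # mis-use warning
        })
    return lines
```

The same statement serves both sides of the agreement: the data owner checks income against usage, and the licensee sees exactly what it is being charged for.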
While data mis-use is relatively rare, it is easy for an end user to be applying data outside of licence terms, simply because the point of purchase and the point of use are not usually one and the same. Unless the information is matched at both ends, compliance is hard to demonstrate.
Galdies notes that auditing is becoming recognised as more essential than ever in the world of data management. “For internet-based companies with commercial relationships with third parties, there is a greater realisation that auditing is important, particularly of their partners’ security arrangements,” he says.
By using the new Hub service, a company can generate a full history for every record: when it was cleansed, who it was licensed to and whether it was used correctly. That is important to ensure data protection compliance, since the data controller maintains ultimate legal liability regardless of the length of the supply chain. With the growing complexity of data management, getting a 360-degree view does not have to remain just a vision, it can be a reality.