Big data has been telling a big lie - that the data it uses is not personal and therefore it does not need to operate under the requirements of data protection laws. I have lost count of the number of presentations I have seen by big data practitioners - major brands, data owners, analytics services providers - talking about the behavioural, location and device data they capture, fuse and deploy against service provision or marketing.
On their own, each of those pieces of data is legitimately non-PII. There is nothing about clickstream or device type that is very revealing in itself. As long as that information was being used for simple purposes, such as tracking user experience or monitoring mobile website availability, it was possible to argue that there was no need to apply the requirements for consent, notification, access and security which are needed when dealing with name, address and contact information.
But big data practices have changed all of that. In scouring the enterprise for data sources to exploit, practitioners have created vast data reservoirs into which these feeds flow. Analytical processes are then applied which not only identify patterns but find people, too. The output may not necessarily be directly matched to the customer database - although it often is - but it can result in data assemblies which leave little doubt about the identity of the data subject. If you have hundreds (even thousands) of data items and variables, you have pretty much defined the person, even without naming them.
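That re-identification effect is easy to demonstrate. The sketch below, using entirely made-up records and a hypothetical `uniqueness` helper, shows how attributes that are harmless on their own single out individuals once combined:

```python
from collections import Counter

# Hypothetical records: each field on its own looks non-personal.
records = [
    {"device": "iPhone", "city": "Leeds",  "browser": "Safari"},
    {"device": "iPhone", "city": "Leeds",  "browser": "Chrome"},
    {"device": "Pixel",  "city": "Leeds",  "browser": "Chrome"},
    {"device": "iPhone", "city": "London", "browser": "Safari"},
]

def uniqueness(records, fields):
    """Fraction of records whose combination of `fields` is unique."""
    combos = Counter(tuple(r[f] for f in fields) for r in records)
    unique = sum(1 for r in records
                 if combos[tuple(r[f] for f in fields)] == 1)
    return unique / len(records)

# A single field rarely singles anyone out...
print(uniqueness(records, ["device"]))                     # 0.25
# ...but combine a few and every record becomes one-of-a-kind.
print(uniqueness(records, ["device", "city", "browser"]))  # 1.0
```

With only three variables, every record in this toy dataset is already unique; with hundreds, the odds of anonymity vanish.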
Certainly, that is what the most influential data protection group in Europe, the Article 29 Working Party, thinks is happening. "Device fingerprinting" pulls together persistent data items, such as device, browser and operating system, to create an identity for that particular access route. Which is why it has just issued an opinion that, where such data items are being collected, collection must be done under the same consent terms as when placing a cookie.
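The mechanics are simple. A minimal sketch, with made-up attribute names and a hypothetical `device_fingerprint` helper, of how persistent data items can be fused into a stable identifier without any cookie being set:

```python
import hashlib

def device_fingerprint(attrs: dict) -> str:
    """Combine persistent, individually non-identifying attributes
    into a single stable identifier for this access route."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visit_1 = {"os": "Windows 10", "browser": "Firefox 33", "screen": "1920x1080"}
visit_2 = {"browser": "Firefox 33", "screen": "1920x1080", "os": "Windows 10"}

# Same device attributes, even in a different order, yield the
# same fingerprint - the user is recognised on every return visit.
print(device_fingerprint(visit_1) == device_fingerprint(visit_2))  # True
```

Because the identifier persists across visits exactly as a cookie would, the Working Party's view that the same consent terms apply is hard to argue with.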
Putting privacy statements and opt-in boxes upfront in the data collection process is usually viewed with fear and suspicion by commercial data owners and services providers (although far less so by brands which have strong, positive engagement with their customers). One reason is the worry that asking for consent lets the data cat out of the analytical bag by revealing just what companies want to do with data.
Consumers have increasingly been showing signs of push-back, either by demanding more transparency and control or by showing a clear preference for companies which offer those over ones which are more opaque in their data strategy. There is growing evidence of a clear split in terms of these business models - you cannot be on both sides of that particular data protection fence.
Make no mistake, the direction of regulatory travel is in line with what A29WP has to say. Do not be fooled by the fact that what it just published is called an “opinion”. When it comes to writing new laws, what this group says often has a direct line into the statute books.
So if your data strategy is based on the assumption that big data is both freely available and not subject to consent, it is time to think again. For consumers and law-makers alike, big data is getting very personal.