I didn’t start out with aspirations to work in privacy, though looking back it’s a natural fit. I’ve always been interested in convergence, and privacy represents the convergence of human rights, regulation and technology – all areas that remain very important to me. It’s exciting to be at the forefront of innovation and to work to make sure innovation benefits everyone.
I’ve always taken in new and different ideas and have enjoyed my varied journey so far. I’ve veered between disciplines: I studied English Literature at Oxford University but trained as an accountant, taking the plunge from Shakespeare to statements because it was important for me to balance my knowledge of the arts with an understanding of the numbers. I then worked in accounting, auditing and technology risk before I found my privacy calling.
After two decades in privacy and the private sector, it was time to experience the public sector and look beyond the bottom line. I wanted to be challenged as well as interested and, as the GDPR had just come into force, I searched out the role at the ICO.
No matter how much I thought I was making a difference before, it was always for specific companies or projects. Now the work I do at the ICO is effecting change and having an impact across entire industries and sectors. The themes I’m working on currently – such as adtech, AI and anonymisation – have a global impact. Working for one of the largest, most influential and well-respected data protection regulators in the world at this defining moment for privacy makes me incredibly proud.
I really admire Garry Kasparov. Not only as one of the greatest chess players ever, but for his reaction to his 1997 defeat to Deep Blue. Rather than sulk, he engaged with the technology, and his writing on the differences and synergies between human and machine intelligence is highly insightful.
There were some big overarching themes for the ICO in 2019, including adtech, AI and our Age Appropriate Design Code (AADC). Our concerns about adtech remain the same and have now been validated by the industry. What surprised me was the amount of positive engagement from the sector and that was great to see – I’m optimistic the industry will change.
Also, at the start of 2019, privacy and competition were distant cousins. What started off as a fringe discussion has now become mainstream and it will be interesting to see how these discussions will unfold in 2020.
It must be privacy enhancing technologies. The next-generation techniques I originally knew as laboratory-based technologies – such as synthetic datasets, federated learning and homomorphic encryption – are now being used by commercial firms to protect personal data while still delivering insights. There is huge potential in technology that can undertake analysis without having to access the underlying personal data. Later this year the ICO will be publishing guidance on anonymisation and will be working on other privacy enhancing technologies.
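To make the idea concrete, here is a minimal, hypothetical sketch in the spirit of federated learning: each data holder shares only a locally computed summary, so a central analyst can derive an overall statistic without ever seeing any individual's raw records. The parties, data and function names are illustrative, not any particular commercial system.

```python
# Toy illustration (hypothetical): analysis without access to raw personal data.
# Each data holder computes a (sum, count) summary locally; only these
# aggregates -- never the underlying records -- are sent to the analyst.

def local_summary(records):
    """Computed by each data holder on its own private records."""
    return sum(records), len(records)

def aggregate(summaries):
    """The analyst combines summaries to get the overall mean."""
    total = sum(s for s, _ in summaries)
    count = sum(n for _, n in summaries)
    return total / count

# Three hypothetical data holders, each keeping its records private
holders = [[10, 20, 30], [40, 50], [60]]
summaries = [local_summary(r) for r in holders]  # only summaries are shared
print(aggregate(summaries))  # overall mean: 35.0
```

Real deployments go much further – adding secure aggregation or encryption so even the summaries reveal little – but the principle is the same: the insight travels, the personal data does not.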
I think the biggest challenge is trust. With every privacy scandal people lose trust and become less inclined to share their personal data, more inclined to place restrictions on technology, and more likely to ask for their information to be deleted.
When organisations fail to bring their users and consumers along with innovative changes, and when organisations fail to protect privacy, the trust deficit widens. Regulators like the ICO have been dealing with the implications of this trust deficit – we see it in the complaints we receive and the breaches reported. Organisations that embed a culture of trust make sure people know how their personal data is being used. Building trust in how personal data is used is one of the most important things organisations can do for their customers right now.