The introduction of the General Data Protection Regulation (GDPR) didn’t just bring more regulatory change for companies to deal with, and plenty of noise around issues such as consent and privacy notices - it kick-started in many businesses a long-overdue focus on data protection risks and practices. It also brought the value of the customer information we hold to the attention of the executive and the board.
With regulatory, technology and market demands influencing how our data is used, managed and protected, the role of the data protection officer (DPO) as advocate and adviser is becoming increasingly important. Although GDPR placed a regulatory focus on the role - requiring that the DPO be independent and free from conflicts of interest, properly involved, and reporting to the highest levels of the organisation - the DPO is also increasingly being drawn into strategic discussions.
With issues such as the transfer of personal data after Brexit and the UK’s status regarding data adequacy, the current debate on data ethics, and the introduction of initiatives like open banking affecting ownership, consent and security, the DPO’s remit is being pushed beyond the regulatory focus envisaged under GDPR. Add to this the business world’s current fascination with the potential opportunities and risks of artificial intelligence (AI) and machine learning (ML), along with the debate on data ethics in that environment, and what appeared to be a relatively straightforward role has become one requiring its owner to be literate in multiple disciplines.
In response, DPOs and the organisations they work for are embracing the requirement for data privacy by design, putting tighter processes in place upfront and legitimising decisions by ensuring appropriate governance from the beginning. Developing these governance models is difficult and raises questions for organisations about how they measure areas such as data ethics.
How does an organisation’s data ethics relate to its corporate values, and how will they be recorded within its governance environment? For DPOs, this can affect decisions regarding the legal basis for processing - for example, is it based on performance of a contract, legitimate interest or consent? Does it reflect the organisation’s approach to automated processing, and how does this affect AI and ML initiatives?
In considering governance frameworks, the measurement of factors such as data ethics, and the construction of decision models to support those frameworks, DPOs and their peers face questions about culture and about how these less clearly defined measures affect governance. What was a regulatory role focused on compliance is now shaped by the links between ethics, culture and governance in the modern world, and by how these affect the DPO’s responsibilities to advocate for the customer and advise the organisation.
In addition, governments, industry bodies and regulatory authorities are investing in understanding this more complicated environment and determining what, if any, additional regulatory oversight will be required. In the UK, one example is the creation of the Centre for Data Ethics and Innovation, which focuses on these questions as they relate to AI.
These pressures, now visible in the changing DPO role, are prompting many boards to consider whether (and how) data protection and privacy should move from being a fundamentally compliance-focused data initiative towards a core part of the business and operating model that addresses future customer-facing activities.