Keeping track of the human in your algorithm
Does your organisation have any of the following goals in its use of data and analytics? A diffusion of knowledge across all levels of staff. Productivity improvements resulting from self-optimisation against group benchmarks. Reduced management intervention, but increased oversight and visibility of processes.
Sounds like the strategy for any data-driven business engaged in the digitisation of its practices, doesn’t it? Now consider this from a different perspective. Constant scoring and critical notes on performance from an unseen supervisor. Every action tracked and measured, including comfort breaks. Fixed processes that do not allow for discretion or human variability.
The first view dates from between 1788 and 1793 and is Jeremy Bentham’s description of the ideal prison. In this panopticon, all prisoners would be visible from a central point occupied by wardens who would not themselves be visible. Conscious that they were under constant scrutiny, the prisoners would improve their behaviour, reducing the need for punishment. As Bentham wrote, it would provide “a new mode of obtaining power of mind over mind, in a quantity hitherto without example.”
The second example stems from a Health and Safety Executive report into working conditions in call centres in 2004. It dubbed these workplaces the new “dark satanic mills” because of their impact on staff, from low job satisfaction through to poor mental health.
Fast forward to 2018 and consider the extent to which workplace technology has further enabled this level of scrutiny by management of its workforce. HR analytics is a growth area that embraces not just recruitment and employee retention, but also technologies such as the workplace trackers patented by Amazon for use in its warehouses, and even keystroke logging.
Each of these technologies and uses of data may not seem particularly sinister in itself - workers have always been subject to oversight and measurement, whether by a shopfloor foreman or call centre agent activity monitoring. Yet as these disparate data sets get stitched together as part of a bigger HR analytics play, the risk is that they lead to a culture in which workers are viewed as little more than parts of a machine: subject them to the right monitoring and impulses, the thinking goes, and the whole enterprise will become optimally efficient, even if that removes all sense of personal control, job satisfaction and even self-worth.
In case that seems too fanciful, a conversation I had this week with a senior analytics practitioner at a call centre-based financial services company suggests it is not. The prevailing managerial culture there is one of distrust of staff and a conviction that they need to be constantly criticised and punished to keep them in line. So much so that, when the chief executive was due to go on holiday, he planned to install CCTV to monitor the call centre without notifying staff.
If that shocks you - and it should - then you are on the right path towards a sense of data ethics. Increasingly, doing the right thing by employees, customers and other stakeholders goes beyond checking what the law permits and into whether an action fits with the company’s values and the expectations of those involved with it. Data and analytics are at the heart of this new dimension, since they provide the raw materials and tools through which both the right and wrong things can be done.
That’s why DataIQ Leaders is creating a new workshop around governance, regulation, compliance and ethics that will be available to members in 2019. It aligns with initiatives such as the UK’s Centre for Data Ethics and Innovation, which is intended to head off any negative impacts that could emerge from this new world.
With digital technology and the data it creates potentially placing us all into a new panopticon, the time has come to define and constrain who gets to sit in the middle and decide what good and bad looks like.