I quite like the title of Cathy O’Neil’s book about the dangers of algorithm-creep. "Weapons of Math Destruction" sounds like what Dubya would have been fruitlessly searching for in Iraq back in 2003, had he had a lisp.
The subtitle of O’Neil’s book is "How Big Data Increases Inequality and Threatens Democracy". These claims sound somewhat far-fetched, but the author makes a solid case for both. She does this effectively by breaking down concepts and introducing human protagonists whose lives are, in one way or another, affected by a WMD.
In chapter one, O’Neil begins by explaining what a WMD isn’t. She describes both statistical modelling in sports, as famously used by Billy Beane of Moneyball fame, and her own formula for deciding what to cook for her family each day as healthy, transparent models that are unlikely to scale - and are therefore benign.
These stand in stark contrast to the mathematical models that assess how risky a driver is based on their credit score - with low scorers paying the highest premiums - or target vulnerable people on low incomes with adverts for expensive loans.
Models like these are what O’Neil defines as WMDs: harmful mathematical models with three components - opacity, scale and damage. In the introduction, O’Neil presents a fifth-grade teacher at a Washington DC school who received an inexplicably low score on an evaluation and was unceremoniously fired. A model had been employed to rid the school system of ineffective teachers, and it gave poor scores to teachers whose students showed little improvement over an academic year.
The teacher discovered that her students had entered her class that year with artificially inflated scores, so it looked as if her efforts had made little impact. She still lost her job; the WMD’s word was final.
Another flesh-and-blood example was that of a Chicago man who received a knock on the door from the police warning him not to commit any crime, as he was being watched. A crime-prediction algorithm had identified him as a person likely to offend in the near future, judging him on his location, his socio-economic status and the criminal records of his friends and neighbours. He was being judged for who he was, not for what he had done - or hadn’t done, to be precise.
O’Neil detailed one algorithm that predicted recidivism - or likelihood of re-offending - called the LSI-R. It used information such as a person’s first contact with the police to make that prediction. However, O’Neil pointed to the flawed nature of this input: young Black and Latino males made up 4.7% of the population of New York City in 2013, but accounted for 40% of stop-and-frisk searches. With this type of bias inherent in the data, the LSI-R would naturally make flawed predictions.
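The scale of that skew is worth making explicit. Here is a minimal back-of-the-envelope sketch - my own framing, using the figures O’Neil cites - of how far this group was over-represented in the data feeding such models:

```python
# Figures cited by O'Neil for New York City in 2013, treated here as
# simple shares for a rough comparison (my framing, not hers).
population_share = 0.047  # young Black and Latino males as a share of NYC residents
stop_share = 0.40         # their share of stop-and-frisk searches

# How many times more often this group appears in police-contact records
# than its population share alone would suggest.
over_representation = stop_share / population_share
print(f"over-represented by a factor of {over_representation:.1f}")  # ~8.5x
```

Any model that learns from records of police contact inherits this skew: an input like "first contact with the police" encodes where policing happens, not just who offends.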
One so-called WMD that I thought was a stretch involved scheduling software used to create the timetables of low-wage employees in the services and hospitality sectors. The author stated that erratic schedules that require employees to essentially be on call at short notice, leaving them with little time to study or arrange childcare, entrench inequality. People with such jobs have little time or energy left to further their skills. However, I would argue that a mean-spirited manager or a company with punitive policies could inflict just as much harm without the scheduling software.
An especially pertinent WMD highlighted by O’Neil was micro-targeting by advertisers. She demonstrated how the strategy was employed by rapacious for-profit higher education institutions to entice susceptible prospective students and exploit them for government-backed student loans. She also referenced Cambridge Analytica in chapter 10: The Guardian had reported in late 2015 that the firm had paid academics in the UK to amass the Facebook profiles of US voters. We now know the scale of that harvesting.
As a UK reader, I empathised with many of the victims of rogue algorithms, but near the start of the book I couldn’t see exact parallels in the British Isles. O’Neil’s reference to a crime-predicting algorithm used by Kent Police, among other examples, showed that she took a global view of the issue.
In the conclusion, she illustrated how targeting algorithms can be used to benefit society. She attended a hackathon to help reduce homelessness in New York and referenced a data scientist who identifies possible instances of slavery in the supply chain of goods.
O’Neil also calls for a Hippocratic Oath for data scientists - to do no harm - along with a more vigilant press, stronger government regulation and new laws, so that algorithms and mathematical models work for the good of humanity at large. An informative and engaging read, Weapons of Math Destruction exposes the underbelly of black-box decision-making and the human victims it leaves in its wake. Thankfully, there is also hope, and guidance on how the nefarious nature of WMDs can be remedied.