With Algorithms of Oppression, Safiya Umoja Noble puts forward a robust and thoroughly researched critique of revenue-generating search engines, while calling for regulation and public policy to better govern information on the internet.
Backed by statistics, citations and screenshots, and taking a US-centric view, Noble dissects our trust in and reliance on search engines, examining how the manner in which information is served up harms marginalised groups.
"73% of users say information found with search engines is accurate and trustworthy."
Noble underscores the fact that search engines like Google are primarily ad companies, in the business of selling ad space to the highest bidder. She also draws attention to the way search engines are seen and used as neutral, credible, accurate and depoliticised providers of information. According to the Pew Research Center, 73% of American search engine users say that all or most of the information they find as they use search engines is accurate and trustworthy. That is scary.
“Google Search is an advertising platform, not intended to solely serve as a public information resource. Google creates advertising algorithms, not information algorithms,” she argues.
Noble was spurred to research this book after being served pornographic adverts and search results when she typed ‘black girls’ into Google back in 2011. From that starting point, she looks at how racism in wider society, which positions black women as sexually deviant, became embedded in the technology platforms we use every day.
There is a short chapter given over to the ease with which an uncritical search engine user can delude themselves into thinking they are conducting objective research on an issue or demographic group while they are actually being fed racist claptrap and misinformation.
Her example is the perpetrator of the 2015 hate crime and mass shooting at a Black church in South Carolina, in which nine people were murdered. The killer had published a manifesto in which he claimed to be “completely racially aware,” having read a Wikipedia entry about the shooting of Trayvon Martin and subsequently searched for ‘black on white crime’.
Noble states that a clear line cannot be drawn between the search results and the murders, but his searches did lead him to “narrow, hostile and racist ideas”. Because of the ranking system, people erroneously believe that the results at the top are more credible and trustworthy.
She advocates for the ‘Right to be Forgotten’ and references a librarian who questions the ethics of digitising all information, uploading it and making it available on the open web when the information originally belonged to a small community with shared values.
Essentially, this is a question of consent, as marginalised groups are often unable to take part in decision-making about the information that pertains to them. For me, this was reminiscent of the notion that big consumer-facing companies collect as much data as possible about their customers just because they can, when they ought to think about whether they should. The key point here is that informed consent is essential.
Noble also examines the idea that classification and cataloguing, core practices of library and information science, inherited and perpetuate biased systems. This is exemplified by the fact that until the 1970s the Library of Congress Subject Headings included problematic labels such as “Yellow Peril” and “Jewish Question.” She also refers to another library practitioner who warns of the increasing, what one might call, ‘Googlification’ of library discovery systems, which return lists of results without nuance or context.
She calls for several things, including more search engine alternatives, public policy that addresses the growing number of problems posed by unregulated commercial search engines, and a redesign of web indexes.
Helpfully, she suggests and sketches out an alternative way of presenting search results that is more transparent than the current ranking system: the Imagine Engine. Resembling a colour wheel, results are plotted according to where they sit in relation to the entertainment, commercial, non-commercial and sexual. Crucially, it would allow the user to block out racist, sexist, homophobic or pornographic results.
In her conclusion, she shifts the focus away from Google and onto Yelp, a consumer review website for restaurants and businesses. By speaking to a Black hair salon owner in a college town on the east coast, she gets a first-hand account of how the platform's creators, and the weightings they used when programming its algorithm, work to the owner's detriment. “I think Yelp looks at people as their clients, not mine. If they are your clients who are loyal to your business and you, they are not interested,” said Kandis.
“Algorithms are and will continue to be contextually relevant and loaded with power,” Noble writes in her conclusion, adding that the book has opened more lines of inquiry using a black feminist technology studies approach to Internet research.
She argues that public funding and adequate information are needed to protect the right to fair representation online, so that the erosion of quality information available to the public does not continue. She also calls for public policy that provides protection from the effects of unregulated and unethical artificial intelligence.
This publication is extremely helpful in breaking down the fundamental aspects of search, the inequities that arise from the status quo, and what can be done to make the internet fairer for all.