With Algorithms of Oppression, Safiya Umoja Noble puts forward a robust and thoroughly researched critique of revenue-generating search engines, while calling for regulation and public policy to better govern information on the internet.
Backed by statistics, citations and screenshots, and taking a US-centric view, Noble dissects our trust in and reliance on search engines, examining how the way information is served up harms marginalised groups.
"73% of users say information found with search engines is accurate and trustworthy."
Noble underscores that search engines like Google are primarily advertising companies, in the business of selling ad space to the highest bidder, yet they are seen and used as neutral, credible, accurate and depoliticised providers of information. According to the Pew Research Center, 73% of American search engine users say that all or most of the information they find with search engines is accurate and trustworthy. That is scary.
“Google Search is an advertising platform, not intended to solely serve as a public information resource. Google creates advertising algorithms, not information algorithms,” she argues.
Noble was spurred to research this book when Google served her pornographic adverts and search results after she typed 'black girls' into the search bar in 2011. From there, she examines how racism in wider society, which positions black women as sexually deviant, became embedded in the technology platforms we use every day.
A short chapter is given over to the ease with which an uncritical search engine user can convince themselves they are conducting objective research on an issue or demographic group while actually being fed racist claptrap and misinformation.
Noble's example is the perpetrator of the 2015 hate crime and mass shooting at a Black church in South Carolina, in which nine people were murdered. The killer had published a manifesto claiming he became "completely racially aware" after reading a Wikipedia entry about the shooting of Trayvon Martin and subsequently searching for 'black on white crime'.
Noble states that a clear line cannot be drawn between the search results and the murders, but his searches did lead him to "narrow, hostile and racist ideas". And because of the ranking system, people erroneously believe that the results at the top are more credible and trustworthy.
Copyright © IQ Data Group Limited 2024