Symmetry is often considered the basis of beauty, with most people preferring the looks of others with balanced faces. Symmetry in data is rarer and more problematic - sometimes even toxic - especially when it comes in the form of a bell curve.
Google just found this out the hard way when it sacked an employee who argued that paying women the same as men was an act of “political correctness” because it went against the natural distribution of female skills and propensities. In the world of technology and “brogrammers”, the now notorious internal memo claimed, women did not thrive because of inherent female traits, not because of barriers created by culture, education, opportunity and the like.
Support for this argument is usually drawn from standard sets of metrics, such as the standardised educational tests which American schools and universities administer, which typically produce a symmetrical spread of results around a mid-point. As the same test has been applied to all candidates, it is argued, the results must reveal an objective view of their abilities.
This was most controversially argued in the 1994 book, “The Bell Curve: Intelligence and Class Structure in American Life”. It used these academic statistics to argue that those in the lower half of the bell curve were there because of racial characteristics. By coincidence, I encountered a 1985 pamphlet on African-American educational performance in a charity bookshop in Colchester just two weeks ago which included the statistics on which this book was based. At the time, I was shocked to find an example of racist propaganda on open sale, little expecting that the same thought processes would suddenly emerge in relation to gender diversity at the search engine company.
What is problematic about this approach to educational attainment (or career progression, salary levels or even the outcomes of surgery across different cohorts) is that it excludes all other factors which might have an influence. Elsewhere this week, the University of Manchester published a study showing that people living in the North of the UK are 20% more likely to die before the age of 75 than those in the South.
If you apply the logic of psychologist Richard J. Herrnstein and political scientist Charles Murray, this mortality differential is a consequence of being Northern, rather than decades (or even centuries) of variations in environmental factors (such as pollution), employment type, health service investment, educational and career opportunities, and so on. As such, it will continue to be passed down through families and will not respond to any attempts by Government to shift the trend.
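The confounding problem described above can be sketched in a few lines of Python. The numbers here are invented purely for illustration: two cohorts are given identical innate “ability”, but one suffers an environmental penalty. Both cohorts still produce neat, symmetrical bell curves, and the curves alone cannot tell you whether the gap between them is innate or environmental.

```python
import random

random.seed(0)

def simulate(n, env_penalty):
    """Hypothetical cohort: same latent ability for everyone,
    shifted by an environmental penalty (e.g. under-resourced
    schools, pollution). All parameters are invented."""
    scores = []
    for _ in range(n):
        ability = random.gauss(100, 15)  # identical ability distribution
        scores.append(ability - env_penalty)  # environment shifts the observed score
    return scores

group_a = simulate(10_000, env_penalty=0)
group_b = simulate(10_000, env_penalty=10)

def mean(xs):
    return sum(xs) / len(xs)

print(f"Group A mean: {mean(group_a):.1f}")
print(f"Group B mean: {mean(group_b):.1f}")
```

An observer who sees only the two bell curves - both symmetrical, both well-behaved - could just as easily attribute the ten-point gap to innate difference as to the environmental term that actually produced it. That is precisely the interpretive gap the raw statistics leave open.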
So why does this matter in the context of what Google and its ex-employee just went through? Because of the data it - and all the other major platforms - collect on our behaviour, data which will increasingly be used to create artificial intelligence solutions. Without sophisticated interpretation of that raw data, of the kind data scientists can provide, the real risk is one of racist robots. They might prove harder to sack than one unhappy employee.