Week 9 Summary – Algorithmic Bias

David Beer raises the issue of the decision-making power of algorithms and identifies a need to understand how algorithms shape organisational, institutional, commercial and governmental decision making (Beer, 2017). Critics challenge the view of algorithms as ‘guarantors of objectivity, authority and efficiency’, arguing that because algorithms are created by humans, they embed layers of social and political bias into their code, resulting in decisions that are neither benign nor neutral. Furthermore, these “decisions hold all types of values, many of which openly promote racism, sexism and false notions of meritocracy” (Noble, 2018). As such, ‘algorithms produce worlds rather than objectively account for them, and are considered manifestations of power’ (Knox, 2015).

It was this notion of algorithmic bias that drove my inquiry in this week’s section of the lifestream blog, and there was no shortage of social media commentary on the issue. Cathy O’Neil identifies this in her YouTube clip, claiming that algorithms are not objective but merely ‘opinions embedded into math’. Perhaps most interesting was the work of Joy Buolamwini, whose investigation of artificial intelligence facial recognition software has unearthed inherent racist and sexist elements introduced by its developers.

To what extent are racist values embedded into algorithms?
Joy Buolamwini has carried out extensive research on how facial recognition algorithms fail to recognise black women – Click the image for more detail

However, where does this notion of algorithmic bias intersect with education, and what type of educational landscape will algorithms produce? With the rise of anti-plagiarism software and the growth of intelligent teaching and learning platforms such as Century Tech, many educators fear a growing dependency on algorithms within schools and colleges, particularly for assessment. This is certainly not without difficulties or tensions. Ben Williamson claims that many studies have highlighted inaccuracies in the Turnitin software, which many institutions use to cross-check student work, incorrectly branding some students as cheats whilst missing other, very clear instances of plagiarism. This ultimately leads to a growing level of distrust between youngsters and their educators, breaking down relationships as the use of technology, and algorithmic dependency, increases. How else will students and teachers be negatively impacted by algorithmic biases (or errors) and, as dependency on these tools continues to grow, will educators be able even to identify when this happens, let alone mitigate it?


  • Beer, D. (2017) The social power of algorithms, Information, Communication & Society, 20:1, 1–13. DOI: 10.1080/1369118X.2016.1216147
  • Knox, J. (2015) Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI: 10.1007/978-981-287-532-7_124-1
  • Noble, S. (2018) Algorithms of Oppression, NYU Press, New York.
  • Williamson, B. (2019). Automating mistrust. Code Acts in Education

“Algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions… They’re just automated assumptions. And if you don’t fix the bias, then you are just automating the bias.” https://t.co/dWrDdN1FyL

Liked on YouTube: The Truth About Algorithms | Cathy O’Neil

Some key takeaways from this short clip that resonate with the themes of the Algorithmic Cultures block.

Cathy O’Neil argues that presenting algorithms as objective fact is a lie. She says ‘a much more accurate description of an algorithm is that it’s an opinion embedded in math’. “There’s always a power element here,” she adds, and “every time we build an algorithm, we curate our data, we define success, we embed our values into algorithms.”
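O’Neil’s point that builders of algorithms ‘curate our data’ and ‘define success’ can be made concrete with a toy sketch (the data, groups and rule below are entirely hypothetical, invented for illustration): a screening rule ‘learned’ from biased historical decisions does not correct the bias, it simply reproduces and automates it.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (group, hired). Past human
# decisions favoured group "A" regardless of merit -- that preference
# is the embedded bias in the training data.
history = [("A", True)] * 80 + [("A", False)] * 20 \
        + [("B", True)] * 20 + [("B", False)] * 80

def learn_rule(records):
    """'Train' a screening rule by taking the majority past decision
    for each group -- 'success' here is defined as matching history."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, rejected]
    for group, hired in records:
        counts[group][0 if hired else 1] += 1
    return {g: hired >= rejected for g, (hired, rejected) in counts.items()}

rule = learn_rule(history)
print(rule)  # {'A': True, 'B': False}: the human bias is now automated
```

The ‘algorithm’ never sees race or gender as such, only past outcomes, yet its decisions are an opinion inherited from whoever curated the data and defined success.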


Algorithms of Oppression

“Part of the challenge of understanding algorithmic oppression is to understand that the mathematical formulations to drive automated decisions are made by human beings. While we often think of terms such as ‘big data’ and ‘algorithms’ as being benign, neutral, or objective, they are anything but. The people who make these decisions hold all types of values, many of which openly promote racism, sexism and false notions of meritocracy, which is well documented in the studies of Silicon Valley and other tech corridors”

– Noble, S. (2018) Algorithms of Oppression

Article showing congruence with Rob Kitchin’s view that ‘we are entering a widespread era of algorithmic governance, where algorithms will play an increasing role in the exercise of power’ (Kitchin, 2017) https://t.co/A2hbTjrAsl via @Technology_NS