Having just published my artefact Algorithmic systems: entanglements of human-machinic relations, I have started commenting on others’ artefacts, finding insightful (but sobering) explorations of the algorithmic systems of Facebook, Netflix and others. Facebook’s foray into ‘personalized learning platforms’ and the resistance to it are particularly relevant as I reflect on the issues of surveillance and automation in education (Williamson 2019).
While focusing on Coursera for my artefact, I observed patterns of inclusion and exclusion, including a tendency to privilege “tech” courses from Western institutions, and experienced first-hand the complexities of researching algorithmic systems (Kitchin 2017). Does Silicon Valley-based Coursera – with its vision of life transformation for all – subscribe to Silicon Valley’s notion of ‘progress’ while remaining blind to its exclusions? Is it another attempt to ‘revolutionise’ education, as Williamson (2017) details, framed by neoliberal and commercial agendas yet presented as an objective and universal ‘learning experience’?
I reflect on the ‘black box’ metaphor, yet ‘algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them’ (Seaver 2013: 10), and we ‘need to examine the work that is done by those modelling and coding [them]’ (Beer 2017: 10). Thus, rather than stripping individual algorithms of their wider social and political context, we should ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25) and examine the complex ‘human-machinic cognitive relations’ (Amoore 2019: 7), the ‘entanglement of agencies’ (Knox 2015) and the implications for education in an era of ‘datafication’ (Knox et al. 2020).