Week 9 was the second week of our algorithmic culture block and the week of our algorithmic play artefact, located here. In the artefact I discuss how Instagram makes suggestions for me based on my search history, my demographics, and my previous likes.
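To make that concrete, here is a minimal, hypothetical Python sketch of the kind of signal weighting a feed recommender might perform. The signal names, weights and functions are my own illustrative assumptions, not Instagram’s actual (proprietary) ranking model:

```python
# Toy illustration of signal-weighted content ranking.
# The signals and weights below are illustrative assumptions only;
# Instagram's real ranking system is proprietary and far more complex.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

def score_post(post: Post, search_topics: set,
               liked_topics: set, demographic_topics: set) -> float:
    """Combine simple user signals into a single relevance score."""
    score = 0.0
    if post.topic in search_topics:        # recent search history
        score += 0.5
    if post.topic in liked_topics:         # previous likes
        score += 0.3
    if post.topic in demographic_topics:   # topics popular with similar users
        score += 0.2
    return score

# Rank a candidate feed for one (hypothetical) user.
candidates = [Post("p1", "fitness"), Post("p2", "travel"), Post("p3", "cooking")]
feed = sorted(
    candidates,
    key=lambda p: score_post(p, {"travel"}, {"travel", "cooking"}, {"fitness"}),
    reverse=True,
)
print([p.post_id for p in feed])  # ['p2', 'p3', 'p1']
```

Even in this toy version, the point of the artefact is visible: whoever sets the weights decides what I see first.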
It was also the week we had to leave work and set up home offices; as IT support staff, that meant 12-hour days for my colleagues and me to get through the support calls from faculty and staff on all aspects of online teaching. As my classmate Sean has mentioned, though, the global situation is a game changer for our sector, bringing huge opportunities.
In a roundup of this week’s blog additions, I pulled in some posts related to my Instagram artefact. The post on ‘the AI generator of fictitious faces’ shows how easy it has become for AI to generate an influencer-type photograph that will get lots of Instagram ‘likes’, based on social norms for ‘what is considered attractive’: the algorithm created 10,000 fake faces over several hours.
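As an aside on the mechanics: face generators of this kind are typically generative adversarial networks (GANs), and once the model is trained, each new face is just a random latent vector passed through the generator, which is why producing 10,000 fakes in a few hours is computationally cheap. The snippet below is a toy numpy sketch of that sampling loop; the generator here is untrained and the sizes are arbitrary assumptions, whereas real systems such as StyleGAN are vastly larger:

```python
# Toy sketch of GAN-style sampling: one new "face" per random latent vector.
# The generator is untrained and tiny; it only illustrates the loop that lets
# a trained model emit thousands of synthetic images cheaply.
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM, IMG_PIXELS = 64, 32 * 32   # arbitrary illustrative sizes

# Random (i.e. untrained) generator weights for a one-layer mapping.
W = rng.normal(scale=0.1, size=(LATENT_DIM, IMG_PIXELS))

def generate(z: np.ndarray) -> np.ndarray:
    """Map a latent vector to a fake image with pixel values in [0, 1]."""
    return 1 / (1 + np.exp(-(z @ W)))  # sigmoid keeps pixels in range

# Each draw from the latent space yields a distinct synthetic image.
fakes = [generate(rng.normal(size=LATENT_DIM)) for _ in range(10_000)]
print(len(fakes), fakes[0].shape)  # 10000 (1024,)
```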
In addition, the posts on Instagram’s use of AI here and here demonstrate that technology can no longer be seen as the passive instrument of humans; it is being used as an agent by humans and non-humans alike to “nudge towards the predetermined path as designed by the algorithm” (Knox et al. 2020).
Given the “increased entanglement of agencies in the production of knowledge and culture” (Knox 2015), it is very hard to drill down and see who is managing, or benefiting from, the behavioural governance of these algorithms. This is particularly pertinent to education and the factors influencing young people. From a privacy or discrimination standpoint, educational transcripts, unlike credit reports or juvenile court records, are currently considered fair game for gatekeepers such as colleges and employers, as noted in this post, and information from them can be used to introduce unfair bias against potential scholarship applicants or employees.
Algorithmic code is influenced by “political agendas, entrepreneurial ambitions, philanthropic goals and professional knowledge to create new ways of understanding, imagining and intervening” in our decision making (Williamson 2017).
Therefore we should maintain a critical perspective on algorithmic culture and continue to question the “objectivity and authority assumed of algorithms” (Knox 2015), and how they are shaping, categorising and privileging knowledge, places and people. I’m glad to see that there is growing awareness of algorithms that appear to discriminate against people based on age, race, religion, gender, sexual orientation or citizenship status, as noted in the development of an ‘accountability bill’ in the US. Culpability can be difficult to prove in these cases because companies remain very secretive about how their models work, but it is a start.
References for the readings
Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14-29. DOI: 10.1080/1369118X.2016.1154087
Knox, J. (2015). Algorithmic cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1
Knox, J., Williamson, B., & Bayne, S. (2020). Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45(1), 31-45. DOI: 10.1080/17439884.2019.1623251
Williamson, B. (2017). Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: The digital future of learning, policy, and practice.