As I draw to the end of Education and Digital Cultures, there are a number of issues I would like to reflect upon to close the blog. As a nascent student of digital education, algorithms have been a key player in the development of my own knowledge and understanding, ‘sorting, filtering, searching, prioritising, recommending, deciding and so on’ as the course has progressed. As David Beer states, an algorithm provides us the opportunity ‘to shape our knowledge and produce outcomes’ (Beer, 2017, 2). This has certainly been true throughout the EDC module, and there is no doubt algorithms have played a vital role in sculpting my understanding of digital cultures within an educational setting, and in cultivating my success in the completion of this lifestream.
Despite much of the evidence from the core reading that ‘algorithms produce worlds, rather than objectively account for them’ and that they are ‘manifestations of power’ (Knox, 2015), I would still hold the view that much of the algorithmic governance, in the context of my EDC learning, has been fairly innocuous in nature. Perhaps others would argue this position is naïve, but generally, I am confident that the algorithms throughout the lifestream have always steered my learning in positive directions, offering sensible and useful links to capture my interest and further my learning. This was most frequently noted within my use of YouTube, where recommendations normally had congruence with the clip I had previously watched or searched for. Whilst my algorithmic play noted the problematic nature of this in other settings, and how it could entrench users in a negative cycle of confirmation bias, within an educational setting there are real benefits in its potential to further learning. In short, I have not felt undertones of subliminal messaging encoded into algorithmic suggestions throughout the duration of this course. That said, I do not deny the existence of algorithmic power, nor the manipulative qualities algorithms possess. Indeed, ‘algorithms …are the new power brokers in society’ (Diakopoulos, 2013, cited in Kitchin, 2017). That cannot be denied.
Finally, I wonder where algorithmic governance leaves education, particularly for high school children, many of whom are happy to mindlessly watch clip after clip on YouTube, or click on every link or suggestion within their social media. I wonder how much this impacts their ability to harness enquiry skills, ask valid questions, and steer the direction of their own learning. Do the algorithms exert more influence on their learning pathway than their own processes of enquiry and logical thinking? Are the algorithms encouraging students to think less and follow more? Is this further contributing to a spoon-feeding, instant-gratification culture that appears to be growing in younger generations? Furthermore, if the algorithms lead students down an incongruous route, how much time would be wasted watching superfluous clips or heading up ‘digital blind alleys’ before a student is able to realign with the task in hand? Perhaps, with this in mind, it is incumbent upon the educator to ensure that use of this media is mitigated, or that digital tasks are directed more by the teacher than by an algorithm.
Beer, D. (2017) The social power of algorithms, Information, Communication & Society, 20:1, 1-13, DOI: 10.1080/1369118X.2016.1216147
Kitchin, R. (2017) Thinking Critically about Researching Algorithms, Information, Communication & Society, 20:1, 14-29, DOI: 10.1080/1369118X.2016.1154087
Knox, J. (2015) Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI: 10.1007/978-981-287-532-7_124-1