- What kind of algorithms have been involved, and how do you perceive them to have operated? How were your activities influenced?
- How do the identified algorithmic operations relate to particular readings from this block? Which ideas from the readings help you to explain what might have been happening?
I mainly used Google, YouTube, and Twitter to feed my lifestream, so search, filtering, and recommendation systems have all been involved. However, despite the hype around 'personalized' algorithms, recommendations based on big data did not work for me. For example, when I searched for and clicked on articles critical of TEL or learning analytics, I rarely found relevant materials. Instead, the advantages of edtech and commercial content were prioritized and placed at the top of the results pages. In this sense, digital data and algorithms may not be free from 'political agendas, commercial interests, entrepreneurial ambitions', as Williamson (2017) claims, and may lead users to click on unrelated materials promoted for the platforms' (Google's, YouTube's) own sake. As Kitchin (2017, 18) puts it, 'algorithms are created for purposes that are often far from neutral: to create value and capital; to nudge behavior and structure preferences in a certain way'.
However, Twitter was a bit different, although its recommended feeds still did not work for me. There, peers' provocative, reflexive ideas enabled me to rethink the themes and search for inspiring articles.
In general, though, because of these impersonal algorithms, I had to dedicate much more time than expected to finding materials that matched my interests. This was hard and time-consuming for me as a non-native English speaker, because I had to read each article in the sequence suggested by the algorithms to identify whether or not it matched my interests, which was not easy.
In this sense, the search and ranking algorithms were obstacles rather than guides for me.
- What does this algorithmic activity potentially mean for notions of learning and teaching, authorship and agency?
I tried not to passively consume the information that algorithms suggested, but to interpret, evaluate, and defy it. Participating in (re)developing algorithms is also important for maintaining human agency over them. Although I am not sure whether it helped, I clicked the 'like' button on YouTube videos and linked articles on Twitter and Facebook that took critical views of digital culture, in the hope that it would make a difference. Because of this agency, I think I share 'co-authorship' of my blog with the writers of the linked articles, as I tried to share, remix, and make meaningful sense of their work.
In this regard, this lifestream blog does not fully represent my learning journey, because the time and effort behind my decisions about what to include and what to leave out remain unseen. So I would describe my blog as a photo album containing photographs captured at single moments during a learning journey.
However, without ideas from the reading lists, peers' activities, and teachers' provoking comments, I would not have had this agency over algorithms. Each fact and idea about digital technology may be right in itself, born with a good purpose, but teaching and learning may be about which values we stand for and how we participate in algorithmic culture as students and educators. Otherwise, agency over algorithms would become the exclusive property of central governments, top universities, and mega-corporations that can access and possess our data, which may exacerbate existing inequalities in our society.
References
Kitchin, R. 2017. Thinking critically about researching algorithms. Information, Communication & Society, 20:1, 14-29. DOI: 10.1080/1369118X.2016.1154087
Williamson, B. 2017. Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: the digital future of learning, policy, and practice. Sage.
Williamson, B. 2019. Automating mistrust. Code Acts in Education.