Final Summary – Algorithms and liveblogs!

I decided the best way to summarize how algorithms have influenced my liveblog would be to take a closer look at my fellow students’ conclusions on algorithms and compare them with my own experiences.

Screenshots taken from algorithmic plays of EDC 2020 students






They played with the algorithms of YouTube, Netflix, Google and others, meddling with their recommendation feeds (video, audio, friends, ads, courses). Most interestingly, the MSCEDC students had very similar experiences, and many of their recurring conclusions matched what I had noticed myself.

  • Technology is not the passive “tool” it was considered to be at the origins of Community Cultures (see Knox 2015, page 1). Algorithms still make mistakes, but autonomous reactions to personal behaviour are clearly identifiable.
  • Algorithms are shaping our reality. “AI” is no longer the unknown force it was considered to be during Cybercultures. Humans openly accept machines making decisions for them.

  • Algorithms are in many cases not as objective as originally intended and show clear evidence of bias driven by economic or other interests. Due to their multilayered and highly complex design, the authority behind the algorithm is non-transparent (see Williamson 2017 and Knox 2015, page 1).


  • Although many algorithmic propositions seem misleading, wrong or deficient, there is a suspicion that they are nevertheless intentional, serving hidden interests.
  • The entanglement of human and machine agencies leads to a reduction in cultural diversity and shapes the production of knowledge (Knox 2015).
  • The autonomous learner is assumed as the guiding principle for the design of algorithms.
  • Algorithms provide a predetermined path, supporting a “goggle vision” that reduces the variability of search results and leads to looping.

When developing and implementing my liveblog, I witnessed most of these conclusions myself. I used YouTube, Twitter and Google Search frequently for research, and their algorithms influenced my selection, my reality and therefore the outcome of my liveblog. “You loops” or “echoing” were frequently observable, and inefficient results and the predominance of certain information sources were detectable. Troubleshooting malfunctioning IFTTT algorithms was difficult due to their multilayered connections. This reality-shaping power of algorithms is well described by Kitchin (2017, page 15).
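The “looping” I observed can be illustrated with a toy simulation. The sketch below is entirely hypothetical (the catalogue, the profile and the ranking rule are my own invention), but it shows how a recommender that reinforces whatever a user clicks quickly collapses a feed onto a single topic:

```python
import random

def recommend(catalog, profile, k=3):
    """Rank items by the profile's weight for their topic; return the top k."""
    return sorted(catalog, key=lambda item: -profile[item["topic"]])[:k]

def watch(profile, item):
    """Reinforce the topic of the item the user just watched."""
    profile[item["topic"]] += 1

random.seed(1)
topics = ["news", "music", "sport", "cooking"]
catalog = [{"id": i, "topic": random.choice(topics)} for i in range(40)]
profile = {t: 1 for t in topics}          # no strong preference at the start

for _ in range(20):
    top = recommend(catalog, profile)[0]  # the user always clicks the top pick
    watch(profile, top)

print(profile)  # one topic now dominates the profile: the feed has "looped"
```

After twenty clicks, a single topic carries nearly all the weight while the other three stay where they started, which is exactly the reduced variability the students described.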

Although we tend to struggle with their accuracy and with the missing information about hidden agendas, the use of algorithms has, and will continue to have, the potential to influence learning and teaching.

The algorithm:

  • knows the learner and provides helpful alternatives (“You have skipped this page or stayed longer than average, maybe you want to have a look into …”)
  • shows additional information in whatever format supports learning (“You seem to like this, maybe have a look into …”)
  • changes the interface, course thumbnails, etc. according to user preferences
  • uses educative nudges to steer learners towards favourable decisions (see Knox et al. 2020, p. 39)
  • works in symbiosis with the teacher as a provider of unfiltered or partially filtered information.
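Such behaviour-based nudges could be expressed as simple rules. The sketch below is hypothetical (the thresholds, messages and parameter names are my own invention), but it shows the basic pattern of mapping observed learner behaviour to an educative nudge:

```python
def nudge(time_on_page, average_time, skipped=False, liked_topic=None):
    """Return hypothetical educative nudges based on simple behaviour rules."""
    messages = []
    if skipped or time_on_page < 0.5 * average_time:
        messages.append("You skipped or rushed this page; maybe revisit the basics first.")
    elif time_on_page > 2 * average_time:
        messages.append("You stayed longer than average; maybe have a look into the extra material.")
    if liked_topic is not None:
        messages.append(f"You seem to like {liked_topic}; related resources are available.")
    return messages

# A learner lingers on a statistics page (300s against a 120s average):
for message in nudge(time_on_page=300, average_time=120, liked_topic="statistics"):
    print(message)
```

Real adaptive-learning platforms use far richer models, but the principle — behavioural signal in, nudge out — is the same.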


In order to work with algorithm-based technology, educationalists must maintain a critical perspective. Algorithms are currently shaping culture to a great extent, and education is not excluded. Teachers should therefore accept the new status quo and engage openly with the technology.

Developing a liveblog is a good start!







Kitchin, R. (2017): Thinking critically about and researching algorithms, Information, Communication & Society, 20:1, 14-29.

Knox, J. (2015): Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.).

Knox, J. (2015): Community Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1.

Knox, J., Williamson, B. & Bayne, S. (2020): Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies, Learning, Media and Technology, 45:1, 31-45, DOI: 10.1080/17439884.2019.1623251.

Schmidt, E. and Rosenberg, J. (2018): How Google Works. Retrieved 29.03.2020.

Williamson, B. (2017): Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: the digital future of learning, policy, and practice.

Algorithmic play of all MSCEDC 2020 students. Thank you very much!








End Week Summary 9 – Whose algorithm is it anyway?

As my algorithmic play started to show results (the Netflix algorithm began to advertise pieces of its audio-visual fundus I had never seen before, while hiding others from my view), the play started to connect with the question of the role of algorithm-based AI in education, now and in the future.

Today, digital learning is strongly linked with the idea of increased learning effectiveness and efficiency. Automated algorithms are considered helpful tools that collect huge quantities of data on learners’ behaviour in order to focus educational content on individual weaknesses. The centre of algorithm-based education is the learner and his or her struggle to cope with a given curriculum. But is this all education and teaching are about? The idea of the independent learner, who knows exactly what she or he wants and needs to learn, is a myth. Knox, Williamson and Bayne (2020) provide a consistent assessment of this neoliberal revisioning of the education sector by referring to Biesta’s identification of “learnification” and its implications for the digital education future.

“Not only is the figure of the learner placed at the centre of the educational arrangement, but the individual becomes the site of learning.”

“Learnification is portrayed as blind to broader questions about the role and purpose of education in wider society, (…)”

I see a parallel to the algorithms of the digital world: Netflix uses my online behaviour only to shape my profile more effectively (to make me watch more movies?!?). There is no other interest beyond that.

So will algorithm-based AI embedded in digital education systems, apps or programs replace the teacher in the future? No: the teacher–learner relationship is about more than effectiveness. It is about supporting, guiding, leading, allowing errors, and building and shaping personality, as well as, of course, the achievement of learning targets.

In most applications, the algorithm focuses only on achieving the highest gains, but the question of whose gains remains.

“Neoliberal business philosophies and practices promoted by corporations and their partner foundations, supported by international organizations, financiers, and bankers, and welcomed, or at least tolerated by compliant governments, are trying to transform education from a government responsibility and social right into investment opportunities.” (Huff Post, 2017)



Knox, J., Williamson, B. & Bayne, S. (2020): Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies, Learning, Media and Technology, 45:1, 31-45, DOI: 10.1080/17439884.2019.1623251.

A Netflix Algorithmic Play

I have decided to play with my Netflix account and transform it (with a little help from the Netflix algorithm) into a romantic movie selection.

I am trying to show that Netflix collects tons of data on each user, uses the information to build an individual profile, and develops that profile in a way that raises my interest in watching more and more movies. The algorithm creates a world of its own interpretation and is therefore not objective. Maybe the word “prove” is not fully correct, as this is a well-known fact about Netflix.
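The kind of profile-shaping I am describing can be sketched as a simple weight update. The genre labels, starting weights and learning rate below are all invented for illustration; the point is only that repeatedly watching one genre drags the whole profile towards it:

```python
def update_profile(profile, watched_genre, rate=0.3):
    """Move each genre weight towards 1 if it was just watched, towards 0 otherwise."""
    for genre in profile:
        target = 1.0 if genre == watched_genre else 0.0
        profile[genre] = (1 - rate) * profile[genre] + rate * target
    return profile

profile = {"action": 0.4, "documentary": 0.4, "romance": 0.2}
for _ in range(10):                  # ten sessions of deliberately watching romance
    update_profile(profile, "romance")

top_genre = max(profile, key=profile.get)
print(top_genre, round(profile["romance"], 2))   # → romance 0.98
```

After ten sessions the “romance” weight sits near 1 while the genres I stopped watching decay towards 0, which is roughly what my Netflix home screen did over the course of the play.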

I used Steller to develop a day-to-day screenshot diary and posted the results via Pinterest on Twitter and my liveblog.

Have a look into the mixed results of my algorithmic play.

Link to Pinterest 


or on Steller