Final Summary – Algorithms and liveblogs!

I decided that the best way to summarize how algorithms have influenced my liveblog would be to take a closer look at my fellow students’ conclusions on algorithms and compare them with my own experiences.

Screenshots taken from algorithmic plays of EDC 2020 students

They played with the algorithms of YouTube, Netflix, Google and other platforms and meddled with their recommendation feeds (video, audio, friends, ads, courses). The most interesting finding was that MSCEDC students had very similar experiences and reached recurring conclusions about algorithms that matched my own observations.

  • Technology is not the passive “tool” it was considered to be at the origins of Community Cultures (see Knox 2015, p. 1). Algorithms still make mistakes, but autonomous reactions to personal behaviour are identifiable.
  • Algorithms are shaping our reality. “AI” is no longer the unknown force it was considered to be during Cybercultures. Humans openly accept machines making decisions for them.
  • Algorithms are in many cases not as objective as originally intended and show clear evidence of bias driven by economic or other interests. Due to their multilayered and highly complex design, the authority behind the algorithm is non-transparent (see Williamson 2017 and Knox 2015, p. 1).
  • Although many algorithmic suggestions seem misleading, wrong or deficient, there is a suspicion that they are nevertheless intentional and serve hidden interests.
  • The entanglement of agencies leads to a reduction of cultural diversity and shapes the production of knowledge (Knox 2015).
  • The autonomous learner is assumed as a guiding principle for the design of algorithms.
  • Algorithms provide a predetermined path, supporting a “goggle vision” that reduces the variability of search results and leads to looping.

When developing and implementing my liveblog, I witnessed most of these conclusions myself. I used YouTube, Twitter and Google Search frequently for research, and their algorithms influenced my selection, my reality and therefore the outcome of my liveblog. “You Loops” or echoing were frequently observable, and inefficient results and the predominance of certain information sources were detectable. Troubleshooting malfunctioning IFTTT algorithms was difficult due to their multilayered connections. This reality-shaping power of algorithms is well described by Kitchin (2017, p. 15).
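To make the “You Loop” effect concrete, here is a minimal, purely illustrative Python sketch. The topics, the click behaviour and the weighting rule are invented assumptions, not how YouTube or Google actually work; the point is only that a recommender which reinforces nothing but past clicks narrows the feed very quickly.

```python
import random
from collections import Counter

# Toy illustration of a "You Loop": a recommender that only reinforces past
# clicks narrows the feed to one or two topics. Topics, click behaviour and
# weighting are invented assumptions, not any real platform's system.
TOPICS = ["algorithms", "cats", "politics", "music", "cooking"]

def recommend(click_history, k=3):
    """Weight each topic by how often it was clicked before (+1 smoothing)."""
    counts = Counter(click_history)
    weights = [counts[topic] + 1 for topic in TOPICS]
    return random.choices(TOPICS, weights=weights, k=k)

history = []
for _ in range(20):
    suggestions = recommend(history)
    history.append(suggestions[0])  # the user simply clicks the first suggestion

print(Counter(history))  # one or two topics quickly dominate the whole feed
```

Even in this toy version, after a handful of iterations the feed is dominated by whichever topic happened to be clicked first, which is exactly the looping and reduced variability described above.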

Although we tend to struggle with their accuracy and with the missing information about hidden agendas, the use of algorithms has – and will continue to have – the potential to influence learning and teaching.

The algorithm:

  • knows the learner and provides helpful alternatives (“You have skipped or stayed longer than average on this page, maybe you want to have a look into …”); a minimal sketch of such a rule follows this list
  • shows additional information in whatever format supports learning (“You seem to like this, maybe have a look into …”)
  • adapts the interface, course thumbnails, etc. to user preferences
  • uses educative nudges to make learners take favorable decisions (see Knox et al. 2020, p. 39)
  • works in symbiosis with the teacher as provider of unfiltered or partially filtered information.
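
As a purely hypothetical sketch of the first and fourth points above (the page names, thresholds and messages are invented, not taken from any real learning platform), such a rule-based nudge could look like this:

```python
# Hypothetical rule-based "educative nudge": if a learner stays much longer
# than average on a page, suggest supporting material; if they skip it,
# offer a summary. All names, thresholds and messages are invented.
AVERAGE_TIME_ON_PAGE = 120  # seconds; an assumed course-wide average

RELATED_MATERIAL = {
    "statistics-intro": "the video 'Refresher on basic statistics'",
    "python-loops": "the exercise 'Loops step by step'",
}

def nudge(page_id, seconds_on_page):
    """Return a nudge message, or None if no intervention seems needed."""
    if seconds_on_page > 2 * AVERAGE_TIME_ON_PAGE and page_id in RELATED_MATERIAL:
        return ("You stayed longer than average on this page, "
                "maybe you want to have a look at " + RELATED_MATERIAL[page_id] + ".")
    if seconds_on_page < 0.2 * AVERAGE_TIME_ON_PAGE:
        return "You skipped this page, do you want a short summary instead?"
    return None

print(nudge("statistics-intro", 300))  # slow reader: suggest a refresher
print(nudge("python-loops", 10))       # skipped page: offer a summary
```

The teacher’s role in the last point would then be to decide which of these automated suggestions are shown unfiltered and which are curated first.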

 

In order to work with algorithm-based technology, educationalists must maintain a critical perspective. Algorithms are currently shaping culture to a great extent, and education is not excluded. Therefore, teachers should accept the new status quo and engage openly with the technology.

Developing a liveblog is a good start!

References:

Kitchin, R. (2017): Thinking critically about and researching algorithms. Information, Communication & Society, 20:1, 14-29.

Knox, J. (2015): Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.).

Knox, J. (2015): Community Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1

Knox, J., Williamson, B. & Bayne, S. (2020): Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45:1, 31-45. DOI: 10.1080/17439884.2019.1623251

Schmidt, E. and Rosenberg, J. (2018): How Google Works. Retrieved from https://www.alexjhughes.com/books/2018/3/11/how-google-works-eric-schmidt-and-jonathan-rosenberg, accessed 29.03.2020.

Williamson, B. (2017): Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: the digital future of learning, policy, and practice.

Algorithmic play of all MSCEDC 2020 students. Thank you very much!

End Week Summary 8 – Algorithm play

We are finally approaching the future of education. AI-based robots and algorithm-based AI systems have entered and changed many of our private spaces. Knowingly or unknowingly, algorithms react to our every virtual step, click, stay, purchase or view, and create a persona of us and our potential preferences in order to advertise things we may like or, even better, may want to buy.

Google, Facebook, Amazon, Netflix and YouTube, but also other platforms like Coursera or edX, collect data about our behaviour on their pages. They sell their idea of massive data collection with algorithms that provide us with things we may like, something more personal and individual. And yes, those of us who have compared Google results with a more discreet (non-collecting) search engine like Startpage will have noticed the difference.

Yes, it can be very comfortable to have Google know where you are, where you usually go and what you usually search for, and it can be much more frustrating to do the same with a (mostly) non-learning search engine, but this comes at a price. Google collects and sells your information, Netflix keeps other, possibly good, movies from you because of your personal preferences, Amazon always offers you similar items, and even the MOOCs you are offered come from the same hosts.
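To illustrate the kind of narrowing described above, here is a small, invented sketch; the titles, genres and scoring rule are assumptions for this example, not Netflix’s actual system. Once a watch history is dominated by one genre, everything else sinks to the bottom of the recommendation list.

```python
from collections import Counter

# Toy illustration, not any real recommender: titles, genres and the scoring
# rule below are invented. A profile built only from watch history pushes
# everything outside the usual genres down the list.
CATALOGUE = {
    "Space Drama": "sci-fi",
    "Robot Wars": "sci-fi",
    "Quiet Garden": "documentary",
    "Street Food": "documentary",
    "Laugh Track": "comedy",
}

def rank(watch_history):
    """Order the catalogue by how often each title's genre was watched before."""
    genre_counts = Counter(CATALOGUE[title] for title in watch_history)
    return sorted(CATALOGUE, key=lambda title: genre_counts[CATALOGUE[title]], reverse=True)

history = ["Space Drama", "Robot Wars", "Space Drama"]
print(rank(history))  # sci-fi first; documentaries and comedy sink to the bottom
```

The point is not the specific scoring but the feedback loop: the more the profile reflects past choices, the less visible the “maybe good” alternatives become.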

In my algorithm play I demonstrate the power of the Netflix algorithm, which actively guides, influences or even forces me to watch certain movies.

So what does this have to do with algorithms, AI and education? Will robot teachers replace human teachers in a (for teachers) dystopian vision of the future? Most argue that this will not be the case, but educationalists need to admit that they have to open themselves up to their new “robo” colleagues, who could (and will) deliver or take over certain activities while others remain with the human teachers.

Freedom from routine, time-consuming tasks will allow teachers to devote more of their energies to the creative and very human acts that provide the ingenuity and empathy needed to take learning to the next level. (Luckin et al. 2016, p.31).

But the education sector needs to understand, criticize and work with these algorithm-driven AI systems much more systematically than it currently does.

 
