Final Summary – Algorithms and liveblogs!

I decided that the best way to summarize how algorithms have influenced my liveblog would be to take a closer look at my fellow students’ conclusions on algorithms and compare them with my own experiences.

Screenshots taken from algorithmic plays of EDC 2020 students

They played with the algorithms of YouTube, Netflix, Google and others, and meddled with their recommendation feeds (video, audio, friends, ads, courses). The most interesting finding was that the MSCEDC students had very similar experiences and reached many recurring conclusions about algorithms that mirrored my own observations.

  • Technology is no longer the passive “tool” it was considered to be at the origins of Community Cultures (see Knox 2015, p. 1). Algorithms still make mistakes, but autonomous reactions to personal behaviour are clearly identifiable.
  • Algorithms are shaping our reality. “AI” is no longer the unknown force it was considered to be during Cybercultures; humans now openly accept machines making decisions for them.
  • Algorithms are in many cases not as objective as originally intended and show clear evidence of bias driven by economic or other interests. Due to their multilayered and highly complex design, the authority behind the algorithm is non-transparent (see Williamson 2017 and Knox 2015, p. 1).
  • Although many algorithmic suggestions seem misleading, wrong or deficient, there is a suspicion that they are nevertheless intentional, driven by hidden interests.
  • The entanglement of human and algorithmic agencies leads to a reduction of cultural diversity and shapes the production of knowledge (Knox 2015).
  • The assumption of the autonomous learner serves as a guiding principle for the design of algorithms.
  • Algorithms provide a predetermined path, supporting a “goggle vision” that reduces the variability of search results and leads to looping.

When developing and implementing my liveblog, I witnessed most of these conclusions myself. I used YouTube, Twitter and Google search frequently for research, and their algorithms influenced my selection, my reality and therefore the outcome of my liveblog. “You loops” or “echoing” were frequently observable, and inefficient results and the predominance of certain information sources were detectable. Troubleshooting malfunctioning IFTTT algorithms was difficult because of their multilayered connections. This reality-shaping power of algorithms is well described by Kitchin (2017, p. 15).
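To make this looping effect concrete, here is a minimal, purely illustrative Python sketch of a naive recommender that keeps feeding back whatever topic the user has already clicked on; the catalogue, item names and scoring rule are invented for illustration and are not taken from any real platform:

```python
from collections import Counter

# Hypothetical catalogue: item id -> topic (invented for illustration).
CATALOGUE = {
    "clip_a": "politics", "clip_b": "politics", "clip_c": "politics",
    "clip_d": "cats", "clip_e": "cooking", "clip_f": "travel",
}

def recommend(history, k=1):
    """Naive recommender: prefer items whose topic already dominates
    the click history -- the mechanism behind a 'You loop'."""
    seen_topics = Counter(CATALOGUE[item] for item in history)
    candidates = [item for item in CATALOGUE if item not in history]
    candidates.sort(key=lambda item: seen_topics[CATALOGUE[item]], reverse=True)
    return candidates[:k]

history = ["clip_a"]        # one political clip watched
for _ in range(2):          # each round reinforces the same topic
    history += recommend(history)

print(history)              # ['clip_a', 'clip_b', 'clip_c']
```

Nothing in this toy loop ever reintroduces the topics that were filtered out, which is exactly the reduction of variability and the echoing described above.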

Although we still struggle with accuracy and with missing information about hidden agendas, the use of algorithms has, and will continue to have, the potential to influence learning and teaching.

The algorithm:

  • knows the learner and provides helpful alternatives (“You have skipped this page, or stayed on it longer than average; maybe you want to have a look into …”); a minimal sketch of such a rule follows this list
  • shows additional information, in whatever format, to support learning (“You seem to like this, maybe have a look into …”)
  • changes the interface, course thumbnails, etc. according to user preferences
  • uses educative nudges to steer learners towards favourable decisions (see Knox et al. 2020, p. 39)
  • works in symbiosis with the teacher as a provider of unfiltered or partially filtered information.
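As a thought experiment, here is a minimal Python sketch of the first kind of rule in the list above. The thresholds, page names and suggested alternatives are invented assumptions, not the behaviour of any real learning platform:

```python
# Hypothetical dwell-time nudge: compare a learner's time on a page with an
# assumed cohort average and suggest an alternative resource if it deviates.
AVERAGE_SECONDS = {"unit_3_intro": 180}          # assumed cohort averages
ALTERNATIVES = {"unit_3_intro": "unit_3_video"}  # assumed fallback resources

def nudge(page_id, seconds_on_page):
    """Return a nudge message if dwell time deviates strongly from the
    cohort average, otherwise None."""
    average = AVERAGE_SECONDS.get(page_id)
    alternative = ALTERNATIVES.get(page_id)
    if average is None or alternative is None:
        return None
    if seconds_on_page < 0.25 * average:   # looks like the page was skipped
        return f"You skipped this page, maybe have a look at {alternative}."
    if seconds_on_page > 2.0 * average:    # looks like the learner is stuck
        return f"You stayed here longer than average, maybe {alternative} helps."
    return None

print(nudge("unit_3_intro", 20))    # skipped -> suggests the alternative
print(nudge("unit_3_intro", 200))   # within normal range -> None
```

Even this toy rule makes the earlier point about non-transparent authority visible: whoever sets the thresholds and the list of alternatives decides where the learner is nudged.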

 

In order to work with algorithm-based technology, educationalists must maintain a critical perspective. Algorithms are currently shaping culture to a great extent, and education is not excluded. Therefore, teachers should accept the new status quo and engage openly with the technology.

Developing a liveblog is a good start!

References:

Kitchin, R. (2017): Thinking critically about and researching algorithms. Information, Communication & Society, 20:1, 14-29.

Knox, J. (2015): Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.).

Knox, J. (2015): Community Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. M. A. Peters (ed.). DOI 10.1007/978-981-287-532-7_124-1

Knox, J., Williamson, B. & Bayne, S. (2020): Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45:1, 31-45. DOI: 10.1080/17439884.2019.1623251

Schmidt, E. and Rosenberg, J. (2018): How Google Works. Retrieved from https://www.alexjhughes.com/books/2018/3/11/how-google-works-eric-schmidt-and-jonathan-rosenberg, 29.03.2020.

Williamson, B. (2017): Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: the digital future of learning, policy, and practice.

Algorithmic play of all MSCEDC 2020 students. Thank you very much!

End Week Summary 9 – Whose algorithm is it anyway?

As my algorithm play started to show results, with the Netflix algorithm advertising pieces of its audio-visual repertoire that I had never seen before while hiding others from my view, the play and the question of the role of algorithm-based AI in education (now and in the future) started to connect.

Today digital learning is strongly linked with the idea of increased learning effectiveness and efficiency. Automated algorithms are considered helpful tools that collect huge quantities of data on learners’ behaviour in order to focus educational content on their weaknesses. At the centre of algorithm-based education stands the learner and his or her difficulties in coping with a given curriculum. But is this all that education and teaching are about? The idea of the independent learner, who knows exactly what she or he wants and needs to learn, is a myth. Knox, Williamson and Bayne (2020) provide a consistent assessment of this neoliberal revisioning of the education sector by referring to Biesta’s identification of “learnification” and its implications for the future of digital education.

“Not only is the figure of the learner placed at the centre of the educational arrangement, but the individual becomes the site of learning.”

and

“Learnification is portrayed as blind to broader questions about the role and purpose of education in wider society, (…)”

I see a comparison here to the algorithms of the digital world. Netflix’s algorithms use my online behaviour only to shape my profile and make the service more effective (make me watch more movies?!). There is no interest beyond that.

So will the algorithm-based AI embedded in digital education systems, apps or programs replace the teacher in the future? No; the teacher-learner relationship is about more than effectiveness. It is about supporting, guiding, leading, allowing errors, building and shaping personality, and of course also about the success or achievement of learning targets.

In many applications the algorithm focuses only on achieving the highest gains, but the question of whose gains remains.

“Neoliberal business philosophies and practices promoted by corporations and their partner foundations, supported by international organizations, financiers, and bankers, and welcomed, or at least tolerated by compliant governments, are trying to transform education from a government responsibility and social right into investment opportunities.” (HuffPost 2017)

 

References:

Knox, J., Williamson, B. & Bayne, S. (2020): Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45:1, 31-45. DOI: 10.1080/17439884.2019.1623251

End Week Summary 8 – Algorithm play

Finally we are approaching the future of education. AI-based robots and algorithm-based AI systems have entered and changed many of our private spaces. Knowingly or unknowingly, algorithms react to our every virtual step (every click, stay, purchase or view), create a persona of us and our potential preferences, and advertise objects we may like or, even better, may want to buy.
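To illustrate what such a “persona” might look like in the simplest possible terms, here is a deliberately crude Python sketch; the event log, categories and weights are invented assumptions and not how any of these platforms actually works:

```python
from collections import defaultdict

# Hypothetical event log: (action, item_category) pairs harvested from browsing.
events = [
    ("click", "sneakers"), ("watch", "sneakers"), ("buy", "sneakers"),
    ("click", "thriller"), ("watch", "thriller"),
    ("click", "gardening"),
]

# Assumed weights: a purchase says more about preference than a click.
WEIGHTS = {"click": 1, "watch": 2, "buy": 5}

def build_persona(event_log):
    """Aggregate weighted events into a crude preference profile."""
    persona = defaultdict(int)
    for action, category in event_log:
        persona[category] += WEIGHTS.get(action, 0)
    return dict(persona)

persona = build_persona(events)
print(persona)                          # {'sneakers': 8, 'thriller': 3, 'gardening': 1}
print(max(persona, key=persona.get))    # 'sneakers' -> what gets advertised next
```

Ads and recommendations are then ranked by scores of this kind, which is why the profile keeps reinforcing itself.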

Google, Facebook, Amazon, Netflix and YouTube, but also other platforms like Coursera or edX, collect data about our behaviour on their pages. They sell this massive data collection with the promise of an algorithm that provides us with things we may like, something more personal and individual. And those of us who have compared Google results with a more discreet (non-collecting) search engine like Startpage will have noticed the difference.

Yes, it can be very comfortable to have Google know where you are, where you usually go and what you usually search for, and it can be much more frustrating to do the same with a (mostly) non-learning search engine, but this comfort comes with a price. Google collects and sells your information, Netflix keeps other, possibly good movies out of sight because of your personal preferences, Amazon always offers you similar items, and even the MOOCs you are offered keep coming from the same hosts.

In my algorithm play I demonstrate the power of the Netflix algorithm, which actively guides, influences or even forces me to watch certain movies.

So what does this mean for algorithms, AI and education? Will robot teachers replace human teachers in a (for teachers) dystopian vision of the future? Most argue that this will not be the case, but educationalists need to admit that they have to open themselves up to their new “robo” colleagues, who could (and will) deliver or take over certain activities while others remain with the human teachers.

“Freedom from routine, time-consuming tasks will allow teachers to devote more of their energies to the creative and very human acts that provide the ingenuity and empathy needed to take learning to the next level.” (Luckin et al. 2016, p. 31)

But the education sector needs to understand, criticise and work with these algorithm-driven AI systems much more systematically than it does currently.

 


End Week Summary 2 – Embrace the AI?


The question of cyber-, community and algorithmic cultures continued to be an essential aspect. Algorithmic cultures are a striking emerging subject, while we still have not moved beyond the separating cybercultures and the tool-providing community cultures. How do we deal with algorithm-based decision-making? We already accept it in our Google searches and in many other apps, but when the algorithm starts to decide about your future, that is another matter.

For digital education there are many fields of development and change, but the most prominent is the discussion about using AI in the classroom.

 

In China the next stage of technical development has already begun with the use of AI for learning and teaching. While in Germany, at least, the discussion about the use of AI and the limits of data collection is fought very controversially, other countries are already in the piloting phase.

Michael from the #mscedc course found a nice article which sums up this discussion in a table. I would like to highlight that there is always the fundamental question of good and bad (for learning), just as in the movies we watched. So is technology, the technology we use and how we use it, good or bad for learning?

The movie “The Intelligence Explosion” demonstrated the distrust of algorithm-based AI and cyborgs among those more concerned with ethics. But even its developers are looking for ways to make “Gunther” more human, be it to make the AI better or to increase its acceptance in society.

For digital learning in the new century, AI-based learning software, and the way teachers, learners and developers handle it, are crucial for an efficient use of technology.

References: 

Knox, J. (2015): Critical education and digital cultures. In M. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. Springer, pp. 1-6.

Hao, K. (2019, August): https://www.technologyreview.com/s/614057/china-squirrel-has-started-a-grand-experiment-in-ai-education-it-could-reshape-how-the/

The Guardian (2017): The Intelligence Explosion. https://www.youtube.com/watch?v=-S8a70KXZlI