[ Week 10 ] Final Summary: how algorithms have influenced my lifestream blog

  • What kind of algorithms have been involved, and how do you perceive them to have operated? How were your activities influenced?
  • How do the identified algorithmic operations relate to particular readings from this block? Which ideas from the readings help you to explain what might have been happening?

I mainly used Google, YouTube, and Twitter to feed into my lifestream, so searching, filtering and recommending systems have been involved. However, despite the hype around 'personalized' algorithms, recommendations based on big data did not work for me. For example, when I searched for and clicked on articles critical of TEL or learning analytics, I rarely found relevant materials. Instead, the advantages of edtech and commercial content were prioritized and placed at the top of the results pages. In this sense, digital data and algorithms may not be free from 'political agendas, commercial interests, entrepreneurial ambitions', as Williamson (2017) claims, and may lead us to click on unrelated materials promoted for the platforms' (Google, YouTube) own sake. As Kitchin (2017, 18) puts it, 'algorithms are created for purposes that are often far from neutral: to create value and capital; to nudge behavior and structure preferences in a certain way'.

However, Twitter was a bit different, although its recommended feeds still did not work for me. There, my peers' thought-provoking, reflexive ideas enabled me to rethink the themes and search for inspiring articles.

In general, though, because of these impersonal algorithms, I needed to dedicate much more time than I expected to finding materials that matched my interests. It was hard and time-consuming for me as a non-native English speaker, because I had to read each article in the sequence suggested by the algorithms to identify whether or not it matched my interests, which was not easy.

In this sense, the searching and ranking algorithms were, in effect, obstacles rather than guides for me.

  • What does this algorithmic activity potentially mean for notions of learning and teaching, authorship and agency?

I tried not to passively consume the information that algorithms suggested, but to interpret, evaluate and resist it. Participating in (re)developing algorithms is also important in terms of human agency over algorithms. Although I am not sure whether it helped, I clicked the 'like' button on YouTube videos and linked articles on Twitter and Facebook that took critical views of digital culture, and I hope it made a difference. Because of this agency, I think I share 'co-authorship' of my blog with the writers of the linked articles, as I tried to share, remix, and make meaningful sense of their work.

In this regard, this lifestream blog does not fully represent my learning journey, because the time and effort behind my decisions about what to include and what to leave out remain unseen. So I would describe my blog as a photo album containing photographs captured at single moments during a learning journey.

However, without ideas from the reading lists, my peers' activities and my teachers' provoking comments, I would not have had this agency over algorithms. Regarding digital technology, each fact and idea may be right and born with a good purpose, but teaching and learning may be about which values we should stand for and how we participate in algorithmic culture as students and educators. Otherwise, agency over algorithms would become the exclusive property of central governments, top universities, and mega-corporations that can access and possess our data, which may exacerbate the existing inequalities in our society.


References

Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14-29. DOI: 10.1080/1369118X.2016.1154087

Williamson, B. (2017). Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data in Education: The digital future of learning, policy and practice. Sage.

Williamson, B. (2019). Automating mistrust. Code Acts in Education.

Comment on Algorithm Play by jkwon

Thank you for sharing your artifact, Monica!
Your point, drawing on Pariser's idea that media is shaping us, is especially inspiring to me.
Given that algorithms are unseen and black-boxed, we are unlikely to be informed about which factors are at work in recommendation systems. So it may be almost impossible to determine who should be held accountable, and we are becoming passive consumers of information. In school settings, teachers can intervene in the educational process, but what about informal education outside classrooms? While playing with YouTube, Pinterest, and Google without critical thinking, we may be led by unseen power relations that reinforce their own interests.

from
Comments for Monica Siegenthaler’s EDC lifestream https://ift.tt/2UcLKWJ
via IFTTT

Comment on Algorithmic play by jkwon

I love your artifact, thank you for sharing, Susanne!
While doing this algorithmic play, I had the same experiences as you.
I never clicked on or watched coronavirus videos, because I am Korean and living in China at the moment, so I am tired of information from Korea and China. But YouTube and Twitter kept suggesting coronavirus-related videos and tweets to me, so I just assumed the topic was trending. Yet although you searched for coronavirus on Google, not much content popped up, which is strange.
Given that algorithms are black-boxed, I cannot help suspecting that social media algorithms may simply make us click more, regardless of context, so that the platforms can strengthen their positions in the internet world.

from
Comments for Susanne MacLeod’s EDC lifestream https://ift.tt/33CrN05
via IFTTT

Comment on block 3 artefact – my Eurafrican Youtube algorithmic play by jkwon

I really enjoyed your artifact, JB!
While doing my algorithmic play, I also felt that the digital divide is likely affecting search results. People (or countries) who have stable access to the internet can participate more in watching and creating videos, which leads to biased search results. Some countries do not allow their citizens to access YouTube, Google, etc., and students in less privileged areas lack digital devices, so their interests may be reflected less than their counterparts'. That is why we should not just be consumers of the results of algorithms, but participants in the whole process of developing them.
Thank you for sharing your artifact!

from
Comments for JB’s EDC lifestream https://ift.tt/2TVCU0z
via IFTTT

[ Week 9 Summary ] Human involvement is needed in algorithmic culture

This article reports on how learning analytics are used, considering the racial bias and limitations embedded in algorithms. Georgia State University has not only introduced learning analytics but also tripled the number of student advisers, so that humans, not just data and algorithms, can actively intervene and consult with students in person. However, these efforts are not enough, because unseen but powerful forces lie behind and are embedded in algorithms.

Whether a certain search result remains or disappears is decided by for-profit corporations that can collect and access huge volumes of data, and school administrations may use unproven algorithms to monitor students, which may have more unequal, stigmatizing effects on them. In addition, if some students can use tech-powered personalized learning while students in low-income areas cannot, it may perpetuate injustice in the (re)development of algorithms, which ignores a variety of contexts, as well as in access to fancy devices and software. This problem of inequality also applies in the contrary case: if personalized learning is applied only to students from less privileged backgrounds to compensate for a lack of teachers, then they would lose the luxury of face-to-face interaction with teachers.

Perhaps the most important feature of datafication and algorithmic culture is their 'unseen' character, which can also be used for hidden control, so it is not enough to intervene only in the interpreting process. We should participate in remaking algorithms to make them fairer and more meaningful, teach our students algorithmic literacy, and claim our rights over algorithms.

Under a Watchful Eye: Colleges are using big data to track students in an effort to boost graduation rates, but it comes at a cost

Do data really matter to us when they are collected from our past behaviors and from other people? They may ignore our strong will and the various contexts we are located in. I think this article shows that human involvement is essential to interpret the results of algorithms' workings. In this regard, the results of learning algorithms can never be final results, but only the starting point for educating our students.

#mscedc #algorithm #education https://t.co/KEKJXyqcyS

— Kwon Jiyoung (@KwonJiy54151403) March 4, 2020

from Twitter https://twitter.com/KwonJiy54151403

March 04, 2020 at 06:55PM
via IFTTT

10 things we should all demand from Big Tech right now https://t.co/SYkUz0rVxd via @voxdotcom #mscedc #algorithmic culture #ethical issues #what we need in algorithmic world

from Twitter https://twitter.com/KwonJiy54151403

March 05, 2020 at 10:16PM
via IFTTT