Week 10 Review

This is the week I spent some time on the Lifeblog, and hopefully I’ve tidied it up into some semblance of order and sequence.

I enjoyed this course and, in particular, I enjoyed the talent, support and cooperation of my classmates. I felt lucky to be surrounded by such a nice bunch of people. I felt that our class tutor did a good job of weaving, summarising and demonstrating online presence.

I really enjoyed my classmates’ artefacts this week and commented on them via the links below.

Best of luck everyone.

My comments on classmates’ posts this week.

Charles Boyle

“A major ethical issue is that YouTube’s algorithm was not designed to help visitors find videos of what they’re looking for but, according to Chaslot (a former YouTube AI engineer), to get them addicted to YouTube”

My comment (to be approved) link will appear here.

David Yeats

A humorous podcast on his play with his SoundCloud account: he tried to alter the SoundCloud playlist based on titles containing the word “algorithm”. There are hundreds of songs with “algorithm” in the title, but the recommender algorithm was slow to react, possibly because many recommender algorithms still depend on humans to spot a changing trend and adjust the algorithm.

My comment here

Iryna Althukova used her many (26) international connections to enter the same search terms and see if they would get different results in different parts of the world. I was surprised to learn that no matter where you are in the world, with the exception of China, you get the same search results, which are paid for by the big players like Udemy, Codecademy, Coursera and edX, with few local deviations.

My comment here

JB Fallise

Used a shared YouTube account to mess with the algorithm. This emphasised the fact that on shared devices, the algorithm can never really customise the content, because too many people are influencing the algorithm with diverse choices. In terms of educational usage, we will find a digital divide here, where less well-off students who share devices will not get content catered to ‘what the algorithm thinks they need’. It might not be a bad thing!

My comment here

Michael Wolfindale

Played with Coursera and how its algorithm recommended new courses to him based on his choices, and noted that many recommendations were nudging him towards a westernised, Silicon Valley viewpoint of education: these services may end up creating a good fit between you and your media by changing … you.

My comment here

Sean Flowers carried out several plays on several platforms, including Facebook and Amazon.

My comment here


Valerian Muscat – a huge volume of work completed in a short time. He made some important commentary on the concept of nudging, in particular its relevance to education.

My comment here


Week 9 Review

Week 9 was the second week of our algorithmic culture block and the week of our algorithmic play artefact, located here. In the artefact I discuss how Instagram makes suggestions for me based on my search history, my demographics, and my previous likes.

It was also the week we had to leave work and set up home offices; as IT support staff, that meant 12-hour days for my colleagues and me to get through the calls for support from faculty and staff in all aspects of online teaching. As my classmate Sean has mentioned, though, the global situation is a game changer for our sector, leading to huge opportunities.

In a roundup of this week’s blog additions, I pulled in some posts related to my Instagram artefact, such as the post on ‘the AI generator of fictitious faces’, which shows how easy it has become for AI to generate an influencer-type photograph that will get lots of Instagram ‘likes’, based on social norms for ‘what is considered attractive’ – 10,000 fake faces were created by the algorithm over several hours.

In addition, the posts on Instagram’s use of AI here and here demonstrate how technology can no longer be seen as a passive instrument of humans; rather, it is being used as an agent by humans and non-humans to “nudge towards the predetermined path as designed by the algorithm” (Knox et al. 2020).

Given that there is an “increased entanglement of agencies in the production of knowledge and culture” (Knox 2015), it is very hard to drill down to see who is managing or benefiting from the behavioural governance of these algorithms. This is particularly pertinent to education and the factors influencing young people. From a privacy or discrimination point of view, educational transcripts, unlike credit reports or juvenile court records, are currently considered fair game for gatekeepers like colleges and employers, as noted in this post, and information from these can be used to implement unfair bias against potential scholarship entrants or employees.

Algorithmic code is influenced by “political agendas, entrepreneurial ambitions, philanthropic goals and professional knowledge to create new ways of understanding, imagining and intervening” in our decision making (Williamson 2017).

Therefore we should maintain a critical perspective on algorithmic culture and continue to question the “objectivity and authority assumed of algorithms” (Knox 2015), and how they are shaping, categorising and privileging knowledge, places and people. I’m glad to see that there is a growing awareness of algorithms that appear to discriminate against people based on age, race, religion, gender, sexual orientation or citizenship status, as noted in the development of an ‘accountability bill’ in the US. Culpability can be difficult to prove in these cases because companies remain very secretive about how their models work, but it is a start.

References for the readings

Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14-29. DOI: 10.1080/1369118X.2016.1154087

Knox, J. (2015). Algorithmic cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1

Knox, J., Williamson, B., & Bayne, S. (2020). Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45(1), 31-45. DOI: 10.1080/17439884.2019.1623251

Williamson, B. (2017). Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: The Digital Future of Learning, Policy, and Practice.

Older Parents and technology

The global situation means long but rewarding working days for those of us supporting teachers creating online lectures and assessment plans. Glad to help. Take a moment to cheer on our isolated older population as they navigate the internet! #mscedc
https://t.co/SRSwwM9elr