Week 10 Review

This is the week I spent some time on the Lifeblog, and hopefully I have tidied it up into some semblance of order and sequence.

I enjoyed this course and, in particular, the talent, support and cooperation of my classmates. I felt lucky to be surrounded by such a nice bunch of people. I felt that our class tutor did a good job of weaving, summarising and demonstrating online presence.

I really enjoyed my classmates’ artefacts this week and commented on them via the links below.

Best of luck everyone.

My comments on classmates’ posts this week:

Charles Boyle

“A major ethical issue is that YouTube’s algorithm was not designed to help visitors find videos of what they’re looking for but, according to Chaslot (a former YouTube AI engineer), to get them addicted to YouTube”

My comment (to be approved) link will appear here.

David Yeats

A humorous podcast on his play with his SoundCloud account: he tried to alter the SoundCloud playlist recommendations by filling it with tracks that had ‘algorithm’ in the title. Despite hundreds of songs with ‘algorithm’ in the name, the recommender algorithm was slow to react, possibly because many recommender algorithms still depend on humans to spot a changing trend and adjust the algorithm.

My comment here

Iryna Althukova used her many (26) international connections to enter the same search terms and see whether they would get different results in different parts of the world. I was surprised to learn that no matter where you are in the world, with the exception of China, you get much the same search results, paid for by the big players like Udemy, Codecademy, Coursera and edX, with few local deviations.

My comment here

JB Fallise

Used a shared YouTube account to mess with the algorithm. This emphasised the fact that on shared devices, the algorithm can never really customise the content, because too many people are influencing it with diverse choices. In terms of educational usage, we will find a digital divide here, where less well-off students who share devices will not get content catered to ‘what the algorithm thinks they need’. It might not be a bad thing!

My comment here

Michael Wolfindale

Played with Coursera and how its algorithm recommended new courses to him based on his choices, and noted that many recommendations were nudging him towards a westernised, Silicon Valley viewpoint of education: these services may end up creating a good fit between you and your media by changing … you.

My comment here

Sean Flowers did several plays on several platforms, including Facebook and Amazon.

My comment here

 

Valerian Muscat – a huge volume of work completed in a short time. He made some important commentary on the concept of nudging, and in particular its relevance to education.

My comment here

 

Week 8 Review

During week 8, I began with readings on algorithmic cultures and education, several of which are discussed below. I also started the ‘playing with algorithms’ task. I chose to look at Instagram and how the Instagram feed tells me what to like and who to follow. Lastly, I looked at musings from the web that might help me understand discussions around algorithms and how they shape our everyday learning lives.

Review of the main readings:

The Williamson (2017) and Knox (2015) articles gave a good overview of the operation of web algorithms and the ways these automated, non-human agents influence contemporary educational practices.

Both papers underlined how complex a job it is to critically analyse algorithms and see how they are influencing us, because there are so many interweaving agendas – “Businesses with products to sell, venture capital firms with return on investment to secure, think tanks with new ideas to promote, policy makers with problems to solve and politicians with agendas to set have all become key advocates for data driven education” (Williamson 2017).

Kitchin follows up on this theme, explaining that algorithms are usually “woven together with hundreds of other algorithms to create algorithmic systems, and the rules generated by them are compressed and hidden”. They are “works of collective authorship, made, maintained, and revised by many people with different goals at different times” (Kitchin 2017), and they are embedded in complex socio-technical assemblages. Therefore we do not encounter algorithmic generative rules in a clear manner, or in a way that makes them easy to understand.

In their paper on machine behaviourism and future visions of ‘learnification’ and ‘datafication’ (Knox et al. 2020), the authors explain the growing influence of behavioural psychology in the educational sector and how it interacts with datafication and machine learning to nudge education towards new forms of behavioural governance. The Knox et al. paper did a good job of defining terms such as digital choice architectures, behavioural psychology, behavioural economics, machine learning and learnification, and these are now added to my terminology page. Behavioural governance can work “against notions of student autonomy and participation, seeking to intervene in educational conduct and shaping learner behaviour towards predefined aims” (Knox et al. 2020).

The Knox et al. (2020) paper covers a huge amount of ground as it looks into the future of datafication. They explain that learnification theory (Biesta 2015), where the learner is the (potential) consumer whose needs are being met by ‘education’, will soon be less dominant in education.

With the rise of datafication, the learner is becoming more ‘modelled’, and with that grows the ability of machine learning to ‘predict’ the learner. When one can predict a human’s next steps, it becomes easier to manipulate those next steps. To compound the issue, learners come into education not really knowing what their preferences are, and are therefore easier to nudge towards the predetermined path designed by the algorithm. Knox et al. state: “Here, learners are assumed to respond directly to what the dashboard reveals, rather than evoking some kind of consumerist desire.”

Knox et al. (2020) describe this as a ‘crucial shift’ away from Biesta’s learnification model. In the future we will become more influenced by behavioural psychology and by algorithmic generative rules (Kitchin 2017) that nudge us towards ‘correct’ forms of performance and conduct that have already been decided (Knox et al. 2020).
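To help myself picture this ‘predict then nudge’ idea, I put together a tiny, purely hypothetical sketch (not taken from Knox et al. or from any real platform): a dashboard that ranks the ‘next activities’ it shows a learner by the platform’s own predicted completion probability, so the learner responds to choices that have already been shaped for them.

```python
# A purely hypothetical sketch of a 'digital choice architecture': the dashboard
# only surfaces the activities its model predicts the learner will complete,
# so the choices on offer are shaped before the learner ever sees them.
# None of this is taken from Knox et al. or from any real platform.
from dataclasses import dataclass
from typing import List

@dataclass
class Activity:
    name: str
    predicted_completion: float  # the platform's prediction for this learner, 0..1

def dashboard_recommendations(activities: List[Activity], top_n: int = 3) -> List[str]:
    """Rank activities by the model's prediction, not by anything the learner chose."""
    ranked = sorted(activities, key=lambda a: a.predicted_completion, reverse=True)
    return [a.name for a in ranked[:top_n]]

activities = [
    Activity("Watch lecture video 4", 0.91),
    Activity("Post a discussion question", 0.34),
    Activity("Attempt the graded quiz", 0.78),
    Activity("Read the optional paper", 0.22),
]
print(dashboard_recommendations(activities))
# ['Watch lecture video 4', 'Attempt the graded quiz', 'Post a discussion question']
```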

Activity online: critical research on machine learning can be negative, so it was nice to find this article from Data Science for Social Good, which demonstrates a positive use of machine learning for social good and education. It describes the development of an algorithm that assigns a fire risk score to each property on the fire department’s inspection list.
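I don’t have access to the project’s actual model, but a minimal sketch of what such a fire-risk scoring algorithm could look like (with invented feature names and made-up data) helped me picture the idea: train a classifier on past inspection outcomes, then rank the inspection list by predicted risk.

```python
# A minimal, purely illustrative sketch of a fire-risk scoring model.
# This is NOT the Data Science for Social Good project's code: the feature
# names and the tiny dataset below are invented for illustration only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records:
# [building_age_years, past_violations, floor_area_m2, years_since_inspection]
X_train = np.array([
    [80, 5, 1200, 6],
    [10, 0,  300, 1],
    [55, 2,  800, 4],
    [ 5, 0,  450, 1],
    [95, 7, 2000, 8],
    [30, 1,  600, 2],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = fire incident occurred, 0 = none

# Fit a simple classifier whose predicted probability serves as the risk score
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Score each property on the inspection list and visit the riskiest first
inspection_list = np.array([
    [70, 3, 900, 5],
    [15, 0, 350, 1],
])
risk_scores = model.predict_proba(inspection_list)[:, 1]
for features, score in sorted(zip(inspection_list.tolist(), risk_scores),
                              key=lambda pair: pair[1], reverse=True):
    print(f"features={features} fire_risk={score:.2f}")
```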

The ‘bias of algorithms’ article here reminds me that “We must not assume the transparency and necessity of automation” (Knox 2015), and to maintain a “more general, critical account of algorithms, their nature and how they perform work” (Kitchin 2017).

On the article about ‘kids growing up with algorithms’: Kitchin describes this kind of algorithmic play in his paper, discussing how he would like to see us “explore the ways in which people resist, subvert and transgress against the work of algorithms, and re-purpose and re-deploy them for purposes they were not originally intended”. Kids would be the best kind of subversive players, I think.

 

References for the readings

Kitchin, R. (2017) Thinking critically about and researching algorithms. Information, Communication & Society, 20:1, 14-29. DOI: 10.1080/1369118X.2016.1154087

Knox, J. (2015) Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory, M. A. Peters (ed.). DOI: 10.1007/978-981-287-532-7_124-1

Knox, J., Williamson, B. & Bayne, S. (2020) Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies. Learning, Media and Technology, 45:1, 31-45. DOI: 10.1080/17439884.2019.1623251

Williamson, B. (2017) Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: the digital future of learning, policy, and practice.

Week 7 Review

This week we concentrated on completing the micro ethnography on ‘community membership’ inside a MOOC.

The ethnography: I placed my ethnography here and was glad to see that some classmates commented on it. As noted in my comments, my chosen MOOC had been laid out in a clear and sequential way: the curriculum was set, students had a clear path of progression, and discussions were optional. However, this strict structure did not allow much room to manoeuvre in terms of co-creation, collaboration, teaching presence and social presence.

Whilst a MOOC has much to offer in terms of breaking down barriers to accessing education, allowing increased class sizes and exposure to many cultures, the MOOC does not straightforwardly deliver education in the way that many institutions are promising. Missing is much of the educational experience of a full-time online course. Aspects of the community of inquiry model so important to the creation of an online culture, for example sharing personal meaning, collaboration, connecting ideas and exchanging information, are very difficult to achieve on a MOOC. There are exceptions to this if you can intrinsically motivate your students to participate. The interplay between the extrinsic forces acting on persons and the intrinsic motives and needs inherent in human nature is the territory of Self-Determination Theory. When a MOOC achieves the delicate balance of convincing students that they want to participate, then that MOOC is on to something.

Peer Interactions & ethnographies:  I spent some time in the last day or two looking at my classmates ethnographies which were as broad and diverse as a vibrant Arabian marketplace. The quality of the artefacts makes me quite proud of being part of such a talented group. It was interesting to see how different people focused on both specific interactions and/or broad scope.

I will continue to comment on classmates’ ethnographies as they go up, but the comments I have already made on their work are at the links below.

https://edc20.education.ed.ac.uk/jjack/2020/03/02/ethnographic-object/

https://edc20.education.ed.ac.uk/msiegenthaler/2020/03/02/micro-ethnography/

https://edc20.education.ed.ac.uk/dyeats/2020/02/27/micro-netnographic-artefact-community-pushing-through-the-cracks/

https://edc20.education.ed.ac.uk/mwolfindale/2020/02/28/micro-ethnography-entangled-communities/

https://edc20.education.ed.ac.uk/vmuscat/2020/03/01/micro-netnography-artefact/

https://edc20.education.ed.ac.uk/jkwon/2020/03/02/mscedc-this-is-the-link-of-my-microethnography-but-i-am-a-bit-embarrassed-having-seen-other-classmates-wonderful-outcomes-anyway-i-hope-everyone-enjoyed-this-artefact-https-t-co-jnmmjh2/

AdriVivio has shared a tweet

@ThomasReinhard7 These are all excellent strategies and would work very well in class sizes of 30 or less. These methods offer a personalised experience and happier students. The amount of forum weaving for teachers might make it difficult for big classes #mscedc.

AdriVivio – reply on Twitter

@DavidYeats3 Ed tech won’t solve the problem of academics administering high scores. Cash strapped Uni’s are competing for students; higher scores minimise dropout rates. Graduates though, are not meeting the needs of employers. Apprenticeships might work better. #mscedc http://twitter.com/AdriVivio/status/1234472606776135682