Michael commented on Adrienne O Mahoney’s EDC lifestream – ‘Algorithmic Play Artefact’

Algorithmic Play Artefact

Michael Wolfindale:

Great artefact, Adrienne, and really like the annotated screencast format – works brilliantly with the subject matter!

Fascinating (and perhaps sobering!) how you pinned down aspects which were influenced by activity outside of your Instagram account, such as your search history. Reminds me of a quote from Seaver (2013: 10) I came across which builds on the “black box metaphor” to argue that ‘these algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them’:

Seaver, N., 2013. Knowing Algorithms. Presented at Media in Transition 8, Cambridge, MA. Available from: https://static1.squarespace.com/static/55eb004ee4b0518639d59d9b/t/55ece1bfe4b030b2e8302e1e/1441587647177/seaverMiT8.pdf [Accessed 11 Mar 2020].

It’s great that you’ve discussed multiple algorithmic systems and the wider context in that sense since, as you also note, it is all too often the case that a handful of big tech companies own many of these different “apps” (such as Facebook owning Instagram). As you note, drawing on Williamson (2017), the business models, entrepreneurial cultures and commercial and political agendas are all a huge factor here. The Silicon Valley model and its associated ideologies also came to the fore in my brief “play” with Coursera. Really clear and thorough conclusions here!

Great work – really enjoyed it!

Michael commented on Susanne MacLeod’s EDC lifestream – ‘Algorithmic play’

Algorithmic play

Michael Wolfindale:

Great artefact, Susanne, and really like the scrolling story/presentation style – very appropriate to the endless scrolling we often do on social media!

Interesting to see the kids’ videos appearing, and that you’ve tracked it down to what was perhaps an algorithm change. You also reflect on the apparently random nature of some of the results. This all appears to speak to the ‘emergent and constantly unfolding’ (and sometimes random) nature of algorithms that Kitchin (2017: 21) discusses.

As you point out, the non-transparent, “black boxed” nature of algorithms – obscured by technical aspects inaccessible to many, and further complicated by the messy network of different connections, inputs and reactions, not to mention the surrounding social aspects, commercial agendas, ideologies and so on – makes it very difficult to research algorithmic systems (Kitchin 2017: 21).

Your artefact really highlighted this, and it has been a thought-provoking and reflective piece of work – thank you!

Michael commented on Charles’s EDC lifestream – ‘Week 9 – Algorithmic Play Artefact’

Week 9 – Algorithmic Play Artefact

Michael Wolfindale:

Really interesting explorations, Charles, and great screenshots/videos and commentary!

It’s fascinating to see you reflect on how our devices may be “listening in” on us. The numerous reports in the media about this have terrified me somewhat.

It’s also interesting how the algorithm picked up on your MOOC-related and other activity on this course, making assumptions about your interests, but then mixed this with seemingly random results. Yet, as you point out, YouTube is perhaps not even that concerned with relevance, so much as with people becoming addicted and driving traffic to the service (so that “engagement” statistics can be sold to advertisers). This reminds me of a podcast/article I came across a while back, which argues that crude behaviourist psychology is deployed here, comparing it to gambling – which perhaps relates to the ‘machine behaviourism’ discussed in the Knox et al. (2020) paper.

Great artefact, lots of food for thought – thank you!

Michael commented on Val Muscat’s EDC lifestream – ‘Algorithmic Play Artefact’

Algorithmic Play Artefact

Michael Wolfindale:

What a great artefact, Val! So much rich detail around the different algorithmic systems you explored, insightful commentary and really well presented – great visuals!

It’s particularly fascinating to see you reflect on how the models of these algorithmic systems might be applied to education, potentially attempting to “transform” it according to the “Silicon Valley model”.

You also mentioned Coursera, which was the main focus of my artefact. It was perhaps unclear there why certain courses were being pushed to the top of the list – whether someone was sponsoring the ranking, and/or whether the number of enrolments was a significant factor. Certainly, there was a prevalence of “tech” courses in any case, and of disciplines which perhaps tend to lead to higher earnings. All of this is perhaps concerning – arguably, just because a Netflix movie is less “watched” or makes less money doesn’t make it any less valid, and the same applies to educational courses. However, you can imagine, in an educational setting, the courses that “lead to higher earnings”, have higher student numbers and so on being pushed to the top in this model, as you mention. This is something we can already see to a degree in university rankings, although perhaps framed by a more explicit political agenda, whereas the corporate Silicon Valley model (and its commercial/political agendas) often seems to be “hidden” underneath slick user interfaces and the problematic notion of “making the world a better place” for all.

Great artefact and very thought provoking – thank you!

Michael commented on Jiyoung Kwon’s EDC lifestream – ‘This is my algorithmic play with youtube’

This is my algorithmic play with youtube, I hope you all have enjoyed this activity! https://t.co/zvHmW3n2Ue #mscedc

Michael Wolfindale:

Fantastic artefact, Jiyoung, and loved the way you presented it through a scrolling timeline (which reminded me somewhat of the feeds we’re infinitely scrolling through!).

Really interesting how you mentioned ‘expansion with no direction’. It reminded me of a post on the Twitter blog praising a change to their own algorithm which resulted in more “engagement” (more tweets/retweets):
https://blog.twitter.com/official/en_us/a/2016/never-miss-important-tweets-from-people-you-follow.html

Supposedly, this is ‘great for everyone’, though perhaps what they mean is that it is great commercially, for promoting these figures to their advertisers and so on. Perhaps their pursuit of expansion is motivated predominantly by profit rather than the interests of those using Twitter, all framed in a Silicon Valley notion of ‘progress’. These commercial motivations are a danger for learning analytics too, which you highlight well.

Great work – thank you for an insightful and well presented artefact!

Michael commented on Jon Jack’s EDC lifestream – ‘Algorithmic Play’

Algorithmic Play

Michael Wolfindale:

Brilliant artefact, Jon! Such insightful thoughts to frame your explorations around – ‘the idea of an algorithm as a cultural presence’ (Beer 2017) and the notion of ‘the algorithm as embodied’. Also, some interesting and, at times, amusing results! Perhaps reflecting the current polarisation of views we so often hear about (e.g. ‘Boris Johnson is a toe rag/genius’ being two of those results)!

I did a very small amount of playing with the Google search autocomplete with ‘is edtech…’, and it seemed to largely promote a profit-driven view, rather than any particular focus on education. Having looked at Noble’s (2018) ‘Algorithms of Oppression’, it brings to mind how certain views or biases can be reinforced.

With regard to Spotify, that’s a fascinating insight into how they quantify music around terms like “speechiness” and “valence”…not terms I’d normally think of! I often find my musical tastes are quite unpredictable, while Spotify tries to play “more of the same”. I wonder whether this is accounted for in the algorithm, and whether the same would be true of one that picked academic papers for us to read?

Insightful thoughts and detailed explorations presented really clearly – thank you!

Michael commented on Monica Siegenthaler’s EDC lifestream – ‘Algorithm Play’

Algorithm Play

Michael Wolfindale:

Great artefact, Monica, and love the way you’ve laid out your reflections, Q&A and conclusions in Prezi! Very timely too, particularly given current events.

It’s fascinating how you framed your artefact around the notion of media shaping us and our identities, and also, in the context of education, how algorithmic systems might lead to a ‘child led curriculum’. It’s also really interesting how the algorithm might be seen in the context of the classroom, and the human-machinic relations occurring there. It reminds me a little of this article, which talks about the role of AI in medicine, and it all made me reflect on where the power and agency lie in all of this. Really thought provoking – great work!

Michael commented on Iryna’s lifestream blog – ‘My algorithmic play artefact’

#mscedc My algorithmic play artefact: https://t.co/n2L6TTwenI

Michael Wolfindale:

Great artefact, Iryna! Really like the way you have made use of ThingLink with the audio commentary, and the map to plot the location-specific results.

Fascinating comment on the ‘goggles’ shaping our behaviour and results, and the slightly unnerving quote from Jonathan Rosenberg about how shaping ‘customer’ behaviour is perhaps entirely intentional (even though Google search arguably presents itself as objective or neutral). It brings to mind my own explorations of Coursera where, through the ‘algorithmic play’ activity (and a glance at the privacy policy), my perception of the site was completely changed. Yet how many algorithmic systems do I continue to use unthinkingly on a daily basis?

Really nice artefact, and so clearly presented and thought provoking!

Michael commented on JB’s EDC lifestream – ‘block 3 artefact – my Eurafrican Youtube algorithmic play’

block 3 artefact – my Eurafrican Youtube algorithmic play

Michael Wolfindale:

Fantastic and insightful artefact, JB, and really like the way you have published it as a screencast presentation alongside your commentary, allowing us to see your conclusions together with visuals of your explorations.

As you say, examples of racism and discrimination being generated/reproduced through YouTube and other algorithmic systems are not very difficult to find, but the (perhaps more subtle) assumptions being made about aspects such as identity, shared accounts and so on, which you point out, are such an important issue. During my own explorations of Coursera and other “apps” during this block, I’ve often felt that the way they have been designed creates so many exclusions through these kinds of assumptions, and that a democratic design process including a diverse range of voices is so important yet sadly rare.

I’ve been following the notion of platform cooperativism, which may be one possible approach to addressing this.

Thank you for bringing up such important points, and really enjoyed your artefact!

Michael commented on Teaching@DigitalCultures (David Yeats) – ‘Algorithmic play artefact : teaching@digital podcast’

Algorithmic play artefact : teaching@digital podcast:

Michael Wolfindale:

Fantastic artefact, David!

Really enjoyed both the podcast (some great sounds there!) and the text/screenshot commentary – really insightful. I’d played around with SoundCloud very briefly at the beginning of the block, and I’m glad that you’ve provided such detail framed around Jeremy’s questions. It was also great to be able to engage with your findings and discussion through different mediums!

It’s really interesting how you comment on the potential privileging of a ‘commercial ethic’ in the algorithmic systems of SoundCloud and others, perhaps shaped by other people, and how SoundCloud might lose out commercially to rivals such as Spotify due to its lack of immediacy. You also mention how this might all influence algorithmic cultures in education, and I picked up these familiar commercial/competitive models in Coursera too.

> ‘One thing that has stood out for me in the efforts of platforms to personalise our experience by means of recommender algorithms is how this cultural turn influences what is expected of the services of educational institutions’

Absolutely – this is something I felt while exploring the Coursera recommendation algorithms, tweaking my profile, and also glancing through the privacy policy. The language there was revealing – ‘Content Providers’ (presumably academic institutions?) make ‘Content Offerings’ to ‘learners’. Presumably, activity on the site from myself and other ‘learners’ will have some influence on what is on offer in future, or what is considered commercially viable.

> ‘This may also mark the impact of algorithmic cultures on education. The expectation of immediate adaptation, flexibility and personalisation.’

This is a really interesting point, and again one which I was reflecting on looking at Coursera. Thinking about how the expectations and assumptions of myself and others might have been shaped by Coursera’s algorithmic systems – and how this in turn may have been influenced by commercial algorithmic systems outside of education – does make me feel a little uneasy!

Great work – really enjoyed it!