Disentanglement from my lifestream: wrapping up algorithmic cultures and EDC 2020

‘Entanglement’ (ellen x silverberg, Flickr)

As I disentangle myself from my lifestream feeds, and reflect on the course, I consider how I have perceived and been influenced by the algorithmic systems involved.

Google and Twitter were consistent influences, the latter through new and existing connections and via #mscedc, #AlgorithmsForHer and #ds106, and I saved/favourited (often highly ranked) resources to Pocket, YouTube and SoundCloud (and other feeds).

While I had some awareness of these algorithms, alterations to my perception of the ‘notion of an algorithm’ (Beer 2017: 7) have shaped my behaviour. Believing I “understand” how Google “works”, reading about the Twitter algorithm and reflecting on ranking/ordering have altered my perceptions, and reading about ‘learning as “nudging”’ (Knox et al. 2020: 38) made me think twice before accepting the limiting recommendations presented to me.

Referring to the readings, these algorithmic operations are interwoven with, and cannot be separated from, the social context, in terms of commercial interests involved in their design and production, how they are ‘lived with’ and the way this recursively informs their design (Beer 2017: 4). Furthermore, our identities shape media but media also shapes our identities (Pariser 2011). Since ‘there are people behind big data’ (Williamson 2017: x-xi), I am keen to ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25), uncover ideologies, commercial and political agendas (Williamson 2017: 3) and understand the ‘algorithmic life’ (Amoore and Piotukh 2015) and ‘algorithmic culture’ (Striphas 2015) involved.

During my ‘algorithmic play’ with Coursera, its “transformational” “learning experiences” and self-directed predefined ‘learning plans’ perhaps exemplify Biesta’s (2005) ‘learnification’. Since ‘algorithms are inevitably modelled on visions of the social world’ (Beer 2017: 4), suggesting education needs “transforming” and (implied through Coursera’s dominance of “tech courses”) ‘the solution is in the hands of software developers’ (Williamson 2017: 3) exposes a ‘technological solutionism’ (Morozov 2013) and Californian ideology (Barbrook and Cameron 1995) common to many algorithms entangled in my lifestream. Moreover, these data-intensive practices and interventions, tending towards ‘machine behaviourism’ (Knox et al. 2020), could profoundly shape notions of learning and teaching.

As I consider questions of power with regards to algorithmic systems (Beer 2017: 11) and the possibilities for resistance, educational institutions accept commercial “EdTech solutions” designed to “rescue” them during the coronavirus crisis. This accelerated ‘datafication’ of education, seen in context of wider neoliberal agendas, highlights a growing urgency to critically examine changes to pedagogy, assessment and curriculum (Williamson 2017: 6).

However, issues of authorship, responsibility and agency are complex, for algorithmic systems are works of ‘collective authorship’, ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 8-10). As ‘processes of “datafication” continue to expand and…data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4), I return to the concept of ‘feedback loops’ questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2). If human-machinic boundaries are blurred and autonomous will is problematic (ibid.: 288), we might consider algorithmic systems/actions in terms of ‘human-machinic cognitive relations’ (Amoore 2019: 7) or ‘cognitive assemblages’ (Hayles 2017), entangled intra-relations seen in the context of sociomaterial assemblages and performative in nature (Barad 2007; Introna 2016; Butler 1990) – an ‘entanglement of agencies’ (Knox 2015).

I close with an audio/visual snippet and a soundtrack to my EDC journey.


My EDC soundtrack:


View references

Reflecting back on EDC 2020

As I come to the end of the lifestream blog, I return to questions and aspects I considered early on in the course…

EDC week 3

‘Entanglement’ has been a key theme throughout the course – the entangled ‘boundaries of the autonomous subject’ (Hayles 1999: 2), reconsidering the dualisms and false binaries which have increasingly appeared entangled – human/machine, real/virtual, open/closed, public/private and so on…

Dualisms visual artefact

Rather than assume I could remain an impartial observer, I became entangled in my “object” of research – the ds106 communities and networks.

Miro board

Finally, my ‘algorithmic play’ artefact, rather than examining standalone and discrete “black boxes” of code, instead revealed ‘massive, networked ones with hundreds of hands reaching into them’ (Seaver 2013: 10)…

Photo by Federico Beccari on Unsplash

…multiple codebases entangled with one another, with collective authorship but tangled up with a “Silicon Valley ethos”, commercial interests, neoliberal ideologies and specific notions of “progress”…

‘Algorithmic systems: entanglements of human-machinic relations’

…a messy and complex entanglement of multiple algorithmic systems and human-machinic cognitive relations (Amoore 2019: 7), algorithmic systems with ‘a cultural presence’ (Beer 2017: 11), both modelled on and influenced by ‘visions of the social world’ (ibid.: 4).

Contemplating how we might think about agency in this context encourages me to consider the ‘broader debates about the status of agency as processes of “datafication” continue to expand and as data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4).

This “feeding back” of data into people’s lives in turn brings me back to the concept of ‘feedback loops’ which question the ‘boundaries of the autonomous subject’ (Hayles 1999: 2), and loops me back to the beginning of the course, when I put up my lifestream blog header image (the Mandelbrot set), a visualisation itself created through feedback and iteration.

The Mandelbrot set (created by Wolfgang Beyer with Ultra Fractal, CC BY-SA 3.0).
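That feedback-and-iteration process can be made concrete. The following minimal Python sketch (my own illustration, not part of the original visualisation) shows the core loop behind the Mandelbrot set: each point c is repeatedly fed back into z = z² + c, and membership depends on whether the iterated value stays bounded.

```python
def mandelbrot_iterations(c: complex, max_iter: int = 100) -> int:
    """Return the number of iterations before z escapes (|z| > 2),
    or max_iter if z stays bounded (i.e. c appears to be in the set)."""
    z = 0
    for n in range(max_iter):
        if abs(z) > 2:
            return n
        z = z * z + c  # the output feeds back in as the next input
    return max_iter

# Points inside the set never escape; points far outside escape almost at once.
print(mandelbrot_iterations(0))       # stays bounded: reaches max_iter (100)
print(mandelbrot_iterations(2 + 2j))  # escapes after a single iteration
```

The escape count per point is what fractal renderers such as Ultra Fractal colour to produce images like the one above.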

View references

The end of our first week on cyberculture

Today marks the end of an exciting first week!

Before we started, I set up Twitter, YouTube, SoundCloud and Pocket feeds and shared resources on artificial intelligence embracing social science, machines and cognition, posthumanism, and a track and an article demonstrating music and algorithms. This mix was intended to test out different feeds and save/share content to revisit later.

I began the first day reflecting on a short clip from Blade Runner, referencing the ‘more human than human’ quote mentioned in the Miller (2011) reading. As I worked through the readings and films, contemplating the figure of the ‘cyborg’ through Haraway, I reconsidered my assumptions about the boundaries between ‘human’ and ‘machine’. This theme kept cropping up, both in the Voight-Kampff test in Blade Runner and during a Twitter exchange about ‘testing’ for a ‘human’.

This ‘human’/‘machine’ boundary was just one assumption I found myself deconstructing, encouraged by Sterne (2006: 24) to question, examine and reclassify categories and boundaries and avoid importing existing biases. Thinking about ‘feedback loops’ and questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2) brought me to this video, and inspired my header image (the Mandelbrot set), a visualisation created through feedback and iteration. I went on to explore posthumanism through videos and readings from Braidotti and Hayles, reconsidering my ideas about autonomous will and the neutrality of the term ‘human’.

My journey this week had tangents (including the #AlgorithmsForHer conference) which I hope to revisit. I’ve sketched out my journey below…

EDC week 1

View references