Disentanglement from my lifestream: wrapping up algorithmic cultures and EDC 2020

‘Entanglement’ (ellen x silverberg, Flickr)

As I disentangle myself from my lifestream feeds, and reflect on the course, I consider how I have perceived and been influenced by the algorithmic systems involved.

Google and Twitter were consistent influences, the latter through new and existing connections and via #mscedc, #AlgorithmsForHer and #ds106, and I saved/favourited (often highly ranked) resources to Pocket, YouTube and SoundCloud (and other feeds).

While I had some awareness of these algorithms, alterations to my perception of the ‘notion of an algorithm’ (Beer 2017: 7) have shaped my behaviour. Believing I “understand” how Google “works”, reading about the Twitter algorithm and reflecting on ranking/ordering have altered my perceptions, and reading about ‘learning as “nudging”’ (Knox et al. 2020: 38) made me think twice before accepting the limiting recommendations presented to me.

Referring to the readings, these algorithmic operations are interwoven with, and cannot be separated from, the social context, in terms of commercial interests involved in their design and production, how they are ‘lived with’ and the way this recursively informs their design (Beer 2017: 4). Furthermore, our identities shape media but media also shapes our identities (Pariser 2011). Since ‘there are people behind big data’ (Williamson 2017: x-xi), I am keen to ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25), uncover ideologies, commercial and political agendas (Williamson 2017: 3) and understand the ‘algorithmic life’ (Amoore and Piotukh 2015) and ‘algorithmic culture’ (Striphas 2015) involved.

During my ‘algorithmic play’ with Coursera, its “transformational” “learning experiences” and self-directed predefined ‘learning plans’ perhaps exemplify Biesta’s (2005) ‘learnification’. Since ‘algorithms are inevitably modelled on visions of the social world’ (Beer 2017: 4), suggesting education needs “transforming” and (implied through Coursera’s dominance of “tech courses”) ‘the solution is in the hands of software developers’ (Williamson 2017: 3) exposes a ‘technological solutionism’ (Morozov 2013) and Californian ideology (Barbrook and Cameron 1995) common to many algorithms entangled in my lifestream. Moreover, these data-intensive practices and interventions, tending towards ‘machine behaviourism’ (Knox et al. 2020), could profoundly shape notions of learning and teaching.

As I consider questions of power with regards to algorithmic systems (Beer 2017: 11) and the possibilities for resistance, educational institutions accept commercial “EdTech solutions” designed to “rescue” them during the coronavirus crisis. This accelerated ‘datafication’ of education, seen in context of wider neoliberal agendas, highlights a growing urgency to critically examine changes to pedagogy, assessment and curriculum (Williamson 2017: 6).

However, issues of authorship, responsibility and agency are complex, for algorithmic systems are works of ‘collective authorship’, ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 8-10). As ‘processes of “datafication” continue to expand and…data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4), I return to the concept of ‘feedback loops’ questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2). If human-machinic boundaries are blurred and autonomous will problematic (ibid.: 288), we might consider algorithmic systems/actions in terms of ‘human-machinic cognitive relations’ (Amoore 2019: 7) or ‘cognitive assemblages’ (Hayles 2017), entangled intra-relations seen in context of sociomaterial assemblages and performative in nature (Barad 2007; Introna 2016; Butler 1990) – an ‘entanglement of agencies’ (Knox 2015).

I close with an audio/visual snippet and a soundtrack to my EDC journey.

 

My EDC soundtrack:

My EDC soundtrack cover image


View references

Resisting through alternative spaces and practices: some initial thoughts

Image by John Hain from Pixabay
Image by John Hain (Pixabay)

‘If politics is about struggles for power, then part of the struggle is to name digital technologies as a power relation and create alternative technology and practices to create new spaces for citizens to encounter each other to struggle for equality and justice.’ (Emejulu and McGregor 2016: 13)

Much of this block on algorithmic cultures has involved us examining the algorithmic systems and cultures at play in our lives, ‘unpack[ing] the full socio-technical assemblage’ (Kitchin 2017: 25) and uncovering ideologies, commercial and political agendas (Williamson 2017: 3).

As I explore power and (the notion of) the algorithm (Beer 2017), the complexities of agency with regard to these entangled human-machinic relations (Knox 2015; Amoore 2019; Hayles 1999), and their implications for my lifestream here and for education in general, I increasingly wonder what form resistance to this might take.

With this in mind, I have gathered some rough notes and links below into the form of a reading list of sorts; this is something I hope to explore in the future.

Collating a reading list…

Protest and resistance

A Guide for Resisting Edtech: the Case against Turnitin (Morris and Stommel 2017)

‘Students protest Zuckerberg-backed digital learning program and ask him: “What gives you this right?”‘ (Washington Post 2018)

Why parents and students are protesting an online learning program backed by Mark Zuckerberg and Facebook (Washington Post 2018)

Brooklyn students hold walkout in protest of Facebook-designed online program (New York Post 2018)

Hope (2005) outlines several case studies in which school students have resisted surveillance.

Tanczer et al. (2016) outline various ways in which researchers might resist surveillance, such as using “The Onion Router” or tor (although the tor project website may be blocked for some).

Noiszy offers a browser extension which aims to mislead algorithmic systems by filling sites of your choosing with “noise”, or “meaningless data”.

The #RIPTwitter hashtag is often used to resist changes to Twitter’s algorithmic systems. I noticed it earlier in the course, when considering changes to algorithms that may have an effect on my social media timelines. See DeVito et al. (2017).

‘Integrated & Alone: The Use of Hashtags in Twitter Social Activism’ (Simpson 2018). Examines viral hashtags associated with social movements: #metoo, #takeaknee, #blacklivesmatter.

Alternative models (such as platform cooperativism, participatory democracy/design and inclusive codesign)

‘There are people behind big data – not just data scientists, but software developers and algorithm designers, as well as the political, scientific and economic actors who seek to develop and utilise big data systems for their diverse purposes. And big data is also about the people who constitute it, whose lives are recorded both individually and at massive population scale. Big data, in other words, is simultaneously technical and social.’ (Williamson 2017: x-xi)

What/who might we resist? Surveillance, ‘datafication’ and data-intensive practices that discriminate against the marginalised (Noble 2018; Knox et al. 2020)? ‘Learnification’, neoliberal ideologies and the marketisation of education (Biesta 2005)? “Technological solutionism” (Morozov 2011; 2013)?

Looking a little deeper into the models and processes that followers of the “Silicon Valley” model often evangelise about reveals the Agile Manifesto (Beck et al. 2001), written by a group of people (calling themselves ‘The Agile Alliance’ and described as ‘organizational anarchists’) who met at a Utah-based ski resort in 2001.

David Beer (2017: 4) argues that ‘algorithms are inevitably modelled on visions of the social world, and with outcomes in mind, outcomes influenced by commercial or other interests and agendas.’ Yet can we imagine a future where these agendas – rather than based on a Silicon Valley ethos, or as Barbrook and Cameron (1995) call it, Californian ideology (thank you JB Falisse for introducing me to this) – are rooted in cooperative or participatory principles?

‘Cooperative creativity and participatory democracy should be extended from the virtual world into all areas of life. This time, the new stage of growth must be a new civilisation.’ Imaginary Futures book (Barbrook 2007)

Alternative business models rooted in democracy, such as platform cooperativism (see Scholz and Schneider 2016).

Alternative design processes, rooted in participation and inclusivity, such as participatory design (see Beck 2002) and inclusive codesign:

‘The importance of inclusive codesign has been one of the central insights for us. Codesign is the opposite of masculine Silicon Valley “waterfall model of software design,” which means that you build a platform and then reach out to potential users. We follow a more feminine approach to building platforms where the people who are meant to populate the platform are part of building it from the very first day. We also design for outliers: disabled people and other people on the margins who don’t fit into the cookie-cutter notions of software design of Silicon Valley.’ (P2P Foundation 2017)

What might this look like in the context of education?

Can Code Schools Go Cooperative? (Gregory 2016)

Looking into critical pedagogy (see Freire 2014 [1970]; Freire 2016 [2004]; Stommel 2014), and If bell hooks Made an LMS: Grades, Radical Openness, and Domain of One’s Own (Stommel 2017).

As we see signs that “EdTech” companies stand to potentially gain from or exploit the current coronavirus crisis…

…it is ever more urgent to consider the potential significant effects that these “EdTech solutions” may have on educational policy and pedagogies (Williamson 2017: 6). As Williamson (2020) writes:

‘Emergency edtech eventually won’t be needed to help educators and students through the pandemic. But for the edtech industry, education has always been fabricated as a site of crisis and emergency anyway. An ‘education is broken, tech can fix it’ narrative can be traced back decades. The current pandemic is being used as an experimental opportunity for edtech to demonstrate its benefits not just in an emergency, but as a normal mode of education into the future.’ (Williamson 2020)


View references

My EDC soundtrack

‘My EDC soundtrack’ cover artwork

Here is a soundtrack to mark my journey through EDC 2020…

Liner notes

The tracks on my playlist happen to be almost exclusively without lyrics (with the odd exception), and include a number of relatively long tracks of over ten minutes. When reading and writing, I often find lyrics distracting (unless I know them well and can “zone out”), and quick changes of track and pace can throw me off.

Having become entangled with my lifestream feeds, I chose opening tracks that reflect the utopian/dystopian oppositions we discussed in the cybercultures block, and how the distinction made between human and machine is in fact blurred – in contrast to the dualisms we often came across during the course. Thus, the ‘boundaries of the autonomous subject’ and the concept of ‘autonomous will’ are problematic (Hayles 1999).

Moving on to the community cultures block, exploring the connectivist-informed ds106 inspired my next choice – Everything Connected by Jon Hopkins. Questioning the open/closed false binary, recognising that “open” is not necessarily “inclusive” (Collier and Ross 2017: 8-9), and the way I sometimes felt secluded even in “open” spaces influenced the following few tracks.

Exploring algorithmic cultures in our final block, I listened to Dan Tepfer, who experiments with algorithmic processes in his compositions, such as Fractal Tree. The idea of fractals and feedback loops continued to fascinate me as I reflected again on Hayles (1999: 2). In particular, her discussion of feedback loops (in the context of the cyborg) problematising the boundaries of the autonomous subject prompted me to reflect on the complexities of considering agency in the context of algorithmic systems. Fractals and feedback loops were the inspiration for the next few tracks, as well as for my lifestream-blog header image – the Mandelbrot set, a visualisation created through feedback and iteration.
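That feedback-and-iteration process can be sketched in a few lines of Python (a minimal illustration of the escape-time idea, not the renderer used for my header image):

```python
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Iterate z -> z**2 + c, feeding each output back in as the next input.

    A point c belongs to the Mandelbrot set if this feedback loop
    never escapes beyond |z| = 2 (checked here up to max_iter steps).
    """
    z = 0j
    for _ in range(max_iter):
        z = z * z + c      # the feedback loop: output becomes input
        if abs(z) > 2:     # escaped, so c lies outside the set
            return False
    return True            # still bounded, so c is (likely) inside

print(in_mandelbrot(0j))       # the origin never escapes
print(in_mandelbrot(1 + 1j))   # this point escapes within a few steps
```

The familiar image simply colours each pixel by how quickly its point escapes – the whole picture emerges from repeating this one feedback step.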

In collating my final summary posts, and beginning to write my assignment, the writing process was a messy one. Often I found myself improvising ideas onto the page, but getting stuck (demonstrated through a “false start” on Miles Davis’ Freddie Freeloader). Yet, inspired by the quote often attributed to Miles Davis – ‘do not fear mistakes – there are none’ – I tried to embrace dead ends as part of the process rather than as “mistakes”. In the end, I found myself needing quiet time to reflect, as demonstrated through John Cage’s 4′33″ and a number of tracks centred around themes of silence and peace. Once I had collected my thoughts, I would make use of The Eraser and return to Page One in order to move forward.

Finally, I found myself thinking about the issues of resistance (to surveillance, to datafication, to commercial and political agendas entangled with algorithmic processes in education) – and listening to The Protest by Flying Lotus – but all too suddenly found myself at the end of the lifestream, wishing farewell and disentangling myself from my feeds…

Cover artwork credits


View references

Reflecting back on EDC 2020

As I come to the end of the lifestream blog, I return to questions and aspects I considered early on in the course…

EDC week 3

‘Entanglement’ has been a key theme throughout the course – the entangled ‘boundaries of the autonomous subject’ (Hayles 1999: 2), reconsidering the dualisms and false binaries which have increasingly appeared entangled – human/machine, real/virtual, open/closed, public/private and so on…

Dualisms visual artefact

Rather than assume I could remain an impartial observer, I became entangled in my “object” of research – the ds106 communities and networks…

Miro board

Finally, my ‘algorithmic play’ artefact, rather than examining standalone and discrete “black boxes” of code, instead revealed ‘massive, networked ones with hundreds of hands reaching into them’ (Seaver 2013: 10)…

Photo by Federico Beccari on Unsplash

…multiple codebases entangled with one another, with collective authorship but tangled up with a “Silicon Valley ethos”, commercial interests, neoliberal ideologies and specific notions of “progress”…

‘Algorithmic systems: entanglements of human-machinic relations’

…a messy and complex entanglement of multiple algorithmic systems and human-machinic cognitive relations (Amoore 2019: 7), algorithmic systems with ‘a cultural presence’ (Beer 2017: 11), both modelled on and influenced by ‘visions of the social world’ (ibid: 4).

Contemplating how we might think about agency in this context encourages me to consider the ‘broader debates about the status of agency as processes of “datafication” continue to expand and as data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4).

This “feeding back” of data into people’s lives in turn brings me back to the concept of ‘feedback loops’ which question the ‘boundaries of the autonomous subject’ (Hayles 1999: 2), and loops me back to the beginning of the course, when I put up my lifestream blog header image (the Mandelbrot set), a visualisation itself created through feedback and iteration…

The Mandelbrot set (created by Wolfgang Beyer with Ultra Fractal, CC BY-SA 3.0)

View references

Michael commented on Adrienne O Mahoney’s EDC lifestream – ‘Algorithmic Play Artefact’

Algorithmic Play Artefact

Michael Wolfindale:

Great artefact, Adrienne, and really like the annotated screencast format – works brilliantly with the subject matter!

Fascinating (and perhaps sobering!) how you pinned down aspects which were influenced by activity outside of your Instagram account, such as your search history. Reminds me of a quote from Seaver (2013: 10) I came across which builds on the “black box metaphor” to argue that ‘these algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them’:

Seaver, N., 2013. Knowing Algorithms. Presented at the Media in Transitions 8, Cambridge, MA. Available from: https://static1.squarespace.com/static/55eb004ee4b0518639d59d9b/t/55ece1bfe4b030b2e8302e1e/1441587647177/seaverMiT8.pdf [Accessed 11 Mar 2020].

It’s great that you’ve discussed multiple algorithmic systems and the wider context in that sense since, as you also note, it is all too often the case that a handful of big tech companies own many of these different “apps” (such as Facebook owning Instagram). As you note, drawing on Williamson (2017), the business models, entrepreneurial cultures and commercial and political agendas are all a huge factor here. The Silicon Valley model and associated ideologies are aspects that also came to the fore in my brief “play” with Coursera. Really clear and thorough conclusions here!

Great work – really enjoyed it!

Michael saved in Pocket: ‘Neuroliberalism: Behavioural Government in the Twenty First Century’ (Whitehead et al. 2017)

Description

Many governments in the developed world can now best be described as ‘neuroliberal’: having a combination of neoliberal principles with policy initiatives derived from insights in the behavioural sciences.

Neuroliberalism presents the results of the first critical global study of the impacts of the behavioural sciences on public policy and government actions, including behavioural economics, behavioural psychology and neuroeconomics. Drawing on interviews with leading behaviour change experts, organizations and policy-makers, and discussed in alignment with a series of international case studies, this volume provides a critical analysis of the ethical, economic, political and constitutional implications of behaviourally oriented government. It explores the impacts of the behavioural sciences on everyday life through a series of themes, including: understandings of the human subject; interpretations of freedom; the changing form and function of the state; the changing role of the corporation in society; and the design of everyday environments and technologies.

The research presented in this volume reveals a diverse set of neuroliberal approaches to government that offer policy-makers and behaviour change professionals a real choice in relation to the systems of behavioural government they can implement. This book also argues that the behavioural sciences have the potential to support much more effective systems of government, but also generate new ethical concerns that policy-makers should be aware of.

View book preview

Michael saved in Pocket: ‘No such thing as society? Liberal paternalism, politics of expertise, and the Corona crisis’ (Bacevic 2020)

Excerpt

‘Despite family resemblances with its neoliberal predecessor, the Government’s strategy is supposedly informed by a slightly different ideology – liberal paternalism, known as Nudge’, which gained notoriety after being enlisted by Blair, Cameron, and Obama administrations to advise on a range of public services. As a strategy of governance, ‘nudge’ draws on behavioural economics, a broadly heterodox approach that emphasizes limits to rational choice theories in understanding social dynamics. Three of its proponents – Daniel Kahneman, Robert Shiller, and Richard Thaler – were awarded Nobel Prizes, respectively in 2002, 2013 and 2017, but ‘Nudge’s’ most famous advocate is probably Cass Sunstein, an American legal scholar who led the White House Office of Information and Regulatory Affairs between 2009 and 2012.’ (Bacevic 2020)

View full article

Michael saved in Pocket: ‘Neuroliberalism: Cognition, context, and the geographical bounding of rationality’ (Whitehead et al. 2018)

Abstract

Focusing on the rise of the behavioural sciences within the design and implementation of public policy, this paper introduces the concept of neuroliberalism and suggests that it could offer a creative context within which to interpret related governmental developments. Understanding neuroliberalism as a system of government that targets the more-than-rational aspects of human behaviour, this paper considers the particular contribution that geographical theories of context and spatial representation can make to a critical analysis of this evolving governmental project.

View full article

Michael commented on Susanne MacLeod’s EDC lifestream – ‘Algorithmic play’

Algorithmic play

Michael Wolfindale:

Great artefact, Susanne, and really like the scrolling story/presentation style – very appropriate to the endless scrolling we often do on social media!

Interesting to see the kids’ videos appearing, and that you’ve tracked it down to what was perhaps an algorithm change. Also, you reflect on the apparently random nature of some of the results. This all appears to speak to the ‘emergent and constantly unfolding’ (and sometimes random) nature of algorithms that Kitchin (2017: 21) discusses.

As you point out, the non-transparent “black boxed” algorithms – obscured by technical aspects inaccessible to many, and further complicated by the messy network of different connections, inputs and reactions, not to mention the surrounding social aspects, commercial agendas, ideologies and so on – make it very difficult to research algorithmic systems (Kitchin 2017: 21).

Your artefact really highlighted this, and has been a thought provoking and reflective piece of work – thank you!