Disentanglement from my lifestream: wrapping up algorithmic cultures and EDC 2020

‘Entanglement’ (ellen x silverberg, Flickr)

As I disentangle myself from my lifestream feeds, and reflect on the course, I consider how I have perceived and been influenced by the algorithmic systems involved.

Google and Twitter were consistent influences, the latter through new and existing connections and via #mscedc, #AlgorithmsForHer and #ds106, and I saved/favourited (often highly ranked) resources to Pocket, YouTube and SoundCloud (and other feeds).

While I had some awareness of these algorithms, alterations to my perception of the ‘notion of an algorithm’ (Beer 2017: 7) have shaped my behaviour. Believing I “understand” how Google “works”, reading about the Twitter algorithm and reflecting on ranking/ordering have altered my perceptions, and reading about ‘learning as “nudging”’ (Knox et al. 2020: 38) made me think twice before accepting the limiting recommendations presented to me.

As the readings make clear, these algorithmic operations are interwoven with, and cannot be separated from, their social context: the commercial interests involved in their design and production, the ways they are ‘lived with’, and the way this recursively informs their design (Beer 2017: 4). Furthermore, our identities shape media, but media also shapes our identities (Pariser 2011). Since ‘there are people behind big data’ (Williamson 2017: x-xi), I am keen to ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25), uncover ideologies and commercial and political agendas (Williamson 2017: 3), and understand the ‘algorithmic life’ (Amoore and Piotukh 2015) and ‘algorithmic culture’ (Striphas 2015) involved.

During my ‘algorithmic play’ with Coursera, its “transformational” “learning experiences” and self-directed yet predefined ‘learning plans’ perhaps exemplify Biesta’s (2005) ‘learnification’. Since ‘algorithms are inevitably modelled on visions of the social world’ (Beer 2017: 4), suggesting that education needs “transforming” and that (as implied by Coursera’s dominance of “tech courses”) ‘the solution is in the hands of software developers’ (Williamson 2017: 3) exposes a ‘technological solutionism’ (Morozov 2013) and a Californian ideology (Barbrook and Cameron 1995) common to many algorithms entangled in my lifestream. Moreover, these data-intensive practices and interventions, tending towards ‘machine behaviourism’ (Knox et al. 2020), could profoundly shape notions of learning and teaching.

As I consider questions of power with regard to algorithmic systems (Beer 2017: 11) and the possibilities for resistance, educational institutions are accepting commercial “EdTech solutions” designed to “rescue” them during the coronavirus crisis. This accelerated ‘datafication’ of education, seen in the context of wider neoliberal agendas, highlights the growing urgency of critically examining changes to pedagogy, assessment and curriculum (Williamson 2017: 6).

However, issues of authorship, responsibility and agency are complex, for algorithmic systems are works of ‘collective authorship’, ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 8-10). As ‘processes of “datafication” continue to expand and…data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4), I return to the concept of ‘feedback loops’ questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2). If human-machinic boundaries are blurred and autonomous will rendered problematic (ibid.: 288), we might consider algorithmic systems and actions in terms of ‘human-machinic cognitive relations’ (Amoore 2019: 7) or ‘cognitive assemblages’ (Hayles 2017): entangled intra-relations, seen in the context of sociomaterial assemblages and performative in nature (Barad 2007; Introna 2016; Butler 1990) – an ‘entanglement of agencies’ (Knox 2015).

I close with an audio/visual snippet and a soundtrack to my EDC journey:

My EDC soundtrack:

My EDC soundtrack cover image


View references

Resisting through alternative spaces and practices: some initial thoughts

Image by John Hain (Pixabay)

‘If politics is about struggles for power, then part of the struggle is to name digital technologies as a power relation and create alternative technology and practices to create new spaces for citizens to encounter each other to struggle for equality and justice.’ (Emejulu and McGregor 2016: 13)

Much of this block on algorithmic cultures has involved us examining the algorithmic systems and cultures at play in our lives, ‘unpack[ing] the full socio-technical assemblage’ (Kitchin 2017: 25) and uncovering ideologies, commercial and political agendas (Williamson 2017: 3).

As I explore power and (the notion of) the algorithm (Beer 2017), the complexities of agency with regard to these entangled human-machinic relations (Knox 2015; Amoore 2019; Hayles 1999), and their implications for my lifestream here and for education in general, I increasingly wonder what form resistance to this might take.

With this in mind, I have gathered some rough notes and links below into a reading list of sorts; this is something I hope to explore further in the future.

Collating a reading list…

Protest and resistance

A Guide for Resisting Edtech: the Case against Turnitin (Morris and Stommel 2017)

‘Students protest Zuckerberg-backed digital learning program and ask him: “What gives you this right?”’ (Washington Post 2018)

Why parents and students are protesting an online learning program backed by Mark Zuckerberg and Facebook (Washington Post 2018)

Brooklyn students hold walkout in protest of Facebook-designed online program (New York Post 2018)

Hope (2005) outlines several case studies in which school students have resisted surveillance.

Tanczer et al. (2016) outline various ways in which researchers might resist surveillance, such as using “The Onion Router”, or Tor (although the Tor Project website may be blocked for some).

Noiszy offers a browser extension which aims to mislead algorithmic systems by filling sites of your choosing with “noise”, or “meaningless data”.

The #RIPTwitter hashtag, often used to resist changes to Twitter’s algorithmic systems. I noticed this earlier in the course when considering how changes to algorithms might affect my social media timelines. See DeVito et al. (2017).

‘Integrated & Alone: The Use of Hashtags in Twitter Social Activism’ (Simpson 2018). Examines viral hashtags associated with social movements: #metoo, #takeaknee, #blacklivesmatter.

Alternative models (such as platform cooperativism, participatory democracy/design and inclusive codesign)

‘There are people behind big data – not just data scientists, but software developers and algorithm designers, as well as the political, scientific and economic actors who seek to develop and utilise big data systems for their diverse purposes. And big data is also about the people who constitute it, whose lives are recorded both individually and at massive population scale. Big data, in other words, is simultaneously technical and social.’ (Williamson 2017: x-xi)

What/who might we resist? Surveillance, ‘datafication’ and data-intensive practices that discriminate against the marginalised (Noble 2018; Knox et al. 2020)? ‘Learnification’, neoliberal ideologies and the marketisation of education (Biesta 2005)? “Technological solutionism” (Morozov 2011; 2013)?

Looking a little deeper into the models and processes that followers of the “Silicon Valley” model often evangelise about reveals the Agile Manifesto (Beck et al. 2001), written by a group of people (calling themselves ‘The Agile Alliance’ and described as ‘organizational anarchists’) who met at a Utah ski resort in 2001.

David Beer (2017: 4) argues that ‘algorithms are inevitably modelled on visions of the social world, and with outcomes in mind, outcomes influenced by commercial or other interests and agendas.’ Yet can we imagine a future where these agendas – rather than being based on a Silicon Valley ethos, or as Barbrook and Cameron (1995) call it, the Californian ideology (thank you JB Falisse for introducing me to this) – are rooted in cooperative or participatory principles?

‘Cooperative creativity and participatory democracy should be extended from the virtual world into all areas of life. This time, the new stage of growth must be a new civilisation.’ (Barbrook 2007, Imaginary Futures)

Alternative business models rooted in democracy, such as platform cooperativism (see Scholz and Schneider 2016).

Alternative design processes, rooted in participation and inclusivity, such as participatory design (see Beck 2002) and inclusive codesign:

‘The importance of inclusive codesign has been one of the central insights for us. Codesign is the opposite of masculine Silicon Valley “waterfall model of software design,” which means that you build a platform and then reach out to potential users. We follow a more feminine approach to building platforms where the people who are meant to populate the platform are part of building it from the very first day. We also design for outliers: disabled people and other people on the margins who don’t fit into the cookie-cutter notions of software design of Silicon Valley.’ (P2P Foundation 2017)

What might this look like in the context of education?

Can Code Schools Go Cooperative? (Gregory 2016)

Looking into critical pedagogy (see Freire 2014 [1970]; Freire 2016 [2004]; Stommel 2014), and ‘If bell hooks Made an LMS: Grades, Radical Openness, and Domain of One’s Own’ (Stommel 2017).

As we see signs that “EdTech” companies stand to gain from, or exploit, the current coronavirus crisis…

…it is ever more urgent to consider the potentially significant effects that these “EdTech solutions” may have on educational policy and pedagogies (Williamson 2017: 6). As Williamson (2020) writes:

‘Emergency edtech eventually won’t be needed to help educators and students through the pandemic. But for the edtech industry, education has always been fabricated as a site of crisis and emergency anyway. An ‘education is broken, tech can fix it’ narrative can be traced back decades. The current pandemic is being used as an experimental opportunity for edtech to demonstrate its benefits not just in an emergency, but as a normal mode of education into the future.’ (Williamson 2020)


View references

Michael saved in Pocket: ‘Noiszy’

An interesting Chrome extension which aims to resist those recording our “digital tracks”, by algorithmically browsing selected sites in a “pseudo-random” fashion and creating noise and confusion…
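
To make the idea concrete: stripped of its browser-extension packaging, “filling sites with noise” amounts to little more than fetching pages one doesn’t actually care about, at irregular intervals, so that the genuine signal in one’s collected browsing data is diluted. A minimal sketch of the idea in Python – purely illustrative and my own, not Noiszy’s implementation; the decoy site list and timings are hypothetical:

    # A toy illustration of "noise-making" browsing (not Noiszy's actual code):
    # fetch pages from a user-chosen list of decoy sites at pseudo-random
    # intervals, diluting the genuine signal in any collected browsing data.

    import random
    import time

    import requests

    # Hypothetical whitelist of decoy sites, chosen by the user.
    DECOY_SITES = [
        "https://example.com",
        "https://example.org",
        "https://example.net",
    ]

    def make_noise(visits: int = 10) -> None:
        """Visit randomly chosen decoy pages at irregular intervals."""
        for _ in range(visits):
            url = random.choice(DECOY_SITES)
            try:
                requests.get(url, timeout=10)
                print(f"visited {url}")
            except requests.RequestException:
                pass  # noise-making is best-effort; failures don't matter
            # Pseudo-random pause so the traffic doesn't look mechanical.
            time.sleep(random.uniform(5, 60))

    if __name__ == "__main__":
        make_noise()

Even this toy version hints at the design tension a tool like Noiszy must face: the noise has to look plausible enough to pollute profiling, without wasting bandwidth or visiting sites one would rather not be associated with.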

Excerpt

You are being tracked.

Whatever you do online, you leave digital tracks behind.

These digital footprints are used to market to you – and to influence your thinking and behavior.

On April 3, President Donald Trump signed a repeal of online privacy rules that would have limited the ability of ISPs to share or sell customers’ browsing history for advertising purposes.

Erasing these footprints – or not leaving them in the first place – is becoming more difficult, and less effective.

Hiding from data collection isn’t working.

Instead, we can make our collected data less actionable by leaving misleading tracks, camouflaging our true behavior.

We can resist being manipulated by making ourselves harder to analyze – both individually, and collectively.

We can take back the power of our data.

View full article