Disentanglement from my lifestream: wrapping up algorithmic cultures and EDC 2020

‘Entanglement’ (ellen x silverberg, Flickr)

As I disentangle myself from my lifestream feeds, and reflect on the course, I consider how I have perceived and been influenced by the algorithmic systems involved.

Google and Twitter were consistent influences, the latter through new and existing connections and via #mscedc, #AlgorithmsForHer and #ds106, and I saved/favourited (often highly ranked) resources to Pocket, YouTube and SoundCloud (and other feeds).

While I had some awareness of these algorithms, alterations to my perception of the ‘notion of an algorithm’ (Beer 2017: 7) have shaped my behaviour. Believing I “understand” how Google “works”, reading about the Twitter algorithm and reflecting on ranking/ordering have altered my perceptions, and reading about ‘learning as “nudging”’ (Knox et al. 2020: 38) made me think twice before accepting the limiting recommendations presented to me.

Referring to the readings, these algorithmic operations are interwoven with, and cannot be separated from, the social context, in terms of commercial interests involved in their design and production, how they are ‘lived with’ and the way this recursively informs their design (Beer 2017: 4). Furthermore, our identities shape media but media also shapes our identities (Pariser 2011). Since ‘there are people behind big data’ (Williamson 2017: x-xi), I am keen to ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25), uncover ideologies, commercial and political agendas (Williamson 2017: 3) and understand the ‘algorithmic life’ (Amoore and Piotukh 2015) and ‘algorithmic culture’ (Striphas 2015) involved.

During my ‘algorithmic play’ with Coursera, its “transformational” “learning experiences” and self-directed yet predefined ‘learning plans’ perhaps exemplify Biesta’s (2005) ‘learnification’. Since ‘algorithms are inevitably modelled on visions of the social world’ (Beer 2017: 4), suggesting that education needs “transforming” and that (implied through Coursera’s dominance of “tech courses”) ‘the solution is in the hands of software developers’ (Williamson 2017: 3) exposes a ‘technological solutionism’ (Morozov 2013) and Californian ideology (Barbrook and Cameron 1995) common to many algorithms entangled in my lifestream. Moreover, these data-intensive practices and interventions, tending towards ‘machine behaviourism’ (Knox et al. 2020), could profoundly shape notions of learning and teaching.

As I consider questions of power with regard to algorithmic systems (Beer 2017: 11) and the possibilities for resistance, educational institutions accept commercial “EdTech solutions” designed to “rescue” them during the coronavirus crisis. This accelerated ‘datafication’ of education, seen in the context of wider neoliberal agendas, highlights a growing urgency to critically examine changes to pedagogy, assessment and curriculum (Williamson 2017: 6).

However, issues of authorship, responsibility and agency are complex, for algorithmic systems are works of ‘collective authorship’, ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 8-10). As ‘processes of “datafication” continue to expand and…data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4), I return to the concept of ‘feedback loops’, questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2). If human-machinic boundaries are blurred and autonomous will is problematic (ibid.: 288), we might consider algorithmic systems/actions in terms of ‘human-machinic cognitive relations’ (Amoore 2019: 7) or ‘cognitive assemblages’ (Hayles 2017), entangled intra-relations seen in the context of sociomaterial assemblages and performative in nature (Barad 2007; Introna 2016; Butler 1990) – an ‘entanglement of agencies’ (Knox 2015).

I close with an audio/visual snippet and a soundtrack to my EDC journey…


My EDC soundtrack:

My EDC soundtrack cover image


View references

Week nine and algorithmic systems: unpacking the socio-technical assemblage

Algorithmic systems as ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 10)? (Photo by Federico Beccari on Unsplash.)
Having just published my artefact Algorithmic systems: entanglements of human-machinic relations, I have started commenting on others’ artefacts, finding insightful (but sobering) explorations of the algorithmic systems of Facebook, Netflix and others. Facebook’s foray into ‘personalized learning platforms’ and the resistance against it are very relevant as I reflect on the issues of surveillance and automation in education (Williamson 2019).

While focusing on Coursera for my artefact, I observed inclusions/exclusions, tendencies to privilege “tech” courses from Western institutions, and experienced the complexities of researching algorithmic systems (Kitchin 2017). Does Silicon Valley-based Coursera – with its vision of life transformation for all – subscribe to Silicon Valley’s notion of ‘progress’ while blind to its exclusions? Is it another example of an attempt to ‘revolutionise’ education, as Williamson (2017) details, framed by neoliberal and commercial agendas yet presented as an objective and universal ‘learning experience’?

I reflect on the ‘black box’ metaphor, yet ‘algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them’ (Seaver 2013: 10) and we ‘need to examine the work that is done by those modelling and coding [them]’ (Beer 2017: 10). Thus, rather than stripping individual algorithms of their wider social and political context, we should ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25) and examine the complex ‘human-machinic cognitive relations’ (Amoore 2019: 7), ‘entanglement of agencies’ (Knox 2015) and the implications for education in an era of ‘datafication’ (Knox et al. 2020).


View references

Week eight: critically playing with algorithms

This week, while commenting on micro-ethnographies, I began the ‘algorithmic play’ activity, adding new lifestream feeds including Vimeo and Deezer and, inspired by ‘Show and Tell: Algorithmic Culture’ (Sandvig 2014) and Noble’s (2018) Algorithms of Oppression, played with Google search algorithms including their autocomplete…

‘Is edtech…’ Google autocomplete
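
For anyone wanting to record suggestions like these more systematically, something like the short Python sketch below could be used. It queries Google’s unofficial suggest endpoint, which is an assumption on my part: the endpoint is undocumented and may change, and it is not how the ‘play’ above was carried out (that was simply typing into the search box).

```python
import json
import urllib.parse
import urllib.request

def google_suggestions(seed, lang="en"):
    """Fetch autocomplete suggestions for a seed phrase.

    Relies on the unofficial suggest endpoint (an assumption: it is
    undocumented, may change and may be rate-limited).
    """
    url = (
        "https://suggestqueries.google.com/complete/search"
        "?client=firefox&hl=" + lang + "&q=" + urllib.parse.quote(seed)
    )
    with urllib.request.urlopen(url, timeout=10) as response:
        # The endpoint returns a JSON array: [seed, [suggestion, ...], ...]
        payload = json.loads(response.read())
    return payload[1]

if __name__ == "__main__":
    for suggestion in google_suggestions("is edtech"):
        print(suggestion)
```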

I also discovered (some) of what Google “knows” about me, collected ‘algorithmic play’ notes/screenshots and recorded algorithm ‘recommendations’ from…

I reflect on questions from Amoore (2019: 7)…

‘Do algorithms compute beyond the threshold of human perceptibility and consciousness? Can ‘cognizing’ and ‘learning’ digital devices reflect or engage the durational experience of time? Do digital forms of cognition radically transform workings of the human brain and what humans can perceive or decide? How do algorithms act upon other algorithms, and how might we understand their recursive learning from each other? What kind of sociality or associative life emerges from the human-machinic cognitive relations that we see with association rules and analytics?’

…and, as I explore these ‘human-machinic cognitive relations’, look beyond the polished “app” user interfaces and reflect on how algorithms (despite how they are presented) are far from objective or neutral (Kitchin 2017: 18). I turn my attention to investigating discrimination and bias (Noble 2018 and #AlgorithmsForHer)…

I also investigate the notion of ‘data colonialism’ (Knox 2016: 14), rethink the relation between algorithms and power (Beer 2017) and look to the future of what this might all mean in an educational context (Knox et al. 2020; Williamson 2017).


View references

Week seven: Researching communities…interactions between entities or entangled intra-relations?

As we conclude our block on community cultures, and I post my micro-ethnography artefact Entangled Communities, many questions/issues have been raised.

Inspired by David Yeats’ artefact grappling with a community apparently “present” but “hidden”, I pondered how/whether this might be tracked, and the issues of surveillance that link to our next block on algorithmic cultures. His artefact also asks ‘what is community?’, and I wondered how we might define it…

  • a ‘creative “gathering”’ (Bayne 2015b: 456) around a ‘shared domain of interest’ (Wenger 1998; Lave and Wenger 1991)?
  • a feeling ‘produced by more-than-human assemblages’ (Hickey-Moody and Willcox 2019: 2)?

While researching, should we focus on a network of ‘connections between entities’ (Siemens 2005) or on agential relations and ‘intra-actions’ where agency is co-constituted (Barad 2007; Hickey-Moody and Willcox 2019: 4-5)?

As I constructed/traversed a network of connections (Downes 2017) in the connectivist-informed ds106, “I” and “my study” (including my field notes) became “entangled” in the course/community I was studying, and my artefact itself appeared increasingly like a tangled network map of connections. I noted the course/community boundaries blurring and the traditional MOOC form being questioned.

Entangled Communities

Questioning my research methods, I explored various approaches including the speculative method (Ross 2017)…

…rather than an “observer” collecting data about something “out there”, are researchers entangled with the “object” of research where data generated/collected ‘is co-created by the fieldwork assemblage’ (Hickey-Moody and Willcox 2019: 5)?

Finally, as I listened to ds106 radio… is sound a ‘vibrational event’, and listening an embodied experience (Ceraso 2018)?

On that note, I’m experimenting with a short audio snippet to conclude:


View references

Week six: ‘Community’ as networks and entanglements

Entangled communities? (Photo by Noor Sethi on Unsplash.)

As I become entangled in the ds106 community, while building my micro-ethnographic artefact, I reflect upon how this vast, complex community consists of numerous overlapping/entangled networks or “micro-communities”.

#ds106 Twitter hashtag word clouds (SocioViz)

Hashtag word cloud (04-02-2020 to 10-02-2020)
Hashtag word cloud (07-02-2020 to 13-02-2020)

“Micro-communities” seem grouped around ‘central consumption’ activities (Kozinets 2010: 31), like assignments/challenges, occurring in different online spaces (Twitter, blogs, ds106radio etc.) and co-existing in physical on-campus spaces. Might this exemplify the blurred boundaries between ‘virtual’ and ‘real’ (Hickey-Moody and Willcox 2019)?
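
Although the clouds above came straight out of SocioViz, a rough equivalent could be reproduced from any export of #ds106 tweets. Below is a minimal sketch, assuming a hypothetical tweets.txt file of tweet text and the third-party wordcloud package; it is an illustration rather than part of my actual workflow.

```python
import re
from collections import Counter

from wordcloud import WordCloud  # pip install wordcloud

# Hypothetical export of #ds106 tweet text, one tweet per line.
with open("tweets.txt", encoding="utf-8") as f:
    text = f.read()

# Count hashtags, dropping #ds106 itself since it would dominate the cloud.
hashtags = Counter(
    tag.lower() for tag in re.findall(r"#(\w+)", text) if tag.lower() != "ds106"
)

# Render the counts as an image, roughly echoing the SocioViz clouds above.
cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate_from_frequencies(hashtags)
cloud.to_file("ds106_hashtag_cloud.png")
```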

You might also view ds106 as a community of practice (Lave and Wenger 1991; Wenger 1998), whereby people with a shared domain of interest participate in and construct an identity around the community.

My involvement as a lurker/listener or ‘newbie’ (Kozinets 2010) has largely involved posting ‘within’ the ds106 flow without comments from others, and I have felt the distinction between my ‘open participant’ status and that of the ‘core’ university students (and perhaps felt a little secluded?). However, I have commented on others’ blog posts and, as my confidence grows, started to branch out to Twitter, and connect with related communities/hashtags.

Considering ethical issues, I have taken care to be clear I am carrying out a small study and to anonymise quotes (Fournier et al. 2014: 3). As danah boyd (2014: 57) says, ‘there’s a big difference between being in public and being public’.

Finally, as I become entangled in ds106, I reflect on Hickey-Moody and Willcox (2019) who, drawing on Barad (2007), acknowledge their entanglement with what they are researching, and argue more-than-human assemblages produce feelings of ‘community’ and ‘belonging’.


View references

Focusing my micro-ethnography on ds106 (‘community’) radio during week five

Reflecting on Karen Barad’s (2003; 2007) agential realism and onto-epistemology, where the “thing” is entangled with the way in which “we” research it, I have found myself questioning how I might research my micro-ethnography and how/whether I should participate (as a ‘lurker’ or otherwise). How might different kinds of participation affect ‘community’ and the ethical issues surrounding the study?

In my role as ‘open participant’, having ‘access’ to read/listen/participate in, and feed into, the same activities/assignments as those studying the course through a degree, the binaries between ‘open’/’closed’, ‘insider’/’outsider’, ‘included’/’excluded’ appear blurred and problematic. Is access alone enough to be ‘included’?

Listening to Tim Ingold’s assertion that ‘we don’t make studies of people, we study with them and learn from them’, this week I submitted a radio bumper into the ‘ds106 flow’ alongside the work of students/open participants, with the potential of receiving “airtime” on ds106radio. Is this an example of the kind of entanglement Barad refers to?

Inspired by an article on live field notes, I wrote some field notes of my own, and began focusing my micro-ethnography on ds106radio and the interactions surrounding it…

What makes ‘community’ endure in a connectivist-informed course such as ds106, often beyond the end date (“#4life”)?

How might we define/understand/document ‘community’? What role might ds106radio, and sound in general, play?

As I continue my micro-ethnography, and refer to relevant literature and examples, I uncover new questions, as suggested by danah boyd (2008: 29), and consider the communities and relations in these distributed educational spaces.

Week four and community cultures: exploring the ‘open’ and ‘closed’ (false) binary

A ‘creative “gathering”’? (Dualisms visual artefact)

Moving into our community cultures block, and preparing my micro-ethnography, how might we take a critical view on the relations between technologies and people? Could we imagine a ‘creative “gathering”‘? Might we envisage relations between technology and culture as ‘co-determining, co-constructive forces…a complex dance, an interweaving and intertwining’ (Kozinets 2010: 22)? Would an agential realist perspective (Barad 2003: 828) – where ‘there is no…exterior observational point’ and ‘we are part of the world in its ongoing intra-activity’ – encourage us to think differently about notions of ‘community’ and how we might explore it?

‘Open’/’closed’ binary

Building on the ‘inside’ and ‘outside’ binary touched upon in the Dualisms visual artefact, I am questioning the (false) binary between ‘open’ and ‘closed’ (Collier and Ross 2017: 8; Ross et al. 2019: 28). This is particularly pertinent, as I am looking to focus my micro-ethnography on the ‘open course on digital storytelling’ ds106, joining as an ‘open participant’. The open course originates from (and follows) a Spring 2020 university course at the University of Mary Washington. Each student has a blog and weekly assignments, both public, and there are also ‘Daily Create’ challenges and ds106radio; ‘open’ participants can engage in many aspects.

As I begin my micro-ethnography, I reflect on several suggestions from boyd (2009: 29), namely to read other ethnographies, and then to…

‘…begin by focusing on a culture. What defines that culture? Its practices? Its identity? Who are the relevant social groups? What are the relevant social dynamics? What boundaries are applicable?’ (boyd 2009: 29)


View references

Our third and final week on cyberculture

As we end our first block on cyberculture, it continues to strike me how many ideas about technology and education appear rooted in dualisms which tend to centre (a certain kind of) ‘human’, whilst othering the ‘digital’ (Knox 2015).

Binaries/dualisms (from week 2)

What kind of ‘human’, however, influences the design of ‘artificial intelligence’, and what assumptions may be baked into the algorithms that influence the choice of content we include in our lifestreams? Does this reproduce existing biases or privilege a certain view of ‘human’ ‘intelligence’? What might be the implications for education and learning analytics?

If ‘machines’ can ‘learn’, does the responsibility still lie with the programmer? If ‘distributed cognition replaces autonomous will’ (Hayles 1999: 288), should we instead think in terms of ‘cognitive assemblages’ and ‘nonconscious cognition’? Reflecting on this, I found an example of distributed cognition through slippingglimpse (Hayles 2008).

I continued this week to consider how technology is often visualised as a ‘tool’ or ‘enhancement’ (‘Ping Body’, Stelarc). Moving beyond technology ‘enhanced’ learning (Bayne 2015a), and towards a critical posthumanist view, can we imagine a view of education where the human subject is not separate nor central but the human and non-human are entangled in a ‘creative “gathering”’ (Bayne 2015b)? How might we visualise this?

A “creative ‘gathering’”? (Dualisms visual artefact)

Finally, as use of the ‘cyber’ prefix has declined (Knox 2015), how might we think about the ‘digital’? What might a ‘postdigital‘ perspective mean for education (Knox 2019)? I continue to explore…

EDC week 3

View references

The end of our second week on cyberculture

Our second week continued with questions raised through films (including A New Hope and Cyborg) and books (Machines Like Me and Iain M. Banks’ series). Themes that particularly struck me include:

1) Assuming that ‘human’ is neither an objective nor an inclusive term (Braidotti 2013: 26), how might this affect how we think about ‘artificial intelligence’, power and agency?

2) If we take a ‘dynamic partnership between humans and intelligent machines’ (Hayles 1999: 288) as a point of departure, how might we consider concepts such as consciousness, (distributed) cognition and agency?

3) Can machines make ‘moral’ decisions?

4) Building on a discussion about gender and ‘virtual’ identities, are we ‘performing’ or is it ‘performative’? Should there be a distinction between ‘real’/’virtual’ here, and how do we define ‘real’? (The Matrix comes to mind here…) How might this play out in our identities on Twitter, lifestream-blogs etc.?

5) Thinking beyond assumptions that the ‘human’ is at the centre of education, and technology is a ‘tool‘ or ‘enhancement‘, what are the implications of a complex entanglement of education and technology (Bayne 2015: 18) for this course?

Complex entanglement (‘Entanglement’, ellen x silverberg, Flickr)

Many discussions were via Twitter, drawing in questions from the public:

I have also been commenting on others’ lifestream-blogs, bringing them in as feeds.

Following on from last week’s map, I have opened new and revisited old avenues:

EDC week 2

I have also experimented with visualisations of my feed ahead of our visual artefact task…

InfraNodus: Text network visualisation and discourse analysis (or ‘postsingularity thinking tool’)
Binaries/dualisms
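
InfraNodus builds its visualisations from word co-occurrence networks, and a simplified version of that idea can be sketched with networkx. The sketch below is only an illustration under my own assumptions (a made-up sample of post text, and crude whole-post co-occurrence rather than InfraNodus’s actual algorithm):

```python
import itertools
import re

import matplotlib.pyplot as plt
import networkx as nx

# Hypothetical sample of lifestream text; in practice this would be the feed content.
posts = [
    "posthumanism questions the boundaries of the autonomous human subject",
    "algorithms shape culture and culture shapes algorithms",
    "community emerges from entangled human and non-human relations",
]

graph = nx.Graph()
for post in posts:
    words = {w for w in re.findall(r"[a-z]+", post.lower()) if len(w) > 3}
    # Link words that co-occur in the same post; edge weight counts co-occurrences.
    for a, b in itertools.combinations(sorted(words), 2):
        weight = graph.get_edge_data(a, b, default={}).get("weight", 0) + 1
        graph.add_edge(a, b, weight=weight)

# Draw and save the resulting text network.
nx.draw_networkx(graph, node_size=300, font_size=8)
plt.axis("off")
plt.savefig("feed_network.png", dpi=150)
```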

View references

The end of our first week on cyberculture

Today marks the end of an exciting first week!

Before we started, I set up Twitter, YouTube, SoundCloud and Pocket feeds and shared resources on artificial intelligence embracing social science, machines and cognition, posthumanism and a track and article demonstrating music and algorithms. This mix was intended to test out different feeds and save/share content to revisit later.
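
For context, the lifestream works by pulling each service’s feed into the blog. The sketch below is a rough, hypothetical illustration of that aggregation step, assuming plain RSS/Atom feed URLs (placeholders, not the real addresses) and the third-party feedparser package; it is not the plugin the blog actually uses.

```python
import time

import feedparser  # pip install feedparser

# Placeholder URLs standing in for the real Twitter/YouTube/SoundCloud/Pocket feeds.
FEEDS = {
    "YouTube": "https://example.com/youtube.rss",
    "SoundCloud": "https://example.com/soundcloud.rss",
    "Pocket": "https://example.com/pocket.rss",
}

lifestream = []
for source, url in FEEDS.items():
    for entry in feedparser.parse(url).entries:
        lifestream.append({
            "source": source,
            "title": entry.get("title", ""),
            "link": entry.get("link", ""),
            # published_parsed is a time.struct_time, or None if the feed omits it.
            "published": entry.get("published_parsed") or time.gmtime(0),
        })

# Newest items first, ready to drop into the lifestream blog.
lifestream.sort(key=lambda item: item["published"], reverse=True)
for item in lifestream[:10]:
    print("[{}] {} {}".format(item["source"], item["title"], item["link"]))
```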

I began the first day reflecting on a short clip from Blade Runner, referencing the ‘more human than human’ quote mentioned in the Miller (2011) reading. As I worked through the readings and films, contemplating the figure of the ‘cyborg’ through Haraway, I reconsidered my assumptions about the boundaries between ‘human’ and ‘machine’. This theme kept cropping up, while looking at the Voight-Kampff test in Blade Runner and during a Twitter exchange about ‘testing’ for a ‘human’.

This ‘human’/‘machine’ boundary was just one assumption I found myself deconstructing, encouraged by Sterne (2006: 24) to question, examine and reclassify categories and boundaries and avoid importing existing biases. Thinking about ‘feedback loops’ and questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2) brought me to this video, and inspired my header image (the Mandelbrot set), a visualisation created through feedback and iteration. I went on to explore posthumanism through videos and readings from Braidotti and Hayles, reconsidering my ideas about autonomous will and the neutrality of the term ‘human’.

My journey this week had tangents (including the #AlgorithmsForHer conference) which I hope to revisit. I’ve sketched out my journey below…

EDC week 1

View references