Our third and final week on cyberculture

As we end our first block on cyberculture, it continues to strike me how many ideas about technology and education appear rooted in dualisms which tend to centre (a certain kind of) ‘human’, whilst othering the ‘digital’ (Knox 2015).

Binaries/dualisms (from week 2)

What kind of ‘human’, however, influences the design of ‘artificial intelligence’, and what assumptions may be baked into the algorithms that influence the choice of content we include in our lifestreams? Does this reproduce existing biases or privilege a certain view of ‘human’ ‘intelligence’? What might be the implications for education and learning analytics?

If ‘machines’ can ‘learn’, does the responsibility still lie with the programmer? If ‘distributed cognition replaces autonomous will’ (Hayles 1999: 288), should we instead think in terms of ‘cognitive assemblages’ and ‘nonconscious cognition’? Reflecting on this, I found an example of distributed cognition through slippingglimpse (Hayles 2008).

This week I continued to consider how technology is often visualised as a ‘tool’ or ‘enhancement’ (‘Ping Body’, Stelarc). Moving beyond technology-‘enhanced’ learning (Bayne 2015a), and towards a critical posthumanist view, can we imagine a view of education where the human subject is neither separate nor central, but human and non-human are entangled in a ‘creative “gathering”’ (Bayne 2015b)? How might we visualise this?

A “creative ‘gathering’”? (Dualisms visual artefact)

Finally, as use of the ‘cyber’ prefix has declined (Knox 2015), how might we think about the ‘digital’? What might a ‘postdigital’ perspective mean for education (Knox 2019)? I continue to explore…

EDC week 3

View references

The end of our second week on cyberculture

Our second week continued with questions raised through films (including A New Hope and Cyborg) and books (Machines Like Me and Iain M. Banks’ series). Themes that particularly struck me include:

1) Assuming that ‘human’ is neither an objective nor an inclusive term (Braidotti 2013: 26), how might this affect how we think about ‘artificial intelligence’, power and agency?

2) If we take a ‘dynamic partnership between humans and intelligent machines’ (Hayles 1999: 288) as a point of departure, how might we consider concepts such as consciousness, (distributed) cognition and agency?

3) Can machines make ‘moral’ decisions?

4) Building on a discussion about gender and ‘virtual’ identities, are we ‘performing’ or is it ‘performative’? Should there be a distinction between ‘real’/‘virtual’ here, and how do we define ‘real’? (The Matrix comes to mind here…) How might this play out in our identities on Twitter, lifestream-blogs etc.?

5) Thinking beyond assumptions that the ‘human’ is at the centre of education, and technology is a ‘tool’ or ‘enhancement’, what are the implications of a complex entanglement of education and technology (Bayne 2015: 18) for this course?


Complex entanglement (‘Entanglement’, ellen x silverberg, Flickr)

Many discussions were via Twitter, drawing in questions from the public:

I have also been commenting on others’ lifestream-blogs, bringing them in as feeds.

Following on from last week’s map, I have opened new and revisited old avenues:

EDC week 2

I have also experimented with visualisations of my feed ahead of our visual artefact task…

InfraNodus: Text network visualisation and discourse analysis (or ‘postsingularity thinking tool’)


Michael favourited on Flickr: Entanglement by ellen x silverberg

I found this image while reflecting on Bayne (2015) and the complex entanglement of technology and education:

‘As researchers and practitioners of digital education, we need to move away from our over-emphasis on how technology acts on education, or how education can best act on technology. Let us rather acknowledge that the two are co-constitutive of each other, entangled in cultural, material, political and economic assemblages of great complexity.’ (Bayne 2015: 18)

Film review – ‘Cyborg’

Following my first film review, on A New Hope, here is a second, shorter post on The Cyborg, inspired by a theme (fearing technology) in Matthew Taylor’s review of the same film:

The Cyborg

The Cyborg includes many aspects relevant to the themes we have been exploring; however, one theme in particular struck me on rewatching it this week after a Twitter exchange: how (and whether) we should think about agency with regard to technology, for example around issues of fear and control.

The Cyborg portrays the ‘human’ exerting power over the ‘cyborg’ (the ‘human’ choosing its name and date of birth, as if it were a ‘tool’ without agency). This brings to mind the way technology is often seen as a ‘tool’ in education, rather than technology and education being ‘co-constitutive of each other, entangled in cultural, material, political and economic assemblages of great complexity’ (Bayne 2015: 18).

How, then, might we consider agency in this complex entanglement? Hayles (1999: 288) argues that ‘in the posthuman view…conscious agency has never been “in control”…distributed cognition replaces autonomous will’ and, in this talk and book, discusses the idea of the ‘cognitive nonconscious’.

I plan to dig further into how we might consider consciousness, cognition and agency with regard to technology and education as we continue with the course.
