Michael commented on Val Muscat’s EDC lifestream – Computers start composing

Computers start composing.

Michael Wolfindale:

Fascinating article, Val!

I also came across an article about algorithms being involved in the composition/improvisation of music while I was reflecting on how ‘machines’ ‘think’, how ‘humans’ ‘think’, and the blurred boundaries between the two from a posthuman standpoint.

Talking of computers being able to ‘swing’, jazz pianist and programmer Dan Tepfer uses a special ‘player piano’ (a piano with an onboard computer that can ‘play’ itself). In practice, the piano is able to ‘listen’ to what Dan plays and ‘respond’ (e.g. play additional notes) through an algorithm Dan has written.

It’s interesting how Dan speaks about the process (“I’m not writing a piece, I’m writing the way the piece works”), and how this article describes the piano as ‘his composing partner’ (rather than as a ‘tool’ he controls):

NPR – Fascinating Algorithm: Dan Tepfer’s Player Piano Is His Composing Partner
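To make the ‘listen and respond’ idea concrete, here is a minimal sketch of the kind of call-and-response rule such a system might use. The mirroring rule, function names and values below are illustrative assumptions for this post, not Tepfer’s actual code.

```python
# Hypothetical sketch: the software 'hears' each MIDI note the pianist
# plays and immediately 'answers' with a derived note. Here the rule is
# a simple mirror around middle C (an assumed rule for illustration).

def respond(note: int, pivot: int = 60) -> int:
    """Mirror an incoming MIDI note number around a pivot (middle C = 60)."""
    return pivot - (note - pivot)

def accompany(played_notes):
    """For each note 'heard', return the note the piano would 'play' back."""
    return [respond(n) for n in played_notes]

# If the pianist plays an ascending C major arpeggio...
print(accompany([60, 64, 67, 72]))  # → [60, 56, 53, 48]
```

The point of the sketch is Tepfer’s distinction quoted above: the programmer writes the *rule* (‘the way the piece works’), while the notes that actually sound depend on what the human plays in the moment.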

Michael saved in Pocket: ‘Unthought Meets The Assemblage Brain’ (N. Katherine Hayles and Tony D. Sampson)

Capacious: Journal for Emerging Affect Inquiry


What transpires in the unmediated space-time excess that moves, at once, between and alongside cognition and recognition, between and alongside formation and information, between and alongside prehension and comprehension? Following upon their most recent books, N. Katherine Hayles’ Unthought: The Power of the Cognitive Nonconscious (University of Chicago, 2017) and Tony D. Sampson’s The Assemblage Brain: Sense Making in Neuroculture (University of Minnesota, 2016), the convergences and divergences that emerge and weave throughout this conversation are quite revealing.

View full article

I continue to consider the lines that often seem drawn between education/technology, ‘human’/’machine’, conscious/nonconscious and so on…

An article I shared previously asked ‘How do machines think?’ Yet, from a critical posthumanist perspective, what does it mean ‘to think’?

I reflect on this question whilst exploring the ideas of ‘cognitive assemblages’ and ‘nonconscious cognition’ in this discussion between N. Katherine Hayles and Tony D. Sampson…

Michael saved in Pocket: ‘Cognitive Assemblages?’

Illustration: Zbyněk Baladrán


Reading N. Katherine Hayles’ Unthought (University of Chicago Press, 2017), I’m struck by her notion of ‘cognitive assemblages’ to describe human-technical interaction, which she discusses as fully imbricated. I wonder if the women and men whose careers in technology-driven work contexts we are exploring in Nordwit understand themselves as cognitive assemblages? In Hayles’ work agency is distributed, as are many other things such as responsibility – but do our research participants think of themselves in that way? The people I have interviewed in the context of Digital Humanities tend to take a rather instrumentalist view of technology, and we might want to ask, what difference does it make if you understand yourself as a ‘cognitive assemblage’ or as someone who makes use of technology – or, as academics can often feel, as a ‘victim’ of technology (the Skype in my office isn’t working, we’re unable to project images etc.)?

View full article

The end of our second week on cyberculture

Our second week continued with questions raised through films (including A New Hope and Cyborg) and books (Machines Like Me and Iain M. Banks’ series). Themes that particularly struck me include:

1) Assuming that ‘human’ is neither an objective nor an inclusive term (Braidotti 2013: 26), how might this affect how we think about ‘artificial intelligence’, power and agency?

2) If we take a ‘dynamic partnership between humans and intelligent machines’ (Hayles 1999: 288) as a point of departure, how might we consider concepts such as consciousness, (distributed) cognition and agency?

3) Can machines make ‘moral’ decisions?

4) Building on a discussion about gender and ‘virtual’ identities, are we ‘performing’ or is it ‘performative’? Should there be a distinction between ‘real’/’virtual’ here, and how do we define ‘real’? (The Matrix comes to mind here…) How might this play out in our identities on Twitter, lifestream-blogs etc.?

5) Thinking beyond assumptions that the ‘human’ is at the centre of education, and technology is a ‘tool’ or ‘enhancement’, what are the implications of a complex entanglement of education and technology (Bayne 2015: 18) for this course?


Complex entanglement (‘Entanglement’, ellen x silverberg, Flickr)

Many discussions were via Twitter, drawing in questions from the public:

I have also been commenting on others’ lifestream-blogs, bringing them in as feeds.

Following on from last week’s map, I have opened new and revisited old avenues:

EDC week 2

I have also experimented with visualisations of my feed ahead of our visual artefact task…

InfraNodus: Text network visualisation and discourse analysis (or ‘postsingularity thinking tool’)

View references

Film review – ‘Cyborg’

Following my first film review, on A New Hope, here is a second, shorter post on The Cyborg, inspired by a theme (fearing technology) from Matthew Taylor’s review of the same film:

The Cyborg

The Cyborg includes many aspects relevant to the themes we have been exploring; however, one theme in particular struck me on rewatching it this week after a Twitter exchange: how (or indeed whether) we should think about agency with regard to technology, for example around issues of fear and control.

The Cyborg portrays the ‘human’ exerting power over the ‘cyborg’ (the ‘human’ choosing its name and date of birth, as if it were a ‘tool’ without agency). This brings to mind the way technology is often seen as a ‘tool’ in education, rather than technology and education being ‘co-constitutive of each other, entangled in cultural, material, political and economic assemblages of great complexity’ (Bayne 2015: 18).

How, then, might we consider agency in this complex entanglement? Hayles (1999: 288) argues that ‘in the posthuman view…conscious agency has never been “in control”…distributed cognition replaces autonomous will’ and, in this talk and book, discusses the idea of the ‘cognitive nonconscious’.

I plan to dig further into how we might consider consciousness, cognition and agency with regard to technology and education as we continue with the course.

View references

Michael saved in Pocket: ‘Introduction: Thinking with Algorithms: Cognition and Computation in the Work of N. Katherine Hayles’ (Amoore 2019)


In our contemporary moment, when machine learning algorithms are reshaping many aspects of society, the work of N. Katherine Hayles stands as a powerful corpus for understanding what is at stake in a new regime of computation. A renowned literary theorist whose work bridges the humanities and sciences, Hayles has, among her many works, detailed ways to think about embodiment in an age of virtuality (How We Became Posthuman, 1999), how code as performative practice is located (My Mother Was a Computer, 2005), and the reciprocal relations among human bodies and technics (How We Think, 2012). This special issue follows the 2017 publication of her book Unthought: The Power of the Cognitive Nonconscious, in which Hayles traces the nonconscious cognition of biological life-forms and computational media. The articles in the special issue respond in different ways to Hayles’ oeuvre, mapping the specific contours of computational regimes and developing some of the ‘inflection points’ she advocates in the deep engagement with technical systems.

View full article

This article, from a Theory, Culture & Society special issue on ‘Thinking with Algorithms: Cognition and Computation in the Work of N. Katherine Hayles’, relates to some of the articles and videos I have recently shared on ‘machines’ and cognition (particularly around this idea of ‘nonconscious cognition’). I’m also saving it here as it will no doubt be relevant to our later block on algorithmic cultures!