Block 1 – Cyber Cultures Artefact: Tinkering. So what was that all about then?

The first block of this course looks at Cyber Cultures. Our task was to create a visual artefact that looked at an aspect of this topic, so I opted to make a short film.

If you feel like leaving a comment then please do 🙂

The video that I put together to round off block 1 was an attempt to pull together many of the different concepts that we looked at during those first weeks of the course. The story that unfolds during the short film hinges on these concepts, and both the decisions made by the aliens and their consequences expand on some of those ideas.

Welcome to Earth

At the start of the film we meet our characters – two aliens (Morrison and Moore) who have travelled to Earth to kick-start a “Golden Age”. They try to do this by accelerating human use of Artificial Intelligence. This brings in our first concepts. The human who is developing a new AI (Bert Giesta) has an Instrumentalist view of the technology that he is building – it will be an aid to his role as a Learning Technologist, and he isn’t considering that it could be used for anything else. In contrast, the aliens are Essentialist in their attitude towards AI – simply by being what it is, AI will necessarily usher in a Golden Age.

A problem with both of these viewpoints is a tendency to “Black Box” technology – it does what it says on the tin and nothing else, and nobody really needs to look under the hood to know how it works. In the film humanity Black Boxes the AI developed by Bert (with a little alien help), and it gets used in every human computer system without anyone knowing how it works. In fact nobody can know how it works, because the aliens have tinkered with it and nobody has bothered to check.

I need your clothes, your motorcycle, and the plot from your movie…

The first trip into Earth’s future shows the result of the aliens’ tinkering. This is the sort of Dystopia that comes up a lot in fiction about Artificial Intelligence. The AIs decide that humanity is a problem and set out to wipe it out. This is broadly the plot of Terminator. This version of the future only happens because humanity Black Boxed their AI, and failed to consider the repercussions of this. To get back to the future that they wanted, our aliens go back in time and add some more code to the AI preventing it from wiping out humanity. While this works, it does not address the underlying problem of humanity’s attitude to the technology that they have created.

Why are Utopias always so clean? Are they run by Roombas?

The second trip to the future begins to look like the Utopia that the aliens wanted to create. However, in this timeline humans have bonded with their machines. This is the Transhumanist future. In our story, as in many dealing with Transhumanist concepts such as Cyborgs, the fusing of man and machine gives the individuals considerable “upgrades” on their original human form, but at a price. The machine parts and Artificial Intelligence systems rob them of some essential spark of humanity. While this would not necessarily happen in the real world, it makes for a good story and is covered extensively in Science Fiction. To prevent this problem our aliens again travel back in time and tinker with the code, to make the AI more human and give it a simulated emotional range. Again, they do not address the central problem of Black Boxing…

I had to get at least one reference in to Transmetropolitan…

The third version of humanity’s future is a wild, chaotic, Cyberpunk world. In fiction looking at a future like this, society is usually riven with inequality. Technology is a powerful tool, but it is largely in the possession of the wealthy and powerful, while the majority of mankind lives as an underclass. The AIs in our story may have emotion as well as intelligence, but they lack empathy. They are used for making money, hedonistic pursuits and to reinforce power structures. Cyberpunk futures tend to be places where progress has stalled. The heroes of these stories are usually those who rebel against the existing power structure, but often they discover that they are simply pawns. In our story the aliens decide that equipping the AI with an ethical structure will enable it to become a balancing factor in human development.

The final version of the future echoes the first one – humanity all but wiped out by war. The reason for this in our story is that ethics are mostly reflective of the culture that develops them. If the cultures that develop AI graft their ethical systems onto it, then they will benefit from the advances possible, but will end up leaving important ethical decisions to their machines. As ethics are a human concept, they are just as fallible as anything else human, so superpowers end up trapped in self-reinforcing feedback loops where they see themselves as morally superior and ethically justified in wiping out their enemies.

In the end the different futures are not caused by some essential nature or flaw of Artificial Intelligence. They aren’t even caused by alien tinkering. Each future is caused by humanity not considering what they are doing. AI doesn’t cause any of these things to happen by its own agency; the mistakes are all too human. Posthumanism asks questions about what it is to be human, and whether we can rely on the assumptions previously made about this. Critical Posthumanism is the lens through which much research into the use of technology is currently being carried out.

Below are some definitions for the terms highlighted above. For the styles of fiction I have included some examples of movies or novels that broadly fit into these categories. Following the definitions are some references showing where these concepts have come up in the course reading and elsewhere.

Cyber Cultures

This is one of the phases described by Knox in Critical education and digital cultures from the Springer Encyclopedia of Educational Philosophy and Theory, and it forms the first of the three blocks in the Education and Digital Cultures module of the MSc in Digital Education at Edinburgh University (sorry for this long-winded description if, like me, you’re studying on this course. I figured that since this blog is open to the public I should be thorough). Knox (2015) describes Cyber Culture in this way:

This phase is associated with an increased interest in and awareness of ideas developed in cybernetics and diffused through science fiction literature and film. Key here is the figure of the cyborg; an augmented human being that represents both a cybernetic arrangement of the blurred boundaries between the living and the machinic, as well as a disturbing Sci-Fi vision of the consequences of increasing technological development.

Instrumentalism

Instrumentalism is the position that technology is just a tool to achieve a certain end, rather than being intrinsically linked to human behaviour and development. For example, an instrumentalist may view a screwdriver as a tool for tightening or loosening screws, while the object itself also has affordances to be used for scraping cheese off a pizza or wedging a door closed. Bayne (2015) describes instrumentalism like this:

[I]nstrumentalism constructs technology as a set of neutral entities by which pre-existing goals (for example, ‘better’ learning) can be achieved.

Essentialism

Essentialism is the position, often taken by default, that simply by existing a certain piece of technology will have a specific effect. For example, simply posting a video online of you pouring iced water over yourself will have an effect on ALS research, even if nobody watches it, or if those who do don’t know that the Ice Bucket Challenge is related to raising money for that research. Bayne (2015) describes essentialism like this:

[E]ssentialism attributes to technology a set of ‘inalienable qualities’ immanent to the technological artefact

Black Boxing

Another position that people can take by default. They treat a piece of technology in terms of inputs and outputs without looking at how it turns the former into the latter. An analogy that I like is to consider the toaster in your kitchen. If you don’t know how it works, what it’s for or how to adjust it, then you could think that the purpose of a toaster is to set off your smoke alarm. You input bread and click the lever, and a little later the smoke detector goes off. If you know it’s for making toast, and that there is a way of adjusting how long it heats the bread for, then you get delicious toast and no infernal beeping noise.
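For the programmers among you, the toaster analogy even works as a toy bit of code (entirely illustrative – the toaster, its dial and the function names below are all my invention, nothing from the film). The user of a black-boxed function sees only input and output; the hidden setting inside decides what actually happens, and nobody thinks to check it.

```python
# A toy sketch of "Black Boxing": the caller only ever sees input -> output.
# The internal dial is hidden, so nobody bothers to check how it was set.

def make_toaster(dial_setting):
    """Build a toaster whose internals the user never inspects."""
    def toaster(bread):
        # Hidden behaviour: the pre-set dial decides the outcome.
        if dial_setting > 7:
            return "smoke alarm goes off"
        return "delicious toast"
    return toaster

# Someone (an alien, perhaps) set the dial before handing the box over...
black_box = make_toaster(dial_setting=9)
print(black_box("bread"))  # the user only sees "smoke alarm goes off"
```

Swap in a sensible dial setting (say, 3) and the same black box produces delicious toast – but only someone who looked under the hood would know the dial exists at all.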

Dystopia

Click this link for a definition. Yeah, I know it’s Wikipedia. Bite me. Some examples of well-known Dystopian fiction would be:

    • Terminator
    • Nineteen Eighty-Four
    • V for Vendetta
    • Brazil
    • The Hunger Games
    • Logan’s Run

Utopia

Click this link for a definition. Yeah, I know it’s Wikipedia again. Bite me again. Some examples of well-known Utopian fiction would be:

    • Star Trek – United Federation of Planets
    • Buck Rogers in the 25th Century
    • Gallifrey (Doctor Who)

Transhumanism

Transhumanism, sometimes shortened to H+ or Human+, is an area of philosophy that looks at how humanity can be enhanced through the use of technology. Transhumanism is an extension of Humanism, which considers the key features of humanity to be rationality, autonomy and dominance over ‘nature’. Transhumanism argues that, through technology, these humanist ideals can be perfected.

Cyborg

Short for Cybernetic Organism. The broader definition encompasses any combination of synthetic and organic parts in an organism. Miller (2011) has my favourite definition, and it breaks down into four different types of Cyborg:

    • Restorative (technology used to restore lost functions – e.g. prosthetic limbs, artificial heart)
    • Normalising (technology used to bring organic systems back in line with what is considered “normal” – e.g. pacemakers, spectacles)
    • Enhancing (technology used to improve on “normal” human ability – e.g. night vision goggles, improved strength, communication technology)
    • Reconfiguring (technology used to change “normal” human traits without necessarily improving upon them – e.g. tattooing, collagen injections, breast implants).

Looking at this very broad definition of a cyborg, we can see that it would be very rare indeed to encounter a human being today who is not, in some way, a cyborg. The key thing that people overlook is that technology does not need to be grafted onto a human, like the cyborgs of fiction, to qualify. It just needs to be in common use for that individual.

Cyberpunk

Click this link for a definition. Yeah, I know it’s another from Wikipedia. Continue to bite me. Some examples of well-known Cyberpunk fiction would be:

    • Blade Runner (Do Androids Dream of Electric Sheep?)
    • 12 Monkeys
    • Neuromancer
    • Johnny Mnemonic
    • Shadowrun (Role Playing Game)
    • Cowboy Bebop
    • Alien, Aliens etc.

Posthumanism

Another school of philosophical thought. Posthumanism differs from Transhumanism in that it argues against the core definitions of Humanism being the correct measure of what makes us human. Like Transhumanism, Posthumanism is usually connected to technology; the difference is one of focus rather than outright opposition. For example, a Transhumanist might argue that a Cyborg is more than human, because it comes closer to achieving humanist ideals. A Posthumanist would ask why those humanist ideals should be the measure of what it means to be human, so a cyborg wouldn’t be an “improvement”, just a variation.

Easter Eggs – I couldn’t resist putting a few Easter Eggs into this short film.

    • The aliens are named Morrison and Moore after Grant Morrison and Alan Moore, two of the biggest names in comic book writing. Morrison’s writing tends towards an optimistic ideal, with human nature and freedom winning against the odds. Moore’s writing is more pessimistic, and while humanity can win the day (for instance in V for Vendetta) it is always at a cost. Initially I was going to record their conversation as audio, with them sounding like characters from The Goon Show, but in the end I preferred the idea of having appropriate music for each section.
    • I named the Learning Technologist Bert Giesta as a slight dig at Gert Biesta. In the IDEL module we read some of his work and I thought he was a bit of a reactionary wind-bag, not the sort of person to embrace anything new, let alone develop it.
    • The bit of code to prevent the AI from turning on its creators was taken from a Tweet that I happened upon and included in my Lifestream.
    • The music for the Dystopian future was styled to be similar to the music on the Fallout computer games – old jazz played on scratched vinyl.
    • The first still from the Utopia segment is quite niche – I was explaining Transhumanism to a friend and their reply was “what a load of balls”, so I used a picture of some large spherical buildings.
    • The reference to this future looking “clean” is to do with a trope in science fiction films – if the future looks high tech but clean it’s a Utopia; if it’s high tech but grubby it’s a Dystopia.
    • The Cyberpunk future was also influenced by my love of comic books. I based a lot of this on the series Transmetropolitan by Warren Ellis. The picture of the bald man in shades is someone cosplaying the main character from that story.
    • The final picture of a destroyed fun fair is also a reference to comic books. The Joker from Batman has ethics, but they’re completely twisted to anyone else. My thought here was that ethics can themselves be “Black Boxed” as something inherently good, without considering who came up with them in the first place and why.

References

Bayne, S. (2015) What’s the matter with ‘technology-enhanced learning’?, Learning, Media and Technology, 40:1, 5-20

Haraway, Donna (2007) A cyborg manifesto from Bell, David; Kennedy, Barbara M (eds), The cybercultures reader pp.34-65, London: Routledge. 

Hayles, N. Katherine (1999) Towards embodied virtuality from Hayles, N. Katherine, How we became posthuman: virtual bodies in cybernetics, literature, and informatics pp.1-25, 293-297, Chicago, Ill.: University of Chicago Press. 

Knox, J (2015) Critical education and digital cultures. in M Peters (ed.), Encyclopedia of Educational Philosophy and Theory. Springer, pp. 1-6. 

Miller, V. (2011) Chapter 9: The Body and Information Technology, in Understanding Digital Culture. London: Sage. 

Ross, J., Bayne, S. & Lamb, J. (2019). Critical approaches to valuing digital education: learning with and from the Manifesto for Teaching Online. Digital Culture & Education, 11(1), 22-35
