Week 10 – Algorithmic Play

https://www.timetoast.com/timelines/2214815

Choosing my social media – For my algorithmic play activity, I decided to focus on my most frequently visited social media site – YouTube. I watch a variety of YouTube clips almost every evening as one of my primary sources of news, and I often follow a pathway of clips recommended to me by YouTube’s algorithms.
With that in mind, I was interested to monitor more closely how the algorithms ‘guide’ my recommended viewing and personal decision making, and whether or not I was actually in the driving seat.

I decided to start with the type of video that I watch almost every day – The View. This is a daytime American talk show, co-hosted by four women, including Whoopi Goldberg. The show is highly political in nature, and discussion amongst the women mostly centres around American politics, and in particular the Trump Administration.

Methodology – Before starting the algorithmic play, I deleted my watch history to avoid algorithmic influence from previous YouTube sessions and other videos I had watched.
When the sidebar of recommendations appeared, I opted for clips that piqued my personal interest, always choosing one from the top six recommended videos. I continued clicking through the recommendations for over 30 videos and recorded where the pathway took me. The results are shown on the Timetoast timeline attached to this blog.

Reflections – It is clear that starting the algorithmic play with a television programme such as The View resulted in a preordained pathway being laid by the algorithms. The View, as mentioned, is highly political, with three-quarters of the panel coming from liberal, left-wing backgrounds. The programme has a high degree of ‘Trump-bashing’ and, although there is one Republican panellist, Meghan McCain, unlike the majority of the Republican Party she is an ardent and outspoken critic of President Trump. The initial clip I watched focused on the Trump Administration’s lacklustre response to the Coronavirus pandemic.
It seems that the tenor and tone of this particular clip were highly influential on the subsequent pathway recommended by the algorithms. Each of the clips that followed was imbued with one or more of the following themes:

• Left-wing, liberal news organisations, e.g. Vox Media/CNN/MSNBC (20 clips)
• Anti-Trump (9 clips)
• Coronavirus Pandemic (7 clips)
• Race relations (4 clips)
• Brexit (2 clips)
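
(As an aside, once the pathway has been recorded in this way, the tallying of themes could easily be automated. The snippet below is a minimal, hypothetical sketch – the pathway entries are invented for illustration rather than taken from my actual Timetoast record – showing how recorded clips and their themes might be counted.)

```python
# Hypothetical sketch (illustrative entries only): tallying recorded
# recommendation choices by theme.
from collections import Counter

# Each entry: (video title, themes noted for that clip); one clip can carry several themes.
pathway = [
    ("The View on the pandemic response", ["Anti-Trump", "Coronavirus Pandemic"]),
    ("Vox explainer", ["Left-wing liberal news organisation", "Coronavirus Pandemic"]),
    ("CNN panel discussion", ["Left-wing liberal news organisation", "Race relations"]),
]

# Count how many clips carried each theme.
theme_counts = Counter(theme for _, themes in pathway for theme in themes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} clips")
```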

At several points the algorithms restricted my options, limiting what I could see and choose for a period of time. In particular, this happened when I selected the first of the Vox Media clips, which left me ‘stuck’ with only Vox clips to choose from for another 12 selections.
In order to change the options offered by the algorithm, I purposely selected a video clip that would create a new direction. This worked, and I was able to ‘escape’ the Vox loop and move on to content created by other organisations, although still within left-wing, liberal media.

On reflection, it seems that there was certainly a loop of information, with the algorithms directing me to clips with very similar themes and information. At no time was I directed towards media such as Fox News or other right-wing media groups.

It is clear that with this type of ‘algorithmic power’ or ‘algorithmic governance’, there is a threat of algorithms giving rise to confirmation bias in users. This is highly problematic, particularly in a society that is extremely polarised in political opinion. How can society ably solve problems if it is unable to objectively see the other side of an argument? If algorithms do not show me alternative political opinions, how will I ever be able to understand opposing perspectives? This type of algorithmic echo chamber is therefore very dangerous. This has congruence with the view put forward by Rob Kitchin, who states that “Far from being neutral in nature, algorithms construct and implement regimes of power and knowledge…. Algorithms are used to seduce, coerce… regulate and control: to guide and reshape how people… and objects interact with and pass through various systems…” (Kitchin, 2017, p. 19).

Ethical Issues – There are certainly some ethical issues to consider. For instance, there would be very little doubt, having seen my list of viewed videos, as to which end of the political spectrum I belong. Could this data be misused or manipulated? Do my political affiliations no longer hold the same degree of privacy as they did in the past, now that such data is widely available to large companies and organisations?

Another ethical consideration is how to disentangle the private from the professional. As a user of YouTube both at home and at work, it is important to ensure that the algorithms do not unnecessarily reveal private data and personal preferences in a professional setting, which means ensuring that appropriate log-ins are used in each setting.

References
Kitchin, R. (2017) Thinking Critically about Researching Algorithms. Information, Communication & Society, 20(1), pp. 14-29. DOI: 10.1080/1369118X.2016.1154087

Comment on ‘Val Muscat’s EDC Lifestream’ by vmuscat

Comments on Val’s Visual Artefact

I was really impressed by this and I think you have shown the juxtaposition of the ‘old’ and ‘new’ spaces within education very well. I would posit that, even in today’s 21st-century classrooms, there are still significant numbers of educators who would be far more at home sitting at the left-hand desk than at the one on the right. I can immediately think of a handful of individuals in my own institution who would fall into this category – the proud Luddites, digital laggards and techno-sceptics, who resist, bemoan and detract at even the tiniest suggestion of technological advancement within their teaching practice.

However, your image could also represent the theme of digital divides that exists in technology-enhanced learning, and this is one of the key themes explored in the Digital Education in a Global Context module. It considers how there are deep regional and global divisions in the way technology is accessed and utilised for education – the ‘haves’ and the ‘have-nots’ and all those in between. It also examines how, as technology advances, cultural norms change rapidly, and discusses how the need to keep apace with those norms creates a high degree of friction amongst educators and their students. I wonder how many learning spaces and educators around the world are sitting comfortably at the right-hand desk? Very thought provoking 😊

https://www.thinglink.com/scene/1279856915081330690

Ewins, R. (2019) White Paper on Digital Divides, from the Digital Education in a Global Context module.

Comment on ‘Charles’s EDC Lifestream’ by cboyle

Comments on Charles’s Visual Artefact

Hi Charles,

Thank you for your beautiful pictures. I have been to Budapest many times, and I have an apartment just near Heroes Sq, so it holds a very special place in my heart.

As a teacher of secondary history, your quotation had real poignancy for me, as it is something I often proclaim to my students, particularly when teaching topics that show humans making the same mistakes over and over (WW1 followed by WW2, African slavery in America followed by the Jim Crow era, etc.). It also brought to mind a similar quotation from the German philosopher Georg Hegel, who said ‘We learn from history that we do not learn from history’.
I would echo what Jeremy has said about how this applies to digital education, in that it seems we are perhaps repackaging the same thing and delivering it over and over again with nothing more than a rebrand.

I would also extend those comments to education on a wider scale. Every few years teachers are promised paradigm shifts in educational practice through new delivery methods, pedagogies and approaches to learning. But speaking to older colleagues, these only engender a feeling of déjà vu, and a ‘been there, done that’ mentality.

But I wonder, does the ever-changing vernacular of digital education really matter? Surely the rapidly evolving nature of technology is what is significant in making educational change. Where pedagogies and schools of thought can be cyclical, emerging technologies offer an onward and upward trajectory that can be truly transformative in education.

‘Those who cannot remember the past are condemned to repeat it.’ (George Santayana)

 

Comment on ‘Jon Jack’s EDC Lifestream’ by jjack

Great video Jon. At first, I thought this was what had really happened in the production of your visual artefact, so you had me going all the way to the end!

You say: ‘some techno evangelists will believe that all thing technical will enhance the experience regardless of how it’s used’.

I have known a few of these individuals in my time in education and they are difficult characters to manage. I used to know one head of school who was such an ardent believer in the power of technology for pushing boundaries in education that he brought in any technological development he came across. The only problem was that this was done indiscriminately and without due diligence, and it resulted in a number of platforms and technologies working in direct competition with each other. The result was a ‘Gordian knot’ of tech that was difficult, and in some cases near impossible, to disentangle.

https://edc20.education.ed.ac.uk/jjack/2020/02/02/visual-artefact-week-3/

Comment on ‘Adrienne O’ Mahoney’s EDC Lifestream’ by amahoney

Comment on Adrienne’s Visual Artefact

Hi Adrienne. Thank you for sharing your visual artefact. ‘The cyborg is a feature of social reality, as well as science fiction.’ This theme has also had an impact on me, and has helped me to realign my understanding of what we mean by the term cyborg. Prior to starting the course, I had little concept of the ‘social reality’ of cyborgs and had not fully considered the real-world application of the terminology. The use of this quotation in your artefact had poignancy for me.

However, I would challenge your assertion that popular culture still depicts female cyborgs as vulnerable. I would ask you to reconsider this by looking at recent examples such as the female Terminator (the T-X) in Terminator 3 (2003), as well as the Seven of Nine character from the Star Trek: Voyager series and, more recently, Star Trek: Picard (1997 – 2001, 2020). Both are represented through a strong and tenacious characterisation. Is there perhaps a wind of change, or are these anomalies in how the female cyborg is represented in popular culture?

https://media.heanet.ie/page/2be5a5d9a18d4b79908482d1cd8ff7aa

 

Comment on ‘Teaching @DigitalCultures’ by dyeats

Comments on Human Digital Screenome Memoir

A really great visual artefact here David. Thank you for sharing this. I had never heard of the Human Screenome Project before, but it certainly makes me consider its practical use and potential impact.

I recently delivered an assembly to our Year 10 students on the topic of ‘screen time’. As a prop in the assembly I shared data from the app Moment on my iPhone, to visualise how much screen time I had personally spent on my device over the previous 4 weeks. As it turned out, it was quite a lot, and that didn’t include hours spent on my iPad, laptop or desktop etc. It was quite shocking actually!

I also couldn’t work out specifically what I had been doing over that time… It felt like a lot of ‘lost’ hours. So, something of this nature, which adds an extra layer of data on the specific patterns of online activity, would certainly have been useful. There is definitely a practical benefit that this type of data could offer users.

That said, I don’t think I would have been so willing to share that visual data with my audience, and so privacy is definitely an issue to consider.

Thank you again for sharing this.

https://www.youtube.com/watch?v=Se7L5zDVNPs&feature=youtu.be

Cybercultures Visual Artefact Feedback – In reply to dyeats #MSCEDC https://t.co/WRDuiYR03A by bkerr

In reply to dyeats

Thank you for your comments David.

Yes, I did see ‘Years and Years’ when it first screened. However, I was far more drawn to the programme by its socio-political storylines than by its commentary on the development of technology. In fact, as the series progressed and the daughter went ever further down her transhumanist journey, I became increasingly frustrated as a viewer and felt that the show was deviating from what was a hard-hitting imagining of a (not-too-distant or implausible) dystopia created by Trumpian policy, Brexit Britain, the migrant crisis and a whole host of other ‘real-life’ contemporary issues. At the time, the transhuman storyline just didn’t ring quite true for me.

It’s odd how my perception of the show has changed since starting this course and delving more and more into scholarly analysis of transhumanism and posthumanism. Having re-evaluated the characterisation of the transhumanist daughter, it is possible to see that her extropian ideals are actually widely mirrored today, and the idea of biohacking is gaining traction amongst younger people. The scene with the wonky cybernetic eye implant, installed by back-street charlatans, may not be that far removed from the reality of our near future.

I think you also raise some interesting points here about the ownership of technology, and the tensions that could arise, particularly in terms of governmental/corporate ownership and how much control they could assert over posthumans. As Hayles states, “consider the six-million dollar man… As his name implies, the parts of the self are owned, but they are owned precisely because they were purchased, not because ownership is a natural condition”. She goes on to say, “similarly, the presumption that there is agency, desire or will belonging to the self and clearly distinguished from the “wills of others” is undercut in the posthuman, for the posthuman’s collective heterogenous quality implies a distributed cognition located in disparate parts that may be in only tenuous communication with one another.” How will we be able to reconcile this dichotomy between self and ownership in a posthuman world? Certainly, within an educational context, there is already tremendous challenge with regard to the ownership of technology and how it could or should be used for educational purposes. How much more difficulty and tension will schools and colleges face when these issues are being discussed within a transhumanist/posthuman environment?

Hayles, N. K. (1999) ‘Towards embodied virtuality’, in How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press, pp. 1-24.

source https://edc20.education.ed.ac.uk/bkerr/2020/02/01/block-1-cybercultures-visual-artefact-mscedc-https-t-co-wrduiyr03a/#comment-19

Cybercultures Visual Artefact Feedback – In reply to jknox

In reply to jknox.

I think you raise a very interesting point here, Jeremy. Much of what has been discussed in terms of technological enhancements to humans has been from a wider social perspective. If we look at the issue from a narrower point of view, viz. the technological enhancement of school children, I wonder what impact transhumanism would have on our education systems.
Take, for instance, the most basic of the ‘real life’ enhancements from my video – the insertion of a data chip into the back of an individual’s hand. What educational opportunities would this afford, should schools and colleges insist on turning their students into an army of mini-cyborgs? There would certainly be many benefits from an administrative point of view; for example, schools would be able to take expedient and instantaneous attendance registers as children walk through the school gates, an essential part of their safeguarding procedures. In addition to this, cashless cafeterias would ensure the speedy distribution of lunches and collate data on the types of food children are eating, with this being visible to both parents and teachers. Furthermore, this could form the basis for health-based discussions and schemes of work centred around real-life consumption data. An implanted chip that recorded biometric data from the child’s body, in the same manner as a Fitbit or Apple Watch, would also be highly valuable for PE departments in devising personalised fitness plans and setting classes according to physical ability. The list of educational benefits could go on….

However, the ethical concerns surrounding what is being suggested here are glaringly obvious, particularly in reference to privacy and the Orwellian oversight of young people. As such, to my mind, the acceptance of this type of technical enhancement within mainstream education is a non-starter, certainly for primary and secondary aged children. I cannot imagine any teacher, senior manager, head of school, or politician who would be able to put forward a convincing enough ‘educational’ argument to supersede the ethical implications of doing such a thing. The educational arguments are strong, but surely the ethical considerations will always win for parents and children.

Cybercultures Visual Artefact Feedback: In reply to Crouchipuss

In reply to Crouchipuss.

Thank you so much for your comments. I’m glad you enjoyed my visual artefact.

Yes, I was trying to encapsulate the view argued by Knox (2015) that digital education has “largely shifted away from the phase of cybercultures, towards the view of an educational world in which technology is more firmly embedded, but importantly subservient to its human users”. The dystopian image of cybercultures and cyborgs that has been diffused to us through science fiction is exactly that – fiction! Knox says the “next phase of education and digital cultures reveals a pacification and instrumentalism of technology for predefined social ends”. I had hoped that the latter half of the video was able to represent this change in how cybercultures can be viewed and imagined in more positive ways.

I think you draw a very interesting parallel between those who have altered the human form through technology and those who have done so through gender reassignment. I agree that both would certainly receive prejudice rooted in ‘otherism’ and in the belief that the individual has done something ‘unnatural’. However, I think that fear and hatred of the cyborg is perhaps fuelled more by a mistrust of technology, as well as by the augmented abilities that a technological enhancement may provide to a human. So whilst there are some similarities in the prejudice, I don’t think it’s exactly the same.