Week 10 – Algorithmic Play

https://www.timetoast.com/timelines/2214815

Choosing my social media – For my algorithmic play activity, I decided to focus on my most frequently visited social media site – YouTube. I watch a variety of YouTube clips almost every evening as one of my primary sources of news, and I often follow a pathway of clips recommended by the YouTube algorithms.
I was therefore interested in monitoring more closely how the algorithms ‘guide’ my recommended viewing and personal decision making, and whether or not I was actually in the driving seat.

I decided to start with the type of video that I watch almost every day – The View. This is a daytime American talk show co-hosted by four women, including Whoopi Goldberg. The show is highly political in nature, and discussion amongst the women mostly centres on American politics, in particular the Trump Administration.

Methodology – Before starting the algorithmic play, I deleted my watch history to avoid algorithmic influence from previous YouTube sessions and other videos I had watched.
When the sidebar of recommendations appeared, I opted for clips that piqued my own personal interest, always choosing one from the top six recommended videos. I continued to click through the recommendations for over 30 videos, recording where they took me. The results of this pathway are shown on the Timetoast attached to this blog.

Reflections – It is clear that starting the algorithmic play with a television programme such as The View resulted in a preordained pathway being laid down by the algorithms. The View, as mentioned, is highly political, with three-quarters of the panel coming from liberal, left-wing backgrounds. The programme has a high degree of ‘Trump-bashing’ and, although there is one Republican panellist, Meghan McCain, unlike the majority of the Republican Party she is an ardent and outspoken critic of President Trump. The initial clip I watched focused on the Trump Administration’s lacklustre response to the Coronavirus pandemic.
It seems that the tenor and tone of this particular clip were highly influential on the subsequent pathway recommended by the algorithms. Each of the clips that followed was imbued with one or more of the following themes:

• Left-wing liberal news organisations, e.g. Vox Media/CNN/MSNBC (20 clips)
• Anti-Trump (9 clips)
• Coronavirus Pandemic (7 clips)
• Race relations (4 clips)
• Brexit (2 clips)

At several points the algorithms restricted my options, limiting what I could see and what I could choose for a period of time. This happened in particular when I selected the first of the Vox Media clips, which left me ‘stuck’ with only Vox choices for another 12 selections.
In order to change the algorithm’s options, I purposely selected a video clip that would create a new direction. This worked, and I was able to ‘escape’ the Vox loop and move on to content created by other organisations, although still within left-wing, liberal media.

On reflection, there was certainly a loop of information, with the algorithms directing me to clips with very similar themes and content. At no time was I directed towards media such as Fox News or other right-wing media groups.

It is clear that with this type of ‘algorithmic power’ or ‘algorithmic governance’, there is a threat of algorithms giving rise to confirmation bias in users. This is highly problematic, particularly in a society that is extremely polarised in its political opinions. How can society ably solve problems if it is unable to see the other side of an argument objectively? If algorithms do not show me alternative political opinions, how will I ever be able to understand opposing perspectives? This type of algorithmic echo chamber is therefore very dangerous. It has congruence with the view put forward by Rob Kitchin, who states that “Far from being neutral in nature, algorithms construct and implement regimes of power and knowledge…. Algorithms are used to seduce, coerce… regulate and control: to guide and reshape how people… and objects interact with and pass through various systems…” (Kitchin, 2017, 19).

Ethical Issues – There are certainly some ethical issues to consider. For instance, there would be very little doubt, having seen my list of viewed videos, as to which end of the political spectrum I belonged to. Could this data be misused or manipulated? Do my political affiliations no longer hold the same degree of privacy as they did in the past, now that such data is widely available to large companies and organisations?

Another ethical consideration is how to disentangle the private from the professional. As a user of YouTube both at home and at work, it is important to ensure that the algorithms do not unnecessarily reveal private data and personal preferences in a professional setting, for example by ensuring that the appropriate login is used in each setting.

References
Kitchin, R. (2017) Thinking Critically about Researching Algorithms. Information, Communication & Society, 20:1, 14–29. DOI: 10.1080/1369118X.2016.1154087

“Algorithms are still made by human beings, and those algorithms are still pegged to basic human assumptions… They’re just automated assumptions. And if you don’t fix the bias, then you are just automating the bias.” https://t.co/dWrDdN1FyL

Liked on YouTube: The Truth About Algorithms | Cathy O’Neil

Some key takeaways from this short clip, that have congruence with the themes of the Algorithmic Cultures block.

Cathy O’Neil argues that presenting algorithms as objective fact is a lie. She says “a much more accurate description of an algorithm is that it’s an opinion embedded in math”. “There’s always a power element here,” she adds, and “every time we build an algorithm, we curate our data, we define success, we embed our values into algorithms.”


Algorithms of Oppression

“Part of the challenge of understanding algorithmic oppression is to understand that the mathematical formulations that drive automated decisions are made by human beings. While we often think of terms such as ‘big data’ and ‘algorithms’ as being benign, neutral, or objective, they are anything but. The people who make these decisions hold all types of values, many of which openly promote racism, sexism and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors.”

– Noble, S. (2018) Algorithms of Oppression

Article showing congruence with Rob Kitchin’s view that ‘we are entering a widespread era of algorithmic governance, where algorithms will play an increasing role in the exercise of power’ (Kitchin, 2017) https://t.co/A2hbTjrAsl via @Technology_NS

Liked on YouTube: How algorithms shape our world – Kevin Slavin

Some strong links with the core reading in this YouTube clip, which identifies the ‘pervasive’ nature of algorithms and how they shape and mould everyday life. This is similar to David Beer’s argument that “algorithmic systems feed into people’s lives, shaping what they know, who they know, what they discover and what they experience. The power of algorithms here is in their ability to make choices, to classify, to sort, to order and to rank.” (Beer, 2017, 6)

References

Beer, D. (2017) The social power of algorithms. Information, Communication & Society, 20:1, 1–13.

Liked on YouTube: Digital Anthropology Daniel Miller

This YouTube clip was excellent! It really helped me to contextualise the relationship between ethnography and digital studies. Leading anthropologist Daniel Miller, from University College London, says ‘the very best way of understanding the digital, in as much as when we talk about the digital, clearly we’ve got to be interested in the consequences that it has for people’ and ‘that is really why this is the right domain for anthropology.’


BBC Sounds: The Documentary: MOOCs

BBC Sounds Podcast: Click to access podcast

Some key takeaways from the podcast:

“Could these new free online university courses open higher education up to parts of the world in a way that’s been unthinkable up until now, or are MOOCs an experiment that could destroy centuries of tradition?”

Amazing fact – at Harvard, more people have signed up to its MOOCs than have graduated in its almost 400-year history!!

Liked on YouTube: The Online Community-A New Paradigm: Mark Wills at TEDxSanLuisObispo


Lister (2009) attempts to define the meaning of ‘online community’ in the context of an environment where ‘the participants are there but not there, in touch but never touching, as deeply connected as they are profoundly alienated’ (Lister, 2009, 209). This poses a number of challenges for definitively identifying an ‘online community’.

Mark Wills’ TEDx talk aims to show how he believes a real community can be fostered within an online setting, by focusing attention on four criteria for success – longevity, shared values, community management and trust.

Lister, M. (2009) ‘Networks, users and economics’ in New media: a critical introduction, pp. 163–236. London: Routledge.

Liked on YouTube: “Introduction to communities of practice,” (Wenger-Trayner, 2015)

Saadatdoost et al. posit that “cohesion in a MOOC community is brought about by the domain of doubts, questions, new knowledge, experiences and the community of learners who meet people around the world with similar interests” (Saadatdoost et al., 2014, abstract). They go on to discuss how this community of practice, as outlined in the clip, has largely been left undiscussed in the study of MOOCs. Interestingly, the clip itself does not offer MOOCs as an example of where a community of practice could be revealed.

References 

Saadatdoost, R., Sim, A. T. H., Mittal, N., Jafarkarimi, H. and Hee, J. M. (2014) ‘A netnography study of MOOC community’, PACIS 2014 Proceedings, 116. http://aisel.aisnet.org/pacis2014/116

A dystopian novel that imagines the opposite of Lee’s vision of ‘more control’. Instead it shows an oppressive, divided society, examining the possible winners and losers of human immortality. An easy read, but it has some links to the themes we have explored in the cybercultures block.

Newton Lee in The Transhumanism Handbook (2019) defines transhumanism as ‘using science and technology to enhance or alter our body chemistry in order to stay healthy, and be in more control of our lives’. This brought to mind the novel The Suicide Club https://t.co/fmrRZVN58P

Liked on YouTube: Biohacker Explains Why He Turned His Leg Into a Hotspot | WIRED


Biohacker Michael Laufer recently had a 512GB drive implanted in his leg, which can store data, stream music or movies, and power a hot spot and mesh network. It’s called the PegLeg, and WIRED’s Daniel Oberhaus spoke with Laufer about the device and the field of biohacking.

For more of Daniel’s reporting on Laufer, his PegLeg and Biohacking technology, visit WIRED.com: https://ift.tt/2HAdH5o


Liked on YouTube: Experimenting with Biochip Implants


Humanity just made a small, bloody step towards a time when everyone can upgrade themselves towards being a cyborg. Of all places, it happened in the back room of a studio in the post-industrial German town of Essen.

It’s there that I met up with biohacker Tim Cannon, and followed along as he got what is likely the first-ever computer chip implant that can record and transmit his biometrical data. Combined in a sealed box with a battery that can be wirelessly charged, it’s not a small package. And as we saw, Cannon had it implanted directly under his skin by a fellow biohacking enthusiast, not a doctor, and without anesthesia.

Called the Circadia 1.0, the implant can record data from Cannon’s body and transfer it to any Android-powered mobile device. Unlike wearable computers and biometric-recording devices like Fitbit, the subcutaneous device is open-source, and allows the user full control over the data.


A very sad but interesting article in the Guardian today. Do our daily interactions with technology mean that we are all gradually curating an indelible digital version of ourselves? Our own digital legacy of life https://t.co/EeqJYdYbzI

Great article on why the Japanese do not fear robots to the same extent as the West. It cites the religion of Shinto, which affixes spirits to humans, animals and inanimate objects, as one of the major factors. ‘All things have a bit of soul’ #MSCEdc https://t.co/tnWzIL9vFg

@harMonica1 @YouTube .. our school due to the essential and transformative role it has within their education. I think it’s more important to think about how we manage the use of technology for youngsters, so that they are educated about best practice and appropriate use.

@harMonica1 @YouTube I also teach children, but older students at secondary level. Their lives are imbued with technology and, sadly, I regularly witness its negative impacts – social disengagement, cyberbullying, tech addiction etc. That said, I could not in good conscience ever remove it from…

@JemMeganMay Fab article Jemima! It certainly makes me ponder ethical issues in such developments. Although I think that if technological boundaries can be pushed to this limit, humans will always attempt to do so, even when our ethical guidance suggests we shouldn’t. Thanks for sharing

Liked on YouTube: Could you live without a smartphone? | Anastasia Dedyukhina | TEDxWandsworth

https://youtu.be/uNQujCwCu88 Anastasia Dedyukhina ditched her smartphone, together with her senior international career in digital marketing, when she realized how dependent she had become on the gadget. Today she acts as a business mentor, supporting ethical tech startups, and runs Consciously Digital, helping companies and individuals be more productive and less stressed in an age of digital distraction. In her talk, Anastasia will explain why we feel the uncontrollable urge to check our smartphones all the time and share the valuable lessons she learned and the tips that helped her find the balance between her online and offline life.

Having worked for 12+ years in senior digital marketing positions for global media and internet brands, and easily spending 16 hours a day connected and even sleeping with her phone, Anastasia eventually realised she needed to unplug to remain healthy and productive.
Giving up her smartphone was the first step to creating Consciously Digital – a London-based training and coaching company that helps individual and corporate clients be more productive online, so that they can have more time for things that matter.
Anastasia is a frequent speaker at global internet conferences on the topics of ethical tech and digital detox, as well as marketing in the age of digital distraction. She blogs for Huffington Post about digital detox, and is currently finalising her first book on the same subject. Anastasia was born in Russia, lived in six different countries, and has an MBA from SDA Bocconi (Italy) and NYU Stern (USA), and a PhD from Moscow State University.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx

It identifies the fine line educators must tread between the advantageous application of technology as a tool to enhance learning and the often dangerous pitfalls and losses that its use (and overuse) may result in.

As a teacher in secondary education, I found that this video resonated with me. Youngsters increasingly view mobile technology as an extension of themselves and, as suggested by Miller (2009), such devices have ‘achieved an intimacy with their users that other technologies have yet to match’.

@Irene72767440 Interestingly, this was the same Olympics in which the now disgraced runner Oscar Pistorius competed. Critics argued that his prosthetic ‘blades’ gave him an unfair advantage over his able-bodied competitors!

@Irene72767440 Absolutely, there’s no doubt the intention of the campaign was to emphasise strength of character rather than physical enhancement. That said, it’s difficult not to reflect on the idea of homo faber, the maker and user of technology, and the resulting symbiosis that occurs.

Simmel (1971) ‘characterised the human desire to manipulate inorganic matter and create tools and machines as a way of overcoming bodily boundaries and limitations in the pursuit of physical transcendence’.

Just read Vincent (2011) ‘The Body and Information Technology’. Fascinating stuff… My father received a cochlear implant in 2010 – a means of using technology to ‘normalise’ his condition. I had never viewed him as a cyborg until now https://t.co/w1QKVUFQni