Week 10 – Algorithmic Play

https://www.timetoast.com/timelines/2214815

Choosing my social media – For my algorithmic play activity, I decided to focus on my most frequently visited social media site – YouTube. I watch a variety of YouTube clips almost every evening as one of my primary sources of news, and I often follow a pathway of clips recommended by the YouTube algorithms.
I was therefore interested to monitor more closely how the algorithms ‘guide’ my recommended viewing and personal decision making, and whether or not I was actually in the driving seat.

I decided to start with the type of video that I watch almost every day – The View. This is a daytime American talk show, co-hosted by four women, including Whoopi Goldberg. The show is highly political in nature, and discussion amongst the women mostly centres on American politics, in particular the Trump Administration.

Methodology – Before starting the algorithmic play I decided to delete my watch history, to avoid algorithmic influence from previous YouTube sessions and other videos I had watched.
When the sidebar of recommendations was shown, I opted for clips that piqued my personal interest, always choosing one from the top six recommended videos. I continued to click through the recommendations for over 30 videos and recorded where they took me. The results of this pathway are shown on the Timetoast timeline attached to this blog.
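
To keep the recording consistent from clip to clip, each selection can be logged in a small structured file as it is made. Below is a minimal sketch of how such a log might be kept, written in Python; the file name, field names and theme labels are my own assumptions for illustration, not anything exposed by YouTube itself.

```python
import csv
import os
from datetime import datetime

LOG_FILE = "algorithmic_play_log.csv"  # hypothetical file name
FIELDS = ["timestamp", "step", "title", "channel", "theme", "sidebar_position"]

def log_selection(step, title, channel, theme, sidebar_position):
    """Append one manually observed selection to the CSV log.

    sidebar_position is the clip's rank (1-6) in the recommendation
    sidebar at the moment it was chosen; themes are free-text labels
    separated by '/' when a clip carries more than one.
    """
    write_header = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "step": step,
            "title": title,
            "channel": channel,
            "theme": theme,
            "sidebar_position": sidebar_position,
        })

# Example: the starting clip (title here is illustrative, not exact).
log_selection(1, "The View on the Trump administration's Covid response",
              "The View", "anti-Trump / coronavirus", 1)
```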

Reflections – It is clear that starting the algorithmic play with a television programme such as The View resulted in a preordained pathway being laid down by the algorithms. The View, as mentioned, is highly political, with three-quarters of the panel coming from liberal, left-wing backgrounds. The programme features a high degree of ‘Trump-bashing’ and, although there is one Republican panellist, Meghan McCain, unlike the majority of the Republican Party she is an ardent and outspoken critic of President Trump. The initial clip I watched focused on the Trump Administration’s lacklustre response to the Coronavirus pandemic.
It seems that the tenor and tone of this particular clip were highly influential on the subsequent pathway recommended by the algorithms. Each of the clips that followed was imbued with one or more of the following themes (a sketch for reproducing the tally follows the list):

• Left-wing, liberal news organisations, e.g. Vox Media, CNN, MSNBC (20 clips)
• Anti-Trump (9 clips)
• Coronavirus Pandemic (7 clips)
• Race relations (4 clips)
• Brexit (2 clips)
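
These theme counts overlap, since one clip often carries more than one theme, which is why they sum to more than 30. As promised above, a few lines of Python could reproduce the tally from the hypothetical log; the file and column names are carried over from the earlier sketch.

```python
import csv
from collections import Counter

# Tally theme labels across the recorded pathway. A clip can carry
# several themes (separated by '/'), so counts may sum to more than
# the number of clips watched.
with open("algorithmic_play_log.csv", newline="", encoding="utf-8") as f:
    theme_counts = Counter(
        theme.strip()
        for row in csv.DictReader(f)
        for theme in row["theme"].split("/")
    )

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} clips")
```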

At several points the algorithms restricted my options, limiting what I could see and choose for a period of time. In particular, this happened when I selected the first of the Vox Media clips, which resulted in my being ‘stuck’ with only Vox videos to choose from for another 12 selections.
In order to change the algorithm’s options, I purposely selected a video clip that would set a new direction. This worked, and I was able to ‘escape’ the Vox loop and move on to content created by other organisations, although still within left-wing, liberal media.
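
Being ‘stuck’ in this way can be described precisely as an unbroken run of selections from a single channel. A sketch of how such runs could be detected in the same hypothetical log:

```python
import csv
from itertools import groupby

# Find the longest unbroken run of selections from one channel,
# e.g. the run of 12 consecutive Vox clips described above.
with open("algorithmic_play_log.csv", newline="", encoding="utf-8") as f:
    channels = [row["channel"] for row in csv.DictReader(f)]

runs = [(channel, sum(1 for _ in group))
        for channel, group in groupby(channels)]
channel, length = max(runs, key=lambda run: run[1])
print(f"Longest loop: {length} consecutive clips from {channel}")
```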

On reflection, there was certainly a loop of information, with the algorithms directing me to clips with very similar themes. At no time was I directed towards Fox News or other right-wing media outlets.

It is clear that with this type of ‘algorithmic power’ or ‘algorithmic governance’ there is a threat of algorithms giving rise to confirmation bias in users. This is highly problematic, particularly in a society that is extremely polarised in its political opinions. How can society ably solve problems if it is unable to see the other side of an argument objectively? If algorithms do not show me alternative political opinions, how will I ever be able to understand opposing perspectives? This type of algorithmic echo chamber is therefore very dangerous. This has congruence with the view put forward by Rob Kitchin, who states that “Far from being neutral in nature, algorithms construct and implement regimes of power and knowledge…. Algorithms are used to seduce, coerce… regulate and control: to guide and reshape how people… and objects interact with and pass through various systems…” (Kitchin, 2017, 19).

Ethical Issues – There are certainly some ethical issues to consider. For instance, having seen my list of viewed videos, there would be very little doubt as to which end of the political spectrum I belonged. Could this data be misused or manipulated? Do my political affiliations still hold the same degree of privacy as they did in the past, now that such data is widely available to large companies and organisations?

Another ethical consideration is how to disentangle the private from the professional. As a user of YouTube both at home and at work, it is important to ensure that the algorithms do not unnecessarily reveal private data and personal preferences in a professional setting, for example by using separate log-ins in each setting.

References
Kitchin, R. (2017) Thinking Critically about Researching Algorithms. Information, Communication & Society, 20:1, 14-29. DOI: 10.1080/1369118X.2016.1154087

Liked on YouTube: The Truth About Algorithms | Cathy O’Neil

Some key takeaways from this short clip that have congruence with the themes of the Algorithmic Cultures block.

Cathy O’Neil argues that presenting algorithms as objective fact is a lie. She says “a much more accurate description of an algorithm is that it’s an opinion embedded in math”. “There’s always a power element here,” she adds, and “every time we build an algorithm, we curate our data, we define success, we embed our values into algorithms.”

Liked on YouTube: How algorithms shape our world – Kevin Slavin

There are some strong links with the core reading in this YouTube clip, which identifies the ‘pervasive’ nature of algorithms and how they shape and mould everyday life. This is similar to David Beer’s argument that “algorithmic systems feed into people’s lives, shaping what they know, who they know, what they discover and what they experience. The power of algorithms here is in their ability to make choices, to classify, to sort, to order and to rank.” (Beer, 2017, 6)

References

Beer, D. (2017) The social power of algorithms. Information, Communication & Society, 20:1, 1-13.

Liked on YouTube: Digital Anthropology Daniel Miller

This YouTube clip was excellent! It really helped me to contextualise the relationship between ethnography and digital studies. Leading anthropologist Daniel Miller, from University College London, says it is “the very best way of understanding the digital, in as much as when we talk about the digital, clearly we’ve got to be interested in the consequences that it has for people” and “that is really why this is the right domain for anthropology.”

Liked on YouTube: The Online Community-A New Paradigm: Mark Wills at TEDxSanLuisObispo

Lister (2009) attempts to define the meaning of ‘online community’ in the context of an environment where ‘the participants are there but not there, in touch but never touching, as deeply connected as they are profoundly alienated’ (Lister, 2009, 209). This poses a number of challenges for definitively identifying an ‘online community’.

Mark Wills’ TEDx talk aims to show how he believes a real community can be fostered within an online setting, by focusing attention on four criteria for success – longevity, shared values, community management and trust.

Lister, M. (2009) ‘Networks, users and economics’ in New Media: A Critical Introduction, pp. 163-236. London: Routledge.

Liked on YouTube: “Introduction to communities of practice,” (Wenger-Trayner, 2015)

Saadatdoost et al. posit that “cohesion in a MOOC community is brought about by the domain of doubts, questions, new knowledge, experiences and the community of learners who meet people around the world with similar interests” (Saadatdoost et al., 2014, abstract). They go on to discuss how this community of practice, as outlined in the clip, has largely been left undiscussed in the study of MOOCs. Interestingly, the clip does not offer MOOCs as an example of where a community of practice could be revealed.

References 

Saadatdoost, R., Sim, A.T.H., Mittal, N., Jafarkarimi, H. and Hee, J.M. (2014) ‘A Netnography Study of MOOC Community’, PACIS 2014 Proceedings, 116. http://aisel.aisnet.org/pacis2014/116

Liked on YouTube: Biohacker Explains Why He Turned His Leg Into a Hotspot | WIRED

Biohacker Michael Laufer recently had a 512GB drive implanted in his leg, which can store data, stream music or movies, and power a hot spot and mesh network. It’s called the PegLeg, and WIRED’s Daniel Oberhaus spoke with Laufer about the device and the field of biohacking.

For more of Daniel’s reporting on Laufer, his PegLeg and Biohacking technology, visit WIRED.com: https://ift.tt/2HAdH5o

Liked on YouTube: Experimenting with Biochip Implants

Humanity just made a small, bloody step towards a time when everyone can upgrade themselves towards being a cyborg. Of all places, it happened in the back room of a studio in the post-industrial German town of Essen.

It’s there that I met up with biohacker Tim Cannon, and followed along as he got what is likely the first-ever computer chip implant that can record and transmit his biometric data. Combined in a sealed box with a battery that can be wirelessly charged, it’s not a small package. And as we saw, Cannon had it implanted directly under his skin by a fellow biohacking enthusiast, not a doctor, and without anesthesia.

Called the Circadia 1.0, the implant can record data from Cannon’s body and transfer it to any Android-powered mobile device. Unlike wearable computers and biometric-recording devices like Fitbit, the subcutaneous device is open-source, and allows the user full control over the data.

Liked on YouTube: Could you live without a smartphone? | Anastasia Dedyukhina | TEDxWandsworth

https://youtu.be/uNQujCwCu88

Anastasia Dedyukhina ditched her smartphone, together with her senior international career in digital marketing, when she realized how dependent she had become on the gadget. Today she acts as a business mentor, supporting ethical tech startups, and runs Consciously Digital, helping companies and individuals be more productive and less stressed in an age of digital distraction. In her talk, Anastasia will explain why we feel the uncontrollable urge to check our smartphones all the time and share the valuable lessons she learned and the tips that helped her find the balance between her online and offline life.

Having worked for 12+ years in senior digital marketing positions for global media and internet brands, and easily spending 16 hours a day connected and even sleeping with her phone, Anastasia eventually realised she needed to unplug to remain healthy and productive.
Giving up her smartphone was the first step to creating Consciously Digital – a London-based training and coaching company that helps individual and corporate clients be more productive online, so that they can have more time for things that matter.
Anastasia is a frequent speaker at global internet conferences on the topics of ethical tech and digital detox, as well as marketing in the age of digital distraction. She blogs for Huffington Post about digital detox, and is currently finalising her first book on the same subject. Anastasia was born in Russia, lived in six different countries, and has an MBA from SDA Bocconi (Italy) and NYU Stern (USA), and a PhD from Moscow State University.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx