https://www.timetoast.com/timelines/2214815
Choosing my social media – For my algorithmic play activity, I decided to focus on my most frequently visited social media site – YouTube. I watch a variety of YouTube clips almost every evening as one of my primary sources of news, and I often follow a pathway of clips recommended by the YouTube algorithms.
With this in mind, I was interested to monitor more closely how the algorithms ‘guide’ my recommended viewing and personal decision making, and whether or not I was actually in the driving seat.
I decided to start with the type of video that I watch almost every day – The View. This is a daytime American talk show, co-hosted by four women, including Whoopi Goldberg. The show is highly political in nature, and discussion amongst the women mostly centres on American politics, and in particular the Trump Administration.
Methodology – Before starting the algorithmic play, I deleted my watch history, to avoid algorithmic influence from previous YouTube sessions and other videos I had watched.
When the sidebar of recommendations appeared, I opted for clips that piqued my own personal interest, always choosing one from the top six recommended videos. I clicked through more than 30 recommendations in this way, recording where the pathway took me. The results are shown on the Timetoast timeline attached to this blog.
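I recorded each step by hand, but the same log could have been captured with a few lines of code. Below is a minimal Python sketch of such a logger; the file name, column names and example entry are my own hypothetical illustration, not part of the actual method.

```python
import csv
from datetime import datetime
from pathlib import Path

# Hypothetical log file and columns -- my own illustration, not part of
# the original method, which was recorded by hand on Timetoast.
LOG_FILE = Path("algorithmic_play_log.csv")
FIELDS = ["timestamp", "step", "title", "channel", "sidebar_position", "theme"]

def log_selection(step, title, channel, sidebar_position, theme):
    """Append one chosen recommendation to the CSV log."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "step": step,
            "title": title,
            "channel": channel,
            "sidebar_position": sidebar_position,  # 1-6, per the method above
            "theme": theme,  # e.g. "Anti-Trump; Coronavirus Pandemic"
        })

# Example: a hypothetical entry for the starting clip of the play.
log_selection(1, "The View discusses the Coronavirus response", "The View",
              1, "Anti-Trump; Coronavirus Pandemic")
```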
Reflections – It is clear that starting the algorithmic play with a television programme such as The View resulted in a preordained pathway being laid by the algorithms. The View, as mentioned, is highly political, with three-quarters of the panel coming from liberal, left-wing backgrounds. The programme has a high degree of ‘Trump-bashing’ and, although there is one Republican panellist, Meghan McCain, unlike the majority of the Republican Party she is an ardent, outspoken critic of President Trump. The initial clip I watched focused on the Trump Administration’s lacklustre response to the Coronavirus pandemic.
It seems that the tenor and tone of this particular clip was highly influential on the subsequent recommended pathway suggested by the algorithms. Each of the clips that followed was imbued with one or more of the following themes (tallied in the sketch after the list):
• Left-wing, liberal news organisations, e.g. Vox Media, CNN, MSNBC (20 clips)
• Anti-Trump (9 clips)
• Coronavirus Pandemic (7 clips)
• Race relations (4 clips)
• Brexit (2 clips)
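These counts can be reproduced mechanically from such a log. A minimal sketch, assuming the hypothetical CSV format above, where a single clip may carry more than one theme (which is why the counts sum to more than the 30+ clips watched):

```python
import csv
from collections import Counter

def tally_themes(log_path="algorithmic_play_log.csv"):
    """Count how often each theme appears across the recorded clips.

    Assumes the hypothetical CSV format above, where the 'theme' column
    may hold several labels separated by ';' (one clip can be both, say,
    a Vox Media clip and Anti-Trump).
    """
    counts = Counter()
    with open(log_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for theme in row["theme"].split(";"):
                counts[theme.strip()] += 1
    return counts

for theme, n in tally_themes().most_common():
    print(f"{theme}: {n} clips")
```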
At several points the algorithms restricted my options, limiting what I could see and choose for a period of time. In particular, this happened when I selected the first of the Vox Media clips: I was then ‘stuck’ with only Vox choices for a further 12 selections.
In order to change what the algorithm offered me, I purposely selected a video clip that would create a new direction. This worked, and I was able to ‘escape’ the Vox loop and move on to content created by other organisations, although still within left-wing, liberal media.
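This ‘stuck in a loop’ behaviour is consistent with a simple feedback dynamic: if the sidebar is dominated by clips most similar to the one currently playing, then always taking the top recommendation keeps you on the same channel, and only a deliberately lower-ranked pick moves the pathway on. The toy simulation below illustrates this; the channels, similarity scores and selection rule are entirely invented for illustration and are not YouTube’s actual recommender.

```python
# A toy model of a recommendation feedback loop. The similarity scores
# are invented; this only sketches the lock-in dynamic, not YouTube.
SIMILARITY = {
    "Vox Media": {"Vox Media": 0.9, "CNN": 0.5, "MSNBC": 0.4},
    "CNN":       {"CNN": 0.9, "MSNBC": 0.6, "Vox Media": 0.5},
    "MSNBC":     {"MSNBC": 0.9, "CNN": 0.6, "Vox Media": 0.5},
}

def sidebar(current, k=3):
    """Rank candidate channels by similarity to the clip being watched."""
    sims = SIMILARITY[current]
    return sorted(sims, key=sims.get, reverse=True)[:k]

current = "Vox Media"
for step in range(1, 9):
    recs = sidebar(current)
    # Always take the top recommendation, except one deliberate
    # 'escape' pick of a lower-ranked clip at step 6.
    current = recs[1] if step == 6 else recs[0]
    print(f"step {step}: sidebar={recs} -> watched {current}")
```

In this toy model, taking the top pick keeps the pathway on Vox Media indefinitely, while a single off-top selection shifts it onto a new, though still similar, channel – mirroring my escape from the Vox loop into other left-leaning outlets.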
On reflection, there was certainly a loop of information, with the algorithms directing me to clips with very similar themes and content. At no time was I directed towards media such as Fox News or other right-wing media groups.
It is clear that with this type of ‘algorithmic power’ or ‘algorithmic governance’ there is a threat of algorithms giving rise to confirmation bias in users. This is highly problematic, particularly in a society that is extremely polarised in political opinion. How can society ably solve problems if it is unable to objectively see the other side of an argument? If algorithms do not show me alternative political opinions, how will I ever be able to understand opposing perspectives? This type of algorithmic echo chamber is therefore very dangerous. This has congruence with the view put forward by Rob Kitchin, who states that “Far from being neutral in nature, algorithms construct and implement regimes of power and knowledge…. Algorithms are used to seduce, coerce… regulate and control: to guide and reshape how people… and objects interact with and pass through various systems…” (Kitchin, 2017, p. 19).
Ethical Issues – There are certainly some ethical issues to consider. For instance, there would be very little doubt, having seen my list of viewed videos, as to which end of the political spectrum I belonged. Could this data be misused or manipulated? Do my political affiliations no longer hold the same degree of privacy as they did in the past, now that such data is widely available to large companies and organisations?
Another ethical consideration is how to disentangle the private from the professional. As a user of YouTube both at home and at work, it is important to ensure that the algorithms do not unnecessarily reveal private data and personal preferences in a professional setting, which means ensuring appropriate logins are used in each context.
References
Kitchin, R. (2017) Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14–29. DOI: 10.1080/1369118X.2016.1154087