For my little piece on Algorithmic Play, I decided to look at the Instagram AI and how it assesses my likes and dislikes, and, from there, how it suggests people and businesses for me to like in the future. I focused mainly on the “people feed” of my Instagram account.
The link to the short (7-minute) presentation is here.
I covered how Instagram is owned by Facebook and how the two companies share algorithmic code, as well as user contacts and profiles. I discuss how Instagram makes suggestions for me based on my search history, my demographic, and my previous likes. I note how paid subscriptions are also entering my feed and being cross-referenced with my demographic and search history, and how all of this demonstrates an “increased entanglement of agencies in the production of knowledge and culture” (Knox 2015).
The transcript of the audio presentation is also attached below.
I really enjoyed your presentation. It showed so clearly how every webpage we use can change the recommendations in our accounts. At the same time, it raises the concern about the extent to which the algorithm will affect our social lives.
Hi Adrienne,
You really went into depth in your presentation about the possible triggers that might have influenced your Instagram feeds. I was surprised at some of the connections you made, which make so much sense, especially the link to other platforms providing cookies/trails for Instagram to pick up, and even the use of other devices that could have led to the suggestions that came up. Thank you for your insight. I always (perhaps naively) believed that corporations stuck to the data they gathered themselves, but imagine if all major platforms shared data between them… that would explain so many things.
Great artefact, and I agree with Valerian: it was really interesting to see that friend suggestions might come from other applications than just Instagram itself. I was not surprised to find out about Insta search behaviour, demographics, and friends-of-friends suggestions, but I was surprised that suggestions might also come from Google search, IP-based tracking of online search behaviour, and even your podcast app! This inter-tech-company entanglement is a crucial outcome of your research!
The concern about who is managing and benefiting from a specific algorithm design is so valid when we think about AI-based learning LMS. It is important to research the applications, programs, and systems we or our learners are working with.
Great artefact, Adrienne, and I really like the annotated screencast format – it works brilliantly with the subject matter!
Fascinating (and perhaps sobering!) how you pinned down aspects which were influenced by activity outside of your Instagram account, such as your search history. Reminds me of a quote from Seaver (2013: 10) I came across which builds on the “black box metaphor” to argue that ‘these algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them’:
Seaver, N., 2013. Knowing Algorithms. Presented at the Media in Transitions 8, Cambridge, MA. Available from: https://static1.squarespace.com/static/55eb004ee4b0518639d59d9b/t/55ece1bfe4b030b2e8302e1e/1441587647177/seaverMiT8.pdf [Accessed 11 Mar 2020].
It’s great that you’ve discussed multiple algorithmic systems and the wider context in that sense since, as you also note, it is all too often the case that a handful of big tech companies own many of these different “apps” (such as Facebook owning Instagram). As you note, drawing on Williamson (2017), the business models, entrepreneurial cultures, and commercial and political agendas are all a huge factor here. The Silicon Valley model and associated ideologies are aspects that also came to the fore in my brief “play” with Coursera. Really clear and thorough conclusions here!
Great work – really enjoyed it!
Hi Adrienne,
Really enjoyed how much you were able to expose the workings of algorithms across major platforms.
I think my key realisation from what you said and the references you drew on from Knox, Williamson and Kitchin was that if you really look, the work that algorithms do is clearly visible in our consumption of culture now.
The entanglement of the human and non-human in generating our own digital culture is clear to see. Even if we don’t understand the underlying science of algorithms, their behaviour and impact can’t be ignored.
You made excellent connections and analysed what you saw in your suggestions very carefully. The analysis of the impact of demographic data caught my attention too. This is where an argument can be made about the motivations behind algorithm design. On the one hand, we could say the system works to benefit us, because there is simply too much data for any one individual to wade through in order to find something they like or something that is relevant to them.
On the other, this creates an echo chamber of ideas and of people who think and look like the ‘customer’ using the algorithm. Thus, as we have seen in recent history, it becomes much easier to politically manipulate vast swathes of a particular demographic without any oversight or awareness from the regulatory systems.
Great artefact, I’m glad I’ve finally had time to look at it.
Thanks!