Week 9 Review

Week 9 was the second week of our algorithmic culture block and the week of our algorithmic play artefact, located here. In the artefact I discuss how Instagram makes suggestions for me based on my search history, my demographic, and my previous likes.

It was also the week we had to leave work and set up home offices; as IT support staff, that meant 12-hour days for my colleagues and me to get through the calls for support from faculty and staff in all aspects of online teaching. As my classmate Sean has mentioned, though, the global situation is a game changer for our sector, leading to huge opportunities.

In a roundup of this week’s blog additions, I pulled in some posts related to my Instagram artefact. One is the post on ‘the AI generator of fictitious faces’, which shows how easy it has become for AI to generate an influencer-type photograph that will get lots of Instagram ‘likes’, based on social norms for ‘what is considered attractive’; the algorithm created 10,000 fake faces over several hours.

In addition, the posts on Instagram’s use of AI here and here demonstrate how technology can no longer be seen as the passive instrument of humans; rather, it is being used as an agent by humans and non-humans to “nudge towards the predetermined path as designed by the algorithm” (Knox et al. 2020).

Seeing as there is an “increased entanglement of agencies in the production of knowledge and culture” (Knox 2015), it is very hard to drill down to see who is managing or benefiting from the behavioural governance of these algorithms. This is particularly pertinent to education and the factors influencing young people. From a privacy or discrimination point of view, educational transcripts, unlike credit reports or juvenile court records, are currently considered fair game for gatekeepers like colleges and employers, as noted in this post, and information from these can be used to implement unfair bias against potential scholarship entrants or employees.

Algorithmic code is influenced by “political agendas, entrepreneurial ambitions, philanthropic goals and professional knowledge to create new ways of understanding, imagining and intervening” in our decision making (Williamson 2017).

Therefore we should maintain a critical perspective on algorithmic culture and continue to question the “objectivity and authority assumed of algorithms” (Knox 2015), and how they are shaping, categorising and privileging knowledge, places and people. I’m glad to see that there is a growing awareness of algorithms that appear to discriminate against people based on age, race, religion, gender, sexual orientation or citizenship status, as noted in the development of an ‘accountability bill’ in the US. Culpability can be difficult to prove in these cases because companies remain very secretive about how their models work, but it is a start.

References for the readings

Kitchin, R. (2017) Thinking critically about and researching algorithms, Information, Communication & Society, 20:1, 14-29, DOI: 10.1080/1369118X.2016.1154087

Knox, J. (2015) Algorithmic Cultures. Excerpt from Critical Education and Digital Cultures. In M. A. Peters (ed.), Encyclopedia of Educational Philosophy and Theory. DOI: 10.1007/978-981-287-532-7_124-1

Knox, J., Williamson, B. & Bayne, S. (2020) Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies, Learning, Media and Technology, 45:1, 31-45, DOI: 10.1080/17439884.2019.1623251

Williamson, B. (2017) Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: The Digital Future of Learning, Policy, and Practice

Doubts About Data-Driven Schools

As we all know, the collection of data on students has become commonplace, from grades, state test scores, attendance, behavior and lateness to graduation rates. Each year, however, this information becomes ever more detailed, with more and more data points being collected per student.

Educational transcripts, unlike credit reports or juvenile court records, are currently considered fair game for gatekeepers like colleges and employers. These records, though, are getting much more detailed; how will this influence a student’s ability to gain access to these institutions in later years?

Full article: https://www.npr.org/sections/ed/2016/06/03/480029234/5-doubts-about-data-driven-schools
via Pocket

New York City Moves to Create Accountability for Algorithms — ProPublica

This article discusses the development of an ‘accountability bill’ in the US, which aims to penalise companies whose algorithms appear to discriminate against people based on age, race, religion, gender, sexual orientation or citizenship status.

Whilst it would be a good law in theory, in reality it would be very hard to implement. Algorithmic systems are difficult to break down: they are designed to be increasingly complex, gathering millions of data points and “woven together with hundreds of other algorithms to create algorithmic systems” (Williamson 2017). Added to that, companies are very secretive about how their models work and what parameters make up the design of the algorithms, so that “the rules generated by (algorithms) are compressed and hidden” (Williamson 2017). Discrimination would be very hard to prove as deliberate in these cases, but hopefully the bill will encourage scientifically sound models, validated in appropriate ways, and will eventually make them more transparent to the public.

Full article: https://www.propublica.org/article/new-york-city-moves-to-create-accountability-for-algorithms
via Pocket

Reference
Williamson, B. (2017) Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: The Digital Future of Learning, Policy, and Practice

Instagram explains how it uses AI to choose content for your Explore tab

Not a huge amount of detail was handed over by Instagram in the writing of this article, but I did glean one thing:

Instagram identifies accounts that are similar to one another by adapting a common machine learning method known as “word embedding.” Word embedding systems study the order in which words appear in text to measure how related they are. Instagram uses a similar process to determine how related any two accounts are to one another. If it thinks an account is similar to one you have already liked, it will recommend that account to you.
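To make the analogy concrete, here is a minimal sketch of how a word-embedding method can be applied to accounts rather than words: each user’s interaction history is treated as a ‘sentence’ of account IDs, so accounts that appear in similar contexts end up with similar vectors. The account names, data and parameters below are all invented for illustration; this shows the general technique, not Instagram’s actual system.

```python
# Hypothetical sketch: treat users' interaction histories as "sentences"
# of account IDs and learn embeddings with word2vec (requires gensim >= 4.0).
# Toy data for illustration only - not Instagram's real pipeline or signals.
from gensim.models import Word2Vec

# Each inner list is one made-up user's sequence of accounts they interacted with.
interaction_histories = [
    ["travel_blog", "food_pics", "city_breaks", "food_pics"],
    ["food_pics", "home_baking", "travel_blog"],
    ["city_breaks", "travel_blog", "home_baking"],
    ["gym_life", "running_club", "gym_life"],
    ["running_club", "gym_life", "home_baking"],
]

# Train small embeddings: accounts appearing in similar contexts get similar vectors.
model = Word2Vec(
    sentences=interaction_histories,
    vector_size=16,   # dimensionality of each account vector
    window=2,         # context window within a history
    min_count=1,
    sg=1,             # skip-gram
    epochs=200,
    seed=42,
)

# Accounts most "related" to one you already like, by cosine similarity.
print(model.wv.most_similar("travel_blog", topn=3))
```

On a toy corpus like this the similarities are noisy, but the principle is the same: co-occurrence in interaction histories stands in for co-occurrence of words in text.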

There are no details on what signals are used to identify spam or misinformation, and the algorithm details are not revealed. So whilst algorithms are playing an increasingly important role in producing content and mediating the relationships between us and other internet products, precisely how they do that is not made clear to us:

“Such conclusions have led a number of commentators to argue that we are now entering an era of widespread algorithmic governance, wherein algorithms will play an ever-increasing role in the exercise of power, a means through which to automate the disciplining and controlling of societies and to increase the efficiency of capital accumulation.” (Kitchin 2017)

Full article: https://www.theverge.com/2019/11/25/20977734/instagram-ai-algorithm-explore-tab-machine-learning-method
via Pocket

Reference
Kitchin, R. (2017) Thinking critically about and researching algorithms, Information, Communication & Society, 20:1, 14-29, DOI: 10.1080/1369118X.2016.1154087

Instagram will be better about showing you new pictures first

Instagram used to sort posts in simple reverse chronological order, with the most recent at the top. Then it changed the algorithm to sort posts with family and friends first. This annoyed a lot of users, who found that a family post from a week ago took priority over a post from 30 minutes ago.

I wonder if the algorithm was picking up on the increased number of user clicks and scrolls as frustrated users moved around their feed trying to ensure that they were ‘all caught up’ on the latest posts. Despite user complaints, Instagram has not changed it back, perhaps because the new algorithm forces users to stay on the app for longer.

The algorithm weighs three main factors when creating users’ feeds: interest, recency, and relationship. Interest is how much Instagram thinks you’ll care about a post, with the most interesting posts rising to the top. Recency just means Instagram prioritizes newer posts, and your relationship to the poster is of course also considered.
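As a rough illustration of how such a blend could work (the weights, decay rate, scores and field names below are all invented; Instagram does not publish its actual formula), a simple ranker over those three factors might look like this:

```python
# Illustrative-only feed ranking by interest, recency and relationship.
# All weights and example values are made up for the sketch; this is not
# Instagram's real scoring function.
from dataclasses import dataclass
import math
import time

@dataclass
class Post:
    author: str
    posted_at: float           # Unix timestamp
    predicted_interest: float  # 0..1, e.g. from an engagement-prediction model
    relationship: float        # 0..1, strength of your tie to the author

def score(post: Post, now: float,
          w_interest: float = 0.4,
          w_recency: float = 0.2,
          w_relationship: float = 0.4,
          half_life_hours: float = 12.0) -> float:
    """Weighted blend of the three factors; recency decays exponentially with age."""
    age_hours = (now - post.posted_at) / 3600
    recency = math.exp(-math.log(2) * age_hours / half_life_hours)
    return (w_interest * post.predicted_interest
            + w_recency * recency
            + w_relationship * post.relationship)

now = time.time()
feed = [
    Post("close_friend", now - 7 * 24 * 3600, predicted_interest=0.6, relationship=0.9),
    Post("stranger",     now - 30 * 60,        predicted_interest=0.7, relationship=0.1),
]
for p in sorted(feed, key=lambda p: score(p, now), reverse=True):
    print(p.author, round(score(p, now), 3))
```

With these invented weights, a week-old post from a close friend still outranks a half-hour-old post from an account you barely interact with, which is exactly the behaviour that frustrated users.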

Full article: https://www.theverge.com/2018/3/22/17151976/instagram-chronological-new-photos-algorithm-feed-new-post-button-update
via Pocket