‘…it is worth reflecting on what one means by ‘self learning’ in the context of algorithms. As algorithms such as deep neural nets and random forests become deployed in border controls, in one sense they do self-learn because they are exposed to a corpus of data (for example on past travel) from which they generate clusters of shared attributes. When people say that these algorithms ‘detect patterns’, this is what they mean really – that the algorithms group the data according to the presence or absence of particular features in the data.
Where we do need to be careful with the idea of ‘self learning’, though, is that this is in no sense fully autonomous. The learning involves many other interactions, for example with the humans who select or label the training data from which the algorithms learn, with others who move the threshold of the sensitivity of the algorithm (recalibrating false positives and false negatives at the border), and indeed interactions with other algorithms such as biometric models.’
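To make the quote's point about 'moving the threshold of the sensitivity' a little more concrete, here is a minimal, hypothetical sketch (the data, features and threshold values are all invented for illustration, not drawn from any real border system): a random forest is trained on human-labelled 'past travel' records, and a human-chosen decision threshold then shifts the balance between false positives and false negatives.

```python
# Hypothetical sketch: a classifier learns from human-labelled "past travel"
# records, and a human-chosen threshold recalibrates false positives/negatives.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented training data: 1,000 travel records with 5 numeric features
# (e.g. trip frequency, route rarity), labelled 'flag' / 'no flag' by humans.
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The model "self-learns" clusters of shared attributes from the labelled data...
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

# ...but the operating point is a human decision: lowering the threshold
# flags more travellers (more false positives), while raising it misses
# more genuine matches (more false negatives).
for threshold in (0.3, 0.5, 0.7):
    flagged = scores >= threshold
    false_pos = np.sum(flagged & (y_test == 0))
    false_neg = np.sum(~flagged & (y_test == 1))
    print(f"threshold={threshold}: false positives={false_pos}, false negatives={false_neg}")
```

The point of the sketch is simply that the 'learning' and the operating point are not the same thing: the pattern-finding happens in the fit, but where the line is drawn remains a human (and institutional) choice.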
Machine learning systems are profoundly shaped by the methods of data collection and labelling used in their creation. Yet there has been little research into how training data is actually constructed and used.
Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to a new paper researchers from MIT and Stanford University will present later this month at the Conference on Fairness, Accountability, and Transparency.
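The biases the researchers report become visible only when error rates are broken out by subgroup rather than summed into a single accuracy figure. Below is a minimal sketch of that kind of disaggregated audit; the group labels and predictions are invented for illustration and are not the paper's data.

```python
# Hypothetical sketch: disaggregating error rates by subgroup, the kind of
# audit that can expose skin-type and gender bias hidden by overall accuracy.
from collections import defaultdict

# Invented audit records: (subgroup, true_label, predicted_label)
records = [
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned male",    "male",   "male"),
    ("darker-skinned female",  "female", "male"),   # misclassification
    ("darker-skinned female",  "female", "male"),   # misclassification
    ("darker-skinned female",  "female", "female"),
]

totals, errors = defaultdict(int), defaultdict(int)
for group, truth, predicted in records:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

overall_error = sum(errors.values()) / len(records)
print(f"overall error rate: {overall_error:.0%}")
for group in totals:
    print(f"{group}: error rate {errors[group] / totals[group]:.0%}")
```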
This book came up during the film festival discussions, and it raises some interesting questions around 'moral decisions' being made by 'machines'. Yet even writing these words brings me back to this quote from Hayles:
This is helping me to further deconstruct and question the idea of 'autonomous will', the boundaries of the 'human subject', and the notion that agency lies with either 'human' or 'machine', and to reflect instead on this concept of distributed cognition.
In November 2019, Leon Kowalski found himself in the offices of a large corporation in Los Angeles, answering some odd questions. “You’re in a desert. You look down and you see a tortoise…” When the questioner moved on to ask about his mother, things didn’t end well.