I started the week replying to a comment Matt left on my micro-ethnographic artefact, in which he said he often finds thinking about the implications of Learning Analytics (LA) quite depressing.
I replied, somewhat half-heartedly, that there are some beneficial uses of analytics that aren't reductionist, and gave a few examples. Given how few and far between such examples are in the LA landscape, though, Matt's overall impression is valid.
Following this, I posted an example of algorithmic play that involved looking at the personalisation of my Google ads. Since I've largely stepped away from Google as a search engine and browser, moving instead towards privacy-focused services like DuckDuckGo, the personalisation was only about 50% accurate and may seem fairly innocuous. However, when that is scaled up for open data commerce, we suddenly see scores of copied profiles in systems over which we have neither oversight nor power. They're profiles that resemble you to some extent: DOB, location, race, hobbies, and even far more private information.
On March 9, I first started to notice more stories about the use of mobile data to track the whereabouts of people with COVID-19. It reminded me of the very recent protests by US students who didn't want their location tracked via the university Wi-Fi. That now seems banal compared to the sudden explosion of extremely invasive tech in education, from video-conferencing software to automated grading tools, exploiting the emergency situation.
It seems this could be the catalyst for far-reaching surveillance-model education.
The article “Is Learning Analytics Synonymous with Learning Surveillance, or Something Completely Different?” looks at LA from several angles and asks some challenging questions. Of course, LA relies upon the collection of big data and the use of algorithms to manipulate and present that data. The big question for me coming out of it is: if we as educators see a problem with this and want to stop it, what happens when students turn around and say, “No, I’m paying all this money for my education; I expect you to use every possible means to ensure I pass. That includes using the data you collect to provide feedback on my progress and what I need to do to improve”?
If this kind of mentality is prevalent, then what choice do institutions have but to respond in order to market themselves?
Two posts on March 10 reference Kin Lane, husband of Audrey Watters, and his experiences of the changing online algorithmic landscape. One is a comment on the actual behaviour of Google: it’s no longer about high-quality content, it’s all about who you know. The other is a piece of speculative fiction that seems eerily close to reality: auto-correct AI changing the meaning of your text messages and emails in order to bring them in line with company policy.
The rest of the week was spent getting my artefact up and running, commenting on my classmates’ artefacts, posting a few tweets about an image search for MSCEDC, and reflecting on the computer modelling that predicted the COVID-19 pandemic months ago.