The tweaks to my Coursera profile and learning plan have had a fairly limited effect so far on my Coursera recommendations.
I notice my recently viewed courses have an impact, so will look to alter this:
First, I am switching my profile and learning plan to those of a nurse in the healthcare industry:
…courses viewed by others identifying themselves as nurses now appear…
Notably, the first is “nursing informatics” – could this be another example of information technology dominating results?
I view some courses related to ‘Everyday Parenting’, ‘Mindfulness’, ‘Well-Being’, ‘Buddhism and Modern Psychology’ and ‘Social Psychology’.
Below are some more computer science/information technology degree recommendations…
There are some courses displayed on ‘Personal Development’. Many are not particularly related to the areas I specified; however, it is a rare opportunity to see recommended courses that are not computer science or information technology.
My explorations again seem to show a privileging of computer science subjects – again, not surprising given the background of the founders.
However, this limited focus does seem slightly at odds with Coursera’s own slogan:
‘We envision a world where anyone, anywhere can transform their life by accessing the world’s best learning experience.’
(About Coursera)
Hi Michael, that’s an interesting artefact, as you have chosen an online platform for digital learning. I enjoyed reading your results and interpretation, in which you dismantled the algorithm’s weaknesses from an educational perspective. There are many findings in your artefact(s) worth discussing, but two issues stood out to me.
First, the limitation of the algorithm due to external circumstances. Even though you changed your profile and learning plan and started to search for areas other than tech, programming, etc., the recommendations still showed you tech-related content. This is similar to my experience with Netflix constantly showing me action films and big blockbuster Hollywood movies, even though I had no such movies in my list and had not watched any recently. In your case it is, as you possibly correctly stated, the background of the Coursera platform’s developers, and in my play it was the preferences of Mexican viewers, which might have prevented the algorithm from showing specific content based on personal information, searches and interests.
Directly connected to that is your comparison with Coursera’s key slogan, which offers its members the possibility of transformation while, at the same time, the platform responds to a personal preference with only a limited selection of the whole Coursera course portfolio. This is not a general problem if we assume that the typical learner/user already knows what she/he wants to learn and has entered this in the learning plan, as they then receive a selection of appropriate courses. But what about those who do not fit into this construction of a modern-day autonomous learner, as it is well described in “Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies” by Jeremy Knox, Ben Williamson & Sian Bayne?
They are not recognised and are therefore left out of this algorithm design. So, basically, the algorithm itself is hindering transformation.
Thank you very much for your comments, Thomas! Great points, and the design of the Coursera platform and its algorithmic systems does have a tendency, as you say, to privilege the notion of the ‘self-directing learner’, as discussed by Biesta (2005) and Knox et al. (2020). This all perhaps privileges specific ideologies, such as Silicon Valley cultures and neoliberal and commercial agendas, which seems similar to your experiences with Netflix, where certain content appeared to be privileged over others despite there apparently being no “personalised” reason to do so.
It’s been really interesting to see in this activity which aspects of Coursera chime with “apps” such as Netflix, and to discuss what the hidden ideologies and motivations might be – thank you again for your comments!