The tweaks to my Coursera profile and learning plan have had a fairly limited effect so far on my Coursera recommendations.
I notice my recently viewed courses have an impact, so I will look to alter them:
First, I am switching my profile and learning plan to a nurse in the healthcare industry:
…courses viewed by others identifying themselves as nurses now appear…
Notably, the first is “nursing informatics” – could this be another example of information technology dominating results?
I view some courses related to ‘Everyday Parenting’, ‘Mindfulness’, ‘Well-Being’, ‘Buddhism and Modern Psychology’ and ‘Social Psychology’.
Below are some more computer science/information technology degree recommendations…
There are some courses displayed on ‘Personal Development’. Many are not particularly related to the areas I specified; however, it is a rare opportunity to see recommended courses that are not computer science or information technology.
My explorations once more seem to show computer science subjects being privileged – not surprising, given the founders’ backgrounds.
However, this limited focus does seem slightly at odds with Coursera’s own slogan:
‘We envision a world where anyone, anywhere can transform their life by accessing the world’s best learning experience.’
As previously discussed, it appears quite common for those from a Western, university-educated computer science background to build something for themselves, raise funds through investment, and then market it as a “universal” solution that is “best” for all.
Are these examples of the algorithm pushing “similar” content and perhaps also changing my perception of what I should listen to? Have I been in a “you loop”? Have my recommendations been influenced by others listening to them?
This video was (algorithmically) added by Vimeo to my feed.
‘Triptych’ is a non-commercial short three-act film that reflects my attempt to visualise the laws and theories of astrophysics in an artistic way. A few thoughts about modern art are also included in the narration.
I changed my Coursera “learning plan” to indicate that I am a Teacher/Tutor/Instructor in the Education industry, to compare the results with my previous exploration of Coursera.
The results are more varied (and not exclusively focused on software development or the “tech” industry); however, there are still various programming, data/computer science and business options presented, despite my expressing no preference for this kind of industry:
I am experimenting with inputting false information about myself in Coursera, in order to see the difference in algorithmic recommendations. Here is how I described myself…
… and here are some recommendations provided after entering the above data…
The top listed courses are exclusively technology-based and “offered by” Google, and appear to have no direct connection to my listed industry “Health and Medicine”…
While my explorations were very limited here, in some ways this seems fairly consistent with my experiences of using certain (but not all) MOOC or educational course/video sites (and even more general “apps”). As soon as you step outside of the area of computer science, the range of courses is more limited, despite the sites themselves being presented as general educational sites. Looking to change my “learning plan” options (which alter your profile and recommendations) revealed the “default” or “suggested” text, presented before you enter your own profile options:
One example outside of education here is the online bank Monzo, whose early customer base was ‘95% male, 100% iPhone-owning, and highly concentrated in London’ (Guardian 2019). This description mirrors the co-founder Tom Blomfield, as he himself admits:
‘Our early customer was male, they lived in London, they were 31 years old, they had an iPhone and worked in technology. They were me. I’ve just described myself. Which has huge advantages, right? It’s very easy to know what I want.’ (The Finanser 2017)
Coming back to education, if Coursera have taken a similar approach to Monzo in designing their platform and building up their catalogue of courses, it is perhaps concerning that those who do not mirror the designers and developers may be left excluded and on the margins.
‘The importance of inclusive codesign has been one of the central insights for us. Codesign is the opposite of masculine Silicon Valley “waterfall model of software design,” which means that you build a platform and then reach out to potential users. We follow a more feminine approach to building platforms where the people who are meant to populate the platform are part of building it from the very first day. We also design for outliers: disabled people and other people on the margins who don’t fit into the cookie-cutter notions of software design of Silicon Valley.’
Trebor Scholz (P2P Foundation 2017)
Signing up for these MOOCs appears to have affected these recommendations fairly significantly, given there are recommended courses in the areas of research, security and programming. However, there appear to be few (if any) courses directly touching on the areas of anthropology and music (which my enrolled courses cover); this may be due to a lack of currently available courses, although there may be other reasons.
How have other people been involved in shaping results?
It is not clear (at least from this page) how they make the recommendation decisions, but there may well be algorithmic ranking based on sponsorship, course popularity or “staff picks”. Therefore, it’s possible that other students’ enrolments or FutureLearn staff decisions may alter my recommendations.
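To make this speculation concrete, here is a minimal sketch of how a recommender might blend such signals. To be clear, this does not reflect FutureLearn’s (or Coursera’s) actual algorithm; the signal names, weights and course data below are entirely hypothetical, for illustration only:

```python
# Hypothetical blended ranking: popularity, sponsorship and editorial
# ("staff pick") signals combined into a single score. All names and
# weights are assumptions, not any platform's real implementation.

def rank_courses(courses, w_popularity=0.5, w_sponsored=0.3, w_staff_pick=0.2):
    """Order courses by a weighted mix of enrolment numbers,
    sponsorship and staff-pick flags (highest score first)."""
    def score(course):
        return (w_popularity * course["enrolments"] / 10_000   # normalise enrolments
                + w_sponsored * course["sponsored"]            # 1 if sponsored, else 0
                + w_staff_pick * course["staff_pick"])         # 1 if a staff pick, else 0
    return sorted(courses, key=score, reverse=True)

# Hypothetical catalogue entries:
courses = [
    {"title": "Python Programming", "enrolments": 50_000, "sponsored": 1, "staff_pick": 0},
    {"title": "Music and Society",  "enrolments": 2_000,  "sponsored": 0, "staff_pick": 1},
    {"title": "Anthropology 101",   "enrolments": 8_000,  "sponsored": 0, "staff_pick": 0},
]
ranking = [c["title"] for c in rank_courses(courses)]
```

Even this toy version shows the dynamic I worry about below: once enrolment numbers carry any significant weight, an already-popular programming course outranks smaller subjects regardless of how relevant they are to the individual learner.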
Do results feel personal or limiting? Is this optimisation, or a ‘you loop’?
I don’t think I would normally make use of the explicitly labelled recommendations, however I often make use of the search function which may include similar algorithmic ordering and ranking. The choices here seem fairly limiting, almost persuading me that – in order to be an “expert” – I should study them. There seems to be the assumption that I would choose to study a similar course to one I have studied before, even though in reality I would probably want to look at something completely different.
What might be the implications?
My concern, looking at both the general catalogue of courses and the recommendations (albeit very briefly), is that certain subjects appear privileged over others (there are a great deal of courses on computer programming, for instance). As mentioned above, this may be down to many other factors (such as course availability), however it would be interesting to see how course enrolment numbers impact upon the ranking. I personally would find this a little disconcerting – I wouldn’t want a course that simply has high enrolment numbers to be privileged in my recommendations. As elsewhere in education, just because a course may have lower numbers or generate less money, it doesn’t mean it is any less important.