Coursera recommendations (based on false data) – what inclusions and exclusions are apparent?

I am experimenting with inputting false information about myself in Coursera, in order to see the difference in algorithmic recommendations. Here is how I described myself…

False data provided to Coursera

… and here are some recommendations provided after entering the above data…

Recommendations provided by Coursera

The top listed courses are exclusively technology-based and “offered by” Google, and appear to have no direct connection to my listed industry “Health and Medicine”…

While my explorations were very limited here, in some ways this seems fairly consistent with my experience of using certain (but not all) MOOC or educational course/video sites (and even more general “apps”). As soon as you step outside the area of computer science, the range of courses is more limited, despite the sites presenting themselves as general educational sites. Looking to change my “learning plan” options (which changes your profile and recommendations) revealed the “default” or “suggested” text presented before you enter your own profile options:

Setting your Coursera “learning plan”

You can see the results of my profile/“learning plan” alterations here. However, at this stage of deciding my profile options, the “software engineer” who works in “tech” seems to be the “default” starting point. This is perhaps no surprise given that Coursera was set up by Stanford computer scientists; as often seems the way, the developers build something for themselves (ensuring a seamless user experience for their own circumstances) and only later branch out.

One example outside of education here is the online bank Monzo, whose early customer base was ‘95% male, 100% iPhone-owning, and highly concentrated in London’ (Guardian 2019). This description mirrors the co-founder Tom Blomfield, as he himself admits:

‘Our early customer was male, they lived in London, they were 31 years old, they had an iPhone and worked in technology. They were me. I’ve just described myself. Which has huge advantages, right? It’s very easy to know what I want.’ (The Finanser 2017)

While Monzo does claim to have a focus on social inclusion (This is Money 2019), why is this always seemingly secondary to building the app, gaining users (similar to themselves) and getting investors on board? Should social inclusion, whereby apps are designed for all users in a democratic fashion where everyone has a say, not be inherent in the very earliest planning, design and development processes? There may be a place here for considering platform cooperativism, inclusive codesign and participatory design approaches (see Beck 2002; Scholz and Schneider 2016; West-Puckett et al. 2018).

Coming back to education, if Coursera has taken a similar approach to Monzo in designing its platform and building up its catalogue of courses, it is perhaps concerning that those who do not mirror the designers and developers may be left excluded and on the margins.

Conversely, an inclusive codesign approach may have produced different results. As Trebor Scholz (P2P Foundation 2017) explains:

‘The importance of inclusive codesign has been one of the central insights for us. Codesign is the opposite of masculine Silicon Valley “waterfall model of software design,” which means that you build a platform and then reach out to potential users. We follow a more feminine approach to building platforms where the people who are meant to populate the platform are part of building it from the very first day. We also design for outliers: disabled people and other people on the margins who don’t fit into the cookie-cutter notions of software design of Silicon Valley.’
