Week 8 - Algorithms for everyone

Pavstud

 

It would have been very handy to design an algorithm to filter my most inspired posts on this blog from the more run-of-the-mill ones. On the other hand, this could prove futile in a blog aimed at documenting my train of thought throughout Education and Digital Cultures. Algorithms are as much about filtering out ‘undesired’ data as they are about whitelisting user choices.

Two main arguments ran in tandem through this week’s posts: the right of an informed public to access the algorithms behind some of the most popular social media platforms, and the question of whether algorithms in education help or destroy notions of learning. Since algorithms are ‘adjudicating more and more consequential decisions in our lives’ (Diakopoulos, cited in Kitchin, 2017) and are essentially capitalist in nature, one has to question whom they are serving. Their chimaeric nature, made up of many networked ‘hands’ (Seaver, cited in Kitchin, 2017), is perhaps why studying their effects is not straightforward. Yet algorithms feed on human ingenuity, and on our lack of knowledge about them, and so need to be ethically managed.

Algorithms and AIEd also form a contested field of education because of their return to a behaviourist approach to learning. This might not be the Pavlovian route, where learners are given instant gratification, but rather a consumerist perspective that monitors learning in order to collect data and tailor ‘effective’ learning solutions through positive behaviour. Reinforcement learning and nudging are perhaps two of the most effective ways to shape learning. Not only are technologies shaping learning; more often than not they are shaping humans to act like machines, stripping them of their autonomy by denying them access to what is being filtered out.

The ‘learner’ is now an irrational and emotional subject whose behaviours and actions are understood to be both machine-readable by learning algorithms and modifiable by digital hypernudge platforms. (Knox et al., 2020)

References:

Kitchin, R. (2017) Thinking critically about and researching algorithms, Information, Communication & Society, 20:1, 14-29, DOI: 10.1080/1369118X.2016.1154087

Knox, J., Williamson, B., & Bayne, S. (2020) Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies, Learning, Media and Technology, 45:1, 31-45, DOI: 10.1080/17439884.2019.1623251

Do algorithms make us behave?

 

This video explains the concept of reinforcement learning in machines and gives some very good examples, showing how the algorithm behind reinforcement learning continuously feeds particular actions (responses) into the machine engine (in this case a game) and compares the results. When a positive result is achieved and a reward is given, the set of steps leading to that reward is saved. This goes on in order to accrue as many positive behaviours as possible. When the concept of reward is less straightforward, because the steps needed to reach a reward are much more complex, reward shaping, that is adding extra rewards for intermediate scenarios, is possible (although time-consuming). Training without rewards is very hard in reinforcement learning, a technique which closely echoes the behavioural learning patterns of early educational systems.
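The reward loop described above can be sketched as a minimal Q-learning update, a standard reinforcement-learning algorithm. The tiny corridor ‘game’, the parameter values and all names below are illustrative assumptions rather than anything taken from the video:

```python
import random

# Minimal Q-learning sketch: an agent learns to walk along a
# 5-cell corridor towards a reward in the last cell.
N_STATES = 5            # cells 0..4; the reward sits in cell 4
ACTIONS = [1, -1]       # step right or left
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# The Q-table holds the learned value of each (state, action) pair.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Apply an action; the reward is 1.0 only on reaching the goal cell."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

random.seed(0)
for _ in range(200):                      # training episodes
    state = 0
    while state != N_STATES - 1:
        # Epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore a random one.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward = step(state, action)
        # The 'saving of the steps that led to a reward': pull the value of
        # this state-action pair towards reward + discounted future value.
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = nxt

# The greedy policy learned for each non-goal cell: move right (+1).
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)
```

The Q-table plays the role of the ‘saved set of steps’: each update nudges the value of a state-action pair towards the reward it eventually led to, so the behaviours that ended in reward are the ones reinforced.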

The idea of algorithmic systems that pepper student learning with occasions for enjoying a reward (as in the case of easy quizzes in MOOCs) may act as the carrot dangled before the donkey: it promotes the self-directed learner while providing an occasion for ‘datafication’ and the collection of data (Williamson, 2017). In this case, student behaviour becomes a very ‘valuable commodity’ (Knox et al., 2020) in providing the ‘action to the state’, as explained in the video, because it can help predict outcomes. Ironically, students are then providing their behaviour patterns for free to the operators of CMSs, VLEs and MOOCs.

not only is data positioned before the desires of the learner as the authoritative source for educational action, but the role of the learner itself is also recast as the product of consumerist analytic technologies. (Knox et al., 2020)

Educational systems that study and collect data in order to provide ‘the best possible learning experience’ and ‘limit’ the online learner to a simple reward system exemplify Biesta’s concept of ‘learnification’, whereby the system is merely interested in producing successful students, and ever growing numbers of them. This kind of ‘solutionism’ is a far cry from the learning process envisaged by Biesta (2012). The social dimension of education is absent from the start, and learning is reduced to playing a basic video game (like Pong) in which the reward, rather than the playing experience, is what ultimately counts, reducing the learner to the idea of a ‘product’ (Rushkoff, cited in Knox et al., 2020). This is a view deeply enshrined in radical behaviourism, built upon the binary determinism of computer systems that break down responses to knowledge into a system of ‘ons’ and ‘offs’ and that will eventually (thanks also to developments in quantum computing) challenge or even outperform the best human minds, as seen below.

References:

Biesta, G. (2012). Giving teaching back to education: Responding to the disappearance of the teacher. Phenomenology & Practice, 6(2), pp. 35-49.

Knox, J., Williamson, B., & Bayne, S. (2020) Machine behaviourism: future visions of ‘learnification’ and ‘datafication’ across humans and digital technologies, Learning, Media and Technology, 45:1, 31-45, DOI: 10.1080/17439884.2019.1623251

Williamson, B. (2017). Introduction: Learning machines, digital data and the future of education (chapter 1). In Big Data and Education: the digital future of learning, policy, and practice. Sage.

Week 4 - Reflections on MOOCs

The development of MOOCs has been criticised for a number of reasons, from the lack of a ‘real instructor’ to the considerable number of drop-outs. Perhaps the lack of human contact, or the possibility of taking a course far from an institution, does affect the way education is perceived, as was discussed in a number of articles posted this week. Is ‘openness’, as observed by Knox (2013), really a liberatory concept, or do online learners still feel the need for an institution behind their learning?

In the course of this week, I came to wonder how far MOOCs can be seen as another form of ‘cultural commodity’ (Lister, 2009). Most MOOCs make use of videos, audio and other media, which could fit well into the model described by Lister, whereby production is focused on the creation of services for profit. This might determine who studies for ‘free’, for personal satisfaction, and who pays for a certificate in order to improve their chances of a better career.

I also tried to determine what economic models MOOCs follow. Do they encourage ‘free’ learning to advertise high numbers of those taking a particular course, or is there some other form of discreet advertising going on? O’Reilly (cited in Lister, 2009) predicted that development in Web 2.0 would be driven not by manufacturing better hardware but by an increase in the provision of paid data, or data that can be acquired according to need. We might already be paying for that free course by leaving data trails whenever we access the course platform, and other companies may already be paying for that data to enhance their own online courses.

And now to the micro-ethnography…

 

References:

Lister, M. et al. (2009) “Chapter 3. Networks, users and economics”. In New media: a critical introduction, pp. 163-236. London: Routledge.

Knox, J. (2013). Five critiques of the open educational resources movement. Teaching in Higher Education, 18(8), pp. 821-823.

What’s Wrong With MOOCs, and Why Aren’t They Changing the Game in Education?

This article appeared a while ago, and the structure of MOOCs has advanced considerably since, but it does question at one point the number of drop-outs in MOOCs. One of the reasons seems to be the lack of a ‘live instructor’.

It does point out, though, that the ‘economics are on the side of MOOCs’, and the potential of having companies and MOOCs working together to encourage students to enter the workforce. This is not something that can happen all at once.

This shift will not occur anytime soon, however, because the social pressure to go to college and get a degree still exists. Such pressure results in the ongoing issue of student debt in our country. When this pressure no longer exists, and when economics play a larger role in determining how students receive their education, it is at that point when MOOCs could potentially replace higher education as we know it. (Harman Singh)

Singh’s view does remind us of one of the major concerns around MOOCs and TEL. Knox (2020) notes that education is fertile ground for the culture supporting Silicon Valley enthusiasts and larger companies in search of profit. The interest behind MOOCs, therefore, can hardly be an unadulterated one.

As Lister (2009) points out, ‘the new networked media has been influenced by commercial interests’, which have shaped the way we live and the ways communities are structured. Singh contends that modern online educational communities do not always have all the elements necessary for the educational experience to be complete.

 


References:

Singh, H. (2014). What’s Wrong With MOOCs, and Why Aren’t They Changing the Game in Education? Available at: https://www.wired.com/insights/2014/08/whats-wrong-moocs-arent-changing-game-education/ (Accessed 8th February 2020).

Knox, J. (2020) Introduction to Community Cultures.

Lister, M. et al. (2009) “Chapter 3. Networks, users and economics”. In New media: a critical introduction, pp. 163-236. London: Routledge.

#mscedc. Can technology solve all of education’s problems? https://t.co/rYNVmc1P6N

 

This gives voice to a common occurrence when trying to bring technology and education together. The idea that technology can ‘fix’ or ‘enhance’ education ‘by the operations of an externally applied technology “solution”’ (Bayne, 2015) is perhaps one of the most frustrating points of view that both educators and administrators of education risk falling into, and one which tends to separate technology from the social practice of learning, as explained by Bayne (2015).

This instrumentalist view tends to reduce the application of technology to a ‘fashion’ or ‘trend’, encouraging some of those who manage the education process to invest blindly and to push educators to use technology for technology’s sake.

The recording does mention clichéd ideas of technology use: the notion that technology is a ‘one size fits all’ solution and the universalist view that ‘all humans are essentially the same’ (Knox, 2015). This tends to clash with the more modern AI-driven idea that complex data systems can collect information and provide more tailor-made solutions for individuals. While good, charismatic teachers are every department’s dream, this does not mean that technology is not required.


References:

Bayne, S. (2015). What’s the matter with ‘technology-enhanced learning’? Learning, Media and Technology, 40(1), pp. 5-20, DOI: 10.1080/17439884.2014.915851

Knox, J. (2015). Critical Education and Digital Cultures. In Encyclopedia of Educational Philosophy and Theory. Springer, pp. 1-6. Available at: https://doi.org/10.1007/978-981-287-532-7_124-1.

Sophgalvin (2019) Digital Media and Education – Why technology can’t fix education. 12th May 2019. Available at: https://soundcloud.com/user-948349027/digital-media-and-education-why-technology-cant-fix-education.