Michael saved in Pocket: ‘The Californian Ideology’ (Barbrook and Cameron 1995)

Thank you to JB Falisse for introducing me to this resource!

Excerpt

‘At the end of the twentieth century, the long predicted convergence of the media, computing and telecommunications into hypermedia is finally happening. Once again, capitalism’s relentless drive to diversify and intensify the creative powers of human labour is on the verge of qualitatively transforming the way in which we work, play and live together. By integrating different technologies around common protocols, something is being created which is more than the sum of its parts. When the ability to produce and receive unlimited amounts of information in any form is combined with the reach of the global telephone networks, existing forms of work and leisure can be fundamentally transformed. New industries will be born and current stock market favourites will be swept away. At such moments of profound social change, anyone who can offer a simple explanation of what is happening will be listened to with great interest. At this crucial juncture, a loose alliance of writers, hackers, capitalists and artists from the West Coast of the USA have succeeded in defining a heterogeneous orthodoxy for the coming information age: the Californian Ideology.

This new faith has emerged from a bizarre fusion of the cultural bohemianism of San Francisco with the hi-tech industries of Silicon Valley. Promoted in magazines, books, TV programmes, websites, newsgroups and Net conferences, the Californian Ideology promiscuously combines the free-wheeling spirit of the hippies and the entrepreneurial zeal of the yuppies. This amalgamation of opposites has been achieved through a profound faith in the emancipatory potential of the new information technologies. In the digital utopia, everybody will be both hip and rich. Not surprisingly, this optimistic vision of the future has been enthusiastically embraced by computer nerds, slacker students, innovative capitalists, social activists, trendy academics, futurist bureaucrats and opportunistic politicians across the USA. As usual, Europeans have not been slow in copying the latest fad from America. While a recent EU Commission report recommends following the Californian free market model for building the information superhighway, cutting-edge artists and academics eagerly imitate the post human philosophers of the West Coast’s Extropian cult. With no obvious rivals, the triumph of the Californian Ideology appears to be complete.’


Michael is listening to ‘The machine always wins: what drives our addiction to social media’ by The Guardian’s Audio Long Reads on Spotify

This podcast (a recording of a Guardian “long read” article) came to mind when commenting on Charles’s ‘algorithmic play’ artefact.

Despite its fairly dystopian title, it discusses social media, addiction and behaviourist psychology, and seemed quite apt when reflecting on the following from Knox et al. (2020: 41):

‘It may be useful to term these shifts in the conceptual and functional understandings of learning ‘machine behaviourism’: forms of learning that are shaped and conditioned by a combination of complex machine learning technologies with (often radical) behaviourist psychology and behavioural economics.’

Michael commented on Charles’s EDC lifestream – ‘Week 9 – Algorithmic Play Artefact’

Week 9 – Algorithmic Play Artefact

Michael Wolfindale:

Really interesting explorations, Charles, and great screenshots/videos and commentary!

It’s fascinating how you reflect on the way our devices may be “listening in” on us. The numerous reports in the media about this have terrified me somewhat.

It’s also interesting how the algorithm picked up on your MOOC-related and other activity on this course, making assumptions about your interests, but then this all appeared to be mixed with seemingly random results. Yet, as you point out, YouTube is perhaps not even concerned with relevance, but rather with people becoming addicted and driving traffic to the service (so that “engagement” statistics can be sold to advertisers). This reminds me of a podcast/article I came across a while back, which argues that crude behaviourist psychology is deployed and compares it to gambling; this perhaps relates to the ‘machine behaviourism’ discussed in the Knox et al. (2020) paper.

Great artefact, lots of food for thought – thank you!

Michael commented on Val Muscat’s EDC lifestream – ‘Algorithmic Play Artefact’

Algorithmic Play Artefact

Michael Wolfindale:

What a great artefact, Val! So much rich detail around the different algorithmic systems you explored, insightful commentary and really well presented – great visuals!

It’s particularly fascinating to see you reflect on how the models of these algorithmic systems might be applied to education, potentially attempting to “transform” it according to the “Silicon Valley model”.

You also mentioned Coursera, which was the main focus of my artefact. It was perhaps unclear there why courses were being pushed to the top of the list, whether someone was sponsoring the ranking, and/or whether the number of enrolments was a significant factor. Certainly, there was a prevalence of “tech” courses in any case, and of disciplines which perhaps tend to lead to higher earnings. All of this is perhaps concerning – arguably, just because a Netflix movie is less “watched” or makes less money doesn’t make it any less valid, and the same applies to educational courses. However, as you mention, you can imagine courses in an educational setting that “lead to higher earnings”, have higher student numbers and so on being pushed to the top in this model. This is something we can already see to a degree in university rankings, although there it is perhaps framed by a more explicit political agenda. In contrast, the corporate Silicon Valley model (and its commercial/political agendas) often seems to be “hidden” underneath slick user interfaces and the problematic notion of “making the world a better place” for all.

Great artefact and very thought provoking – thank you!

Michael commented on Jiyoung Kwon’s EDC lifestream – ‘This is my algorithmic play with youtube’

This is my algorithmic play with youtube, I hope you all have enjoyed this activity! https://t.co/zvHmW3n2Ue #mscedc

Michael Wolfindale:

Fantastic artefact, Jiyoung, and loved the way you presented it through a scrolling timeline (which reminded me somewhat of the feeds we’re infinitely scrolling through!).

Really interesting how you mentioned ‘expansion with no direction’. It reminded me of a post on the Twitter blog praising a change to their own algorithm which resulted in more “engagement” (more tweets/retweets):
https://blog.twitter.com/official/en_us/a/2016/never-miss-important-tweets-from-people-you-follow.html

Supposedly, this is ‘great for everyone’, where perhaps what they really mean is that it is great commercially, for promoting these figures to their advertisers and so on. Perhaps their pursuit of expansion is motivated predominantly by profit rather than by the interests of those using Twitter, all framed in a Silicon Valley notion of ‘progress’. These commercial motivations are a danger for learning analytics too, which you highlight well.

Great work – thank you for an insightful and well presented artefact!

Michael commented on Jon Jack’s EDC lifestream – ‘Algorithmic Play’

Algorithmic Play

Michael Wolfindale:

Brilliant artefact, Jon! Such insightful thoughts to frame your explorations around – ‘the idea of an algorithm as a cultural presence’ (Beer 2017) and the notion of ‘the algorithm as embodied’. Also, some interesting and, at times, amusing results! Perhaps reflecting the current polarisation of views we so often hear about (e.g. ‘Boris Johnson is a toe rag/genius’ being two of those results)!

I did a very small amount of playing with Google search autocomplete using ‘is edtech…’, and it seemed largely to promote a profit-driven view rather than any particular focus on education. It brings to mind how certain views or biases can be reinforced, something I had looked at in Noble’s (2018) ‘Algorithms of Oppression’.

With regard to Spotify, that’s a fascinating insight into how they quantify music around terms like “speechiness” and “valence”…not terms I’d normally think of! I often find my musical tastes are quite unpredictable, while Spotify tries to play “more of the same”. I wonder whether this is accounted for in the algorithm, and whether the same would apply to one that picked academic papers for us to read?
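As an aside, here is a minimal sketch of how those quantified attributes can be pulled from Spotify’s Web API audio-features endpoint (the access token and track ID below are placeholders, not real values):

```python
import requests

# Placeholders (hypothetical values): a real OAuth access token and Spotify
# track ID are needed for this request to succeed.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
TRACK_ID = "SOME_TRACK_ID"

# Spotify's audio-features endpoint returns a track reduced to a set of numbers.
response = requests.get(
    f"https://api.spotify.com/v1/audio-features/{TRACK_ID}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
features = response.json()

# Among the returned scores are 'speechiness' (presence of spoken words) and
# 'valence' (how musically "positive" the track sounds), both between 0.0 and 1.0.
print(features.get("speechiness"), features.get("valence"))
```

Seeing a track reduced to a handful of scores like this perhaps makes it clearer how “more of the same” ends up being computed.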

Insightful thoughts and detailed explorations presented really clearly – thank you!

Michael commented on Monica Siegenthaler’s EDC lifestream – ‘Algorithm Play’

Algorithm Play

Michael Wolfindale:

Great artefact, Monica, and love the way you’ve laid out your reflections, Q&A and conclusions in Prezi! Very timely too, particularly given current events.

It’s fascinating how you framed your artefact around the notion of media shaping us and our identities, and, in the context of education, how algorithmic systems might lead to a ‘child led curriculum’. It’s also really interesting how the algorithm might be seen in the context of the classroom, and the human-machinic relations occurring there. It reminds me a little of this article, which talks about the role of AI in medicine, and it all made me reflect on where the power and agency lie in all of this. Really thought provoking – great work!

Michael commented on Iryna’s lifestream blog – ‘My algorithmic play artefact’

#mscedc My algorithmic play artefact: https://t.co/n2L6TTwenI

Michael Wolfindale:

Great artefact, Iryna! Really like the way you have made use of ThingLink with the audio commentary, and the map to plot the location-specific results.

Fascinating comment on the ‘goggles’ shaping our behaviour and results, and the slightly unnerving quote from Jonathan Rosenberg about how shaping ‘customer’ behaviour is perhaps entirely intentional (even though Google’s search arguably presents itself as objective or neutral). It brings to mind my own explorations of Coursera where, through the ‘algorithmic play’ activity (and a glance at the privacy policy), my perception of the site was completely changed. Yet how many algorithmic systems do I continue to use unthinkingly on a daily basis?

Really nice artefact, and so clearly presented and thought provoking!

Michael commented on JB’s EDC lifestream – ‘block 3 artefact – my Eurafrican Youtube algorithmic play’

block 3 artefact – my Eurafrican Youtube algorithmic play

Michael Wolfindale:

Fantastic and insightful artefact, JB, and I really like the way you have published it as a screencast presentation with your commentary, allowing us to see your conclusions alongside visuals of your explorations.

As you say, examples of racism and discrimination being generated/reproduced through YouTube and other algorithmic systems are not very difficult to find, but the (perhaps more subtle) assumptions being made about aspects such as identity, shared accounts and so on, which you point out, are such an important issue. During my own explorations of Coursera and other “apps” during this block, I’ve often felt that the way they have been designed creates so many exclusions through these kinds of assumptions, and that a democratic design process including a diverse range of voices is so important yet sadly rare.

I’ve been following the notion of platform cooperativism, which may be one possible approach to address this.

Thank you for bringing up such important points, and really enjoyed your artefact!

Michael commented on Teaching@DigitalCultures (David Yeats) – ‘Algorithmic play artefact : teaching@digital podcast’

Algorithmic play artefact : teaching@digital podcast

Michael Wolfindale:

Fantastic artefact, David!

Really enjoyed both the podcast (some great sounds there!) and the text/screenshot commentary – really insightful. I’d played around with SoundCloud very briefly at the beginning of the block, and I’m glad you’ve provided such detail framed around Jeremy’s questions. It was also great to be able to engage with your findings and discussion through different media!

It’s really interesting how you comment on the potential privileging of a ‘commercial ethic’ in the algorithmic systems of SoundCloud and others, perhaps shaped by other people, and how SoundCloud might lose out commercially to services such as Spotify due to its lack of immediacy. You also mention how this might all influence algorithmic cultures in education, and I picked up on these familiar commercial/competitive models in Coursera too.

> ‘One thing that has stood out for me in the efforts of platforms to personalise our experience by means of recommender algorithms is how this cultural turn influences what is expected of the services of educational institutions’

Absolutely – this is something I felt while exploring the Coursera recommendation algorithms, tweaking my profile and glancing through the privacy policy. The language there was revealing – ‘Content Providers’ (presumably academic institutions?) make ‘Content Offerings’ to ‘learners’. Presumably, activity on the site from myself and other ‘learners’ will have some influence on what is offered in future, or on what is considered commercially viable.

> ‘This may also mark the impact of algorithmic cultures on education. The expectation of immediate adaptation, flexibility and personalisation.’

This is a really interesting point, and again one which I was reflecting on looking at Coursera. Thinking about how the expectations and assumptions of myself and others might have been shaped by Coursera’s algorithmic systems – and how this in turn may have been influenced by commercial algorithmic systems outside of education – does make me feel a little uneasy!

Great work – really enjoyed it!