Disentanglement from my lifestream: wrapping up algorithmic cultures and EDC 2020

‘Entanglement’ (ellen x silverberg, Flickr)

As I disentangle myself from my lifestream feeds, and reflect on the course, I consider how I have perceived and been influenced by the algorithmic systems involved.

Google and Twitter were consistent influences, the latter through new and existing connections and via #mscedc, #AlgorithmsForHer and #ds106, and I saved/favourited (often highly ranked) resources to Pocket, YouTube and SoundCloud (and other feeds).

While I had some awareness of these algorithms, alterations to my perception of the ‘notion of an algorithm’ (Beer 2017: 7) have shaped my behaviour. Believing I “understand” how Google “works”, reading about the Twitter algorithm and reflecting on ranking/ordering have altered my perceptions, and reading about ‘learning as “nudging”’ (Knox et al. 2020: 38) made me think twice before accepting the limiting recommendations presented to me.

Referring to the readings, these algorithmic operations are interwoven with, and cannot be separated from, the social context, in terms of commercial interests involved in their design and production, how they are ‘lived with’ and the way this recursively informs their design (Beer 2017: 4). Furthermore, our identities shape media but media also shapes our identities (Pariser 2011). Since ‘there are people behind big data’ (Williamson 2017: x-xi), I am keen to ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25), uncover ideologies, commercial and political agendas (Williamson 2017: 3) and understand the ‘algorithmic life’ (Amoore and Piotukh 2015) and ‘algorithmic culture’ (Striphas 2015) involved.

During my ‘algorithmic play’ with Coursera, its “transformational” “learning experiences” and self-directed predefined ‘learning plans’ perhaps exemplify Biesta’s (2005) ‘learnification’. Since ‘algorithms are inevitably modelled on visions of the social world’ (Beer 2017: 4), suggesting education needs “transforming” and (implied through Coursera’s dominance of “tech courses”) ‘the solution is in the hands of software developers’ (Williamson 2017: 3) exposes a ‘technological solutionism’ (Morozov 2013) and Californian ideology (Barbrook and Cameron 1995) common to many algorithms entangled in my lifestream. Moreover, these data-intensive practices and interventions, tending towards ‘machine behaviourism’ (Knox et al. 2020), could profoundly shape notions of learning and teaching.

As I consider questions of power with regards to algorithmic systems (Beer 2017: 11) and the possibilities for resistance, educational institutions accept commercial “EdTech solutions” designed to “rescue” them during the coronavirus crisis. This accelerated ‘datafication’ of education, seen in context of wider neoliberal agendas, highlights a growing urgency to critically examine changes to pedagogy, assessment and curriculum (Williamson 2017: 6).

However, issues of authorship, responsibility and agency are complex, for algorithmic systems are works of ‘collective authorship’, ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 8-10). As ‘processes of “datafication” continue to expand and…data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4), I return to the concept of ‘feedback loops’ questioning the ‘boundaries of the autonomous subject’ (Hayles 1999: 2). If human-machinic boundaries are blurred and autonomous will is problematic (ibid.: 288), we might consider algorithmic systems/actions in terms of ‘human-machinic cognitive relations’ (Amoore 2019: 7) or ‘cognitive assemblages’ (Hayles 2017), entangled intra-relations seen in context of sociomaterial assemblages and performative in nature (Barad 2007; Introna 2016; Butler 1990) – an ‘entanglement of agencies’ (Knox 2015).

I close with an audio/visual snippet and a soundtrack to my EDC journey…


My EDC soundtrack:

My EDC soundtrack cover image


View references

Resisting through alternative spaces and practices: some initial thoughts

Image by John Hain (Pixabay)

‘If politics is about struggles for power, then part of the struggle is to name digital technologies as a power relation and create alternative technology and practices to create new spaces for citizens to encounter each other to struggle for equality and justice.’ (Emejulu and McGregor 2016: 13)

Much of this block on algorithmic cultures has involved us examining the algorithmic systems and cultures at play in our lives, ‘unpack[ing] the full socio-technical assemblage’ (Kitchin 2017: 25) and uncovering ideologies, commercial and political agendas (Williamson 2017: 3).

As I explore power and (the notion of) the algorithm (Beer 2017), the complexities of agency with regard to these entangled human-machinic relations (Knox 2015; Amoore 2019; Hayles 1999), and their implications for my lifestream here and for education in general, I increasingly wonder what form resistance might take.

With this in mind, I have gathered some rough notes and links below into a reading list of sorts; this is something I hope to explore in the future.

Collating a reading list…

Protest and resistance

A Guide for Resisting Edtech: the Case against Turnitin (Morris and Stommel 2017)

‘Students protest Zuckerberg-backed digital learning program and ask him: “What gives you this right?”’ (Washington Post 2018)

Why parents and students are protesting an online learning program backed by Mark Zuckerberg and Facebook (Washington Post 2018)

Brooklyn students hold walkout in protest of Facebook-designed online program (New York Post 2018)

Hope (2005) outlines several case studies in which school students have resisted surveillance.

Tanczer et al. (2016) outline various ways in which researchers might resist surveillance, such as using “The Onion Router”, or Tor (although the Tor Project website may be blocked for some).

Noiszy offers a browser extension which aims to mislead algorithmic systems by filling sites of your choosing with “noise”, or “meaningless data”.

The #RIPTwitter hashtag, often used to resist changes to Twitter’s algorithmic systems, which I noticed earlier in the course when considering how changes to algorithms might affect my social media timelines. See DeVito et al. (2017).

‘Integrated & Alone: The Use of Hashtags in Twitter Social Activism’ (Simpson 2018). Examines viral hashtags associated with social movements: #metoo, #takeaknee, #blacklivesmatter.

Alternative models (such as platform cooperativism, participatory democracy/design and inclusive codesign)

‘There are people behind big data – not just data scientists, but software developers and algorithm designers, as well as the political, scientific and economic actors who seek to develop and utilise big data systems for their diverse purposes. And big data is also about the people who constitute it, whose lives are recorded both individually and at massive population scale. Big data, in other words, is simultaneously technical and social.’ (Williamson 2017: x-xi)

What/who might we resist? Surveillance, ‘datafication’ and data-intensive practices that discriminate against the marginalised (Noble 2018; Knox et al. 2020)? ‘Learnification’, neoliberal ideologies and the marketisation of education (Biesta 2005)? “Technological solutionism” (Morozov 2011; 2013)?

Looking a little deeper into the models and processes that those following the “Silicon Valley” model often evangelise about reveals the Agile Manifesto (Beck et al. 2001), written by a group of people (calling themselves ‘The Agile Alliance’ and described as ‘organizational anarchists’) who met at a Utah ski resort in 2001.

David Beer (2017: 4) argues that ‘algorithms are inevitably modelled on visions of the social world, and with outcomes in mind, outcomes influenced by commercial or other interests and agendas.’ Yet can we imagine a future where these agendas – rather than based on a Silicon Valley ethos, or as Barbrook and Cameron (1995) call it, Californian ideology (thank you JB Falisse for introducing me to this) – are rooted in cooperative or participatory principles?

‘Cooperative creativity and participatory democracy should be extended from the virtual world into all areas of life. This time, the new stage of growth must be a new civilisation.’ – Imaginary Futures (Barbrook 2007)

Alternative business models rooted in democracy, such as platform cooperativism (see Scholz and Schneider 2016).

Alternative design processes, rooted in participation and inclusivity, such as participatory design (see Beck 2002) and inclusive codesign:

‘The importance of inclusive codesign has been one of the central insights for us. Codesign is the opposite of masculine Silicon Valley “waterfall model of software design,” which means that you build a platform and then reach out to potential users. We follow a more feminine approach to building platforms where the people who are meant to populate the platform are part of building it from the very first day. We also design for outliers: disabled people and other people on the margins who don’t fit into the cookie-cutter notions of software design of Silicon Valley.’ (P2P Foundation 2017)

What might this look like in the context of education?

Can Code Schools Go Cooperative? (Gregory 2016)

Looking into critical pedagogy (see Freire 2014 [1970]; Freire 2016 [2004]; Stommel 2014), and If bell hooks Made an LMS: Grades, Radical Openness, and Domain of One’s Own (Stommel 2017).

As we see signs that “EdTech” companies stand to gain from, or potentially exploit, the current coronavirus crisis…

…it is ever more urgent to consider the potential significant effects that these “EdTech solutions” may have on educational policy and pedagogies (Williamson 2017: 6). As Williamson (2020) writes:

‘Emergency edtech eventually won’t be needed to help educators and students through the pandemic. But for the edtech industry, education has always been fabricated as a site of crisis and emergency anyway. An ‘education is broken, tech can fix it’ narrative can be traced back decades. The current pandemic is being used as an experimental opportunity for edtech to demonstrate its benefits not just in an emergency, but as a normal mode of education into the future.’ (Williamson 2020)


View references

My EDC soundtrack

‘My EDC soundtrack’ cover artwork

Here is a soundtrack to mark my journey through EDC 2020…

Liner notes

The tracks on my playlist happen to be almost exclusively without lyrics (with the odd exception), and a number of them are relatively long, at over ten minutes. When reading and writing, I often find lyrics distracting (unless I know them well and can “zone out”), and quick changes of track and pace can throw me off.

After becoming entangled with my lifestream feeds, the opening tracks reflect the utopian/dystopian oppositions we discussed in the cybercultures block, and how the distinction made between human and machine is in fact Blurred – in contrast to the dualisms we often came across during the course. Thus, the ‘boundaries of the autonomous subject’ and the concept of ‘autonomous will’ are problematic (Hayles 1999).

Moving on to the community cultures block, exploring the connectivist-informed ds106 inspired my next choice – Everything Connected by Jon Hopkins. Questioning the open/closed false binary, recognising that “open” is not necessarily “inclusive” (Collier and Ross 2017: 8-9), and the way I sometimes felt secluded even in “open” spaces influenced the following few tracks.

Exploring algorithmic cultures in our final block, I listened to Dan Tepfer, who experiments with algorithmic processes in his compositions, such as Fractal Tree. The idea of fractals and feedback loops continued to fascinate me as I reflected again on Hayles (1999: 2). In particular, her discussion of feedback loops (in the context of the cyborg) problematising the boundaries of the autonomous subject prompted me to reflect on the complexities of considering agency in the context of algorithmic systems. Fractals and feedback loops were the inspiration for the next few tracks, as well as for my lifestream-blog header image – the Mandelbrot set, a visualisation created through feedback and iteration.

In collating my final summary posts and beginning to write my assignment, the writing process was a messy one. Often I found myself improvising ideas onto the page, but getting stuck (demonstrated through a “false start” on Miles Davis’ Freddie Freeloader). Yet, inspired by the quote often attributed to Miles Davis – ‘do not fear mistakes – there are none’ – I tried to embrace dead ends as part of the process rather than as “mistakes”. In the end, I found myself needing quiet time to reflect, as demonstrated through John Cage’s 4’33” and a number of tracks centred around themes of silence and peace. Once I had collected my thoughts, I would make use of The Eraser and return to Page One in order to move forward.

Finally, I found myself thinking about the issues of resistance (to surveillance, to datafication, to commercial and political agendas entangled with algorithmic processes in education) – and listening to The Protest by Flying Lotus – but all too suddenly found myself at the end of the lifestream, wishing farewell and disentangling myself from my feeds…

Cover artwork credits


View references

Reflecting back on EDC 2020

As I come to the end of the lifestream blog, I return to questions and aspects I considered early on in the course…

EDC week 3

‘Entanglement’ has been a key theme throughout the course – the entangled ‘boundaries of the autonomous subject’ (Hayles 1999: 2), reconsidering the dualisms and false binaries which have increasingly appeared entangled – human/machine, real/virtual, open/closed, public/private and so on…

Dualisms visual artefact

Rather than assume I could remain an impartial observer, I became entangled in my “object” of research – the ds106 communities and networks…

Miro board

Finally, my ‘algorithmic play’ artefact – rather than examining standalone and discrete “black boxes” of code – instead revealed ‘massive, networked ones with hundreds of hands reaching into them’ (Seaver 2013: 10)…

Photo by Federico Beccari on Unsplash

…multiple codebases entangled with one another, with collective authorship but tangled up with a “Silicon Valley ethos”, commercial interests, neoliberal ideologies and specific notions of “progress”…

‘Algorithmic systems: entanglements of human-machinic relations’

…a messy and complex entanglement of multiple algorithmic systems and human-machinic cognitive relations (Amoore 2019: 7), algorithmic systems with ‘a cultural presence’ (Beer 2017: 11), both modelled on and influenced by ‘visions of the social world’ (ibid.: 4).

Contemplating how we might think about agency in this context encourages me to consider the ‘broader debates about the status of agency as processes of “datafication” continue to expand and as data feeds-back into people’s lives in different ways’ (Kennedy et al. 2015: 4).

This “feeding back” of data into people’s lives in turn brings me back to the concept of ‘feedback loops’ which question the ‘boundaries of the autonomous subject’ (Hayles 1999: 2), and loops me back to the beginning of the course, when I put up my lifestream blog header image (the Mandelbrot set), a visualisation itself created through feedback and iteration…

The Mandelbrot set (created by Wolfgang Beyer with Ultra Fractal, CC BY-SA 3.0).
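For the curious, here is a minimal sketch of that feedback and iteration (in JavaScript, purely illustrative – not how the header image itself was generated): each point c of the image is repeatedly fed back through z → z² + c, the output of one iteration becoming the input of the next, and the iteration count at which a point “escapes” determines its colour.

```javascript
// A minimal sketch of the feedback loop behind the Mandelbrot set:
// each complex point c is repeatedly fed back through z → z² + c.
function mandelbrotEscapeTime(cRe, cIm, maxIterations = 100) {
  let zRe = 0, zIm = 0;
  for (let i = 0; i < maxIterations; i++) {
    // z = z² + c, written out in real/imaginary parts
    const nextRe = zRe * zRe - zIm * zIm + cRe;
    const nextIm = 2 * zRe * zIm + cIm;
    zRe = nextRe;
    zIm = nextIm;
    // once |z| > 2 the point is known to escape; the iteration count
    // at which this happens is what colours the visualisation
    if (zRe * zRe + zIm * zIm > 4) return i;
  }
  return maxIterations; // never escaped: treated as "inside" the set
}

console.log(mandelbrotEscapeTime(0, 0)); // 100 – inside the set
console.log(mandelbrotEscapeTime(1, 1)); // 1 – escapes almost immediately
```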

View references

Week nine and algorithmic systems: unpacking the socio-technical assemblage

Algorithmic systems as ‘massive, networked [boxes] with hundreds of hands reaching into them’ (Seaver 2013: 10)? (Photo by Federico Beccari on Unsplash.)
Having just published my artefact Algorithmic systems: entanglements of human-machinic relations, I have started commenting on others’ artefacts, finding insightful (but sobering) explorations of the algorithmic systems of Facebook, Netflix and others. Facebook’s foray into ‘personalized learning platforms’ and the resistance against it is very relevant as I reflect on the issues of surveillance and automation in education (Williamson 2019).

While focusing on Coursera for my artefact, I observed inclusions/exclusions, tendencies to privilege “tech” courses from Western institutions, and experienced the complexities of researching algorithmic systems (Kitchin 2017). Does Silicon Valley-based Coursera – with its vision of life transformation for all – subscribe to Silicon Valley’s notion of ‘progress’ while blind to its exclusions? Is it another example of an attempt to ‘revolutionise’ education, as Williamson (2017) details, framed by neoliberal and commercial agendas yet presented as an objective and universal ‘learning experience’?

I reflect on the ‘black box’ metaphor, yet ‘algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them’ (Seaver 2013: 10) and we ‘need to examine the work that is done by those modelling and coding [them]’ (Beer 2017: 10). Thus, rather than stripping individual algorithms of their wider social and political context, we should ‘unpack the full socio-technical assemblage’ (Kitchin 2017: 25) and examine the complex ‘human-machinic cognitive relations’ (Amoore 2019: 7), ‘entanglement of agencies’ (Knox 2015) and the implications for education in an era of ‘datafication’ (Knox et al. 2020).


View references

‘Algorithmic play’ artefact – ‘Algorithmic systems: entanglements of human-machinic relations’

‘These algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them, tweaking and tuning, swapping out parts and experimenting with new arrangements…we need to examine the logic that guides the hands…’ (Seaver 2013: 10)

Early on in this block, I played with a range of different algorithms, looking at the recommendations served up based on my existing activity. I created a “recommendations feed” for my lifestream, and also made general notes and screenshots in a number of posts.

Later on, thinking specifically about digital education and tying the activity in with our MOOC exploration, I first looked at my FutureLearn recommendations and, finally, focused on Coursera – where I was able to provide false data for my “profile” and “learning plan” and see the alterations in recommended courses.

Some initial conclusions

Many of the “recommendation engines” I played with, such as SoundCloud, Spotify and Twitter, led me into what I perceived to be a “you loop“, “filter bubble” or “echo chamber”. Google’s autocomplete showed some signs of reproducing existing biases, perhaps amplifying dominant views held by other users. Google may also be making assumptions based on data they hold about me, which may have skewed YouTube search results, although it would be interesting to compare results with other users or in different locations, an approach Kitchin (2017: 21) discusses. I have written a little about ethical concerns and my Google data also.

Moving my focus to Coursera, its recommendations appeared to privilege courses with an information technology/computer science focus, although the range of available courses is clearly a factor too. In any case, the founders’ background in computer science, and education at a Western university, shone through despite attempts to tweak my “profile” and “learning plan” (I have written a little about the “profile” ethical issues also). This appears to be a common theme in “apps” or websites developed by Western companies, and the design processes utilised (whereby products are developed first for the “profile” of the founder(s), and secondarily for others) arguably create exclusions for some (and a strong sense of “this wasn’t designed for me”) and inclusions for others (notably, switching my profile to “software engineer” produced a wealth of “relevant” results which I could tailor to my choosing).

My artefact: Algorithmic systems: entanglements of human-machinic relations

You can see the Coursera ‘algorithmic play’ posts in a lifestream feed, and also through this CodePen (which brings in the same feed but presents, ranks and orders it slightly differently). How might consuming the feeds through different ranking systems affect your perception, and how might the preamble surrounding them have changed your conclusions and what you click on?

View the CodePen

‘Algorithmic systems: entanglements of human-machinic relations’

You can also access the “code behind” the CodePen. However, even though access is granted to this code, it will not necessarily be easy to make sense of it (Kitchin 2017: 20; Seaver 2013). While presented as ‘open’ source, is it really ‘open’, and for whom? What exclusions might this create?

The CodePen (while slightly misleadingly called an “Educational Algorithm”) is a very basic attempt to present the feed of Coursera recommendations in a different fashion for comparison, while provoking some thought around algorithmic systems in general. There is no complex processing, just a little reordering. It also does not store any data (information entered is used within the browser to rank articles, but is not stored in an external database), and the text descriptions are fictional – it is just for visual/demonstration purposes.
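To give a flavour of the kind of client-side reordering involved, here is a minimal sketch (not the actual CodePen code – the feed items, the “interests” input and the scoring rule are all hypothetical, for demonstration only): items are re-ranked in the browser according to information the user enters, and nothing is stored externally.

```javascript
// A minimal, hypothetical sketch of in-browser feed reordering:
// nothing here is stored in an external database.
const feedItems = [
  { title: 'Machine Learning', tags: ['computer science'] },
  { title: 'Modern Art & Ideas', tags: ['arts'] },
  { title: 'Algorithms, Part I', tags: ['computer science'] },
];

// Score each item by how many of the user's declared interests its
// tags match, then sort the feed so higher-scoring items appear first.
function rankFeed(items, interests) {
  return items
    .map(item => ({
      ...item,
      score: item.tags.filter(tag => interests.includes(tag)).length,
    }))
    .sort((a, b) => b.score - a.score);
}

// A user who declares an interest in 'arts' sees the same underlying
// feed, but presented in a different order.
console.log(rankFeed(feedItems, ['arts']).map(item => item.title));
// → ['Modern Art & Ideas', 'Machine Learning', 'Algorithms, Part I']
```

Even in a toy example like this, small choices (which tags count towards the score, how ties are broken) shape what rises to the top – which is partly the point of the artefact.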

Reflections on the CodePen

Inspired by Kitchin (2017: 23), I have written a few words reflecting on my experiences, while acknowledging the limitations of such a subjective approach.

The CodePen was adapted (“forked”) from an existing CodePen, which provided the majority of the base, upon which (with my very limited skills!) I could make slight tweaks to add a basic feed of the Coursera recommendations – by pasting in another bit of code (which turned out to be old and broken) and then ultimately another bit of example code (from a Google-led project). It is very much a case of different bits of code “hacked” together, full of little “bugs” and “glitches” and not particularly well designed or written! I was keen to strictly limit the time spent on it, although I know much time could be spent tidying and refining it.

Presumably, similar time constraints (albeit with more resources and testing) affect the development of, say, Facebook’s or Google’s algorithms, and lead to mistakes. After all, Facebook’s internal motto used to be ‘move fast and break things’, although this race to create a “minimum viable product” and “disrupt” (regardless of the consequences) is increasingly criticised.

In any case, this CodePen is like a snapshot of an experimental “work in progress” (which others are welcome to “fork”, use and adapt) and brings to mind the messy, ontogenetic, emergent, and non-static nature of algorithmic systems (Kitchin 2017: 20-21).

Ultimately, my aim is to raise some questions about some details of this process and, since most of the code was “stuck together” from bits of free and ‘open’ source code, how the culture of the individuals/teams is significant too. As Seaver (2013: 10) puts it…

‘…when our object of interest is the algorithmic system, “cultural” details are technical details — the tendencies of an engineering team are as significant as the tendencies of a sorting algorithm…’

…and, given that a large portion of the code came from a Google-led project (and Twitter too), how might the tendencies of those teams have created inclusions/exclusions? Furthermore, as the focus of my artefact is Coursera, whose founders both have experience working at Stanford University and Google, how might the tendencies there have guided Coursera and, subsequently, my CodePen?

Finally, given that Coursera presents itself as a universal educational space, where its vision is ‘a world where anyone, anywhere can transform their life by accessing the world’s best learning experience’, what are the implications of this for digital education? In my initial explorations, my perceptions are that computer science and information technology disciplines from Western universities are prioritised by Coursera, in general and through their algorithmic systems. However, further research is needed to, as Kitchin (2017: 25) puts it, ‘unpack the full socio-technical assemblage’.


View references

Further play with Google autocomplete

There appear to be some fairly binary positions on technology being “good” or “bad”, and dominant ideas of ‘success’, presented here… could this be mainly influenced by what others have searched for? Or by a prevalence of articles supporting these positions?

‘Is technology…’ Google autocomplete
‘How to succeed…’ Google autocomplete

Week eight: critically playing with algorithms

This week, while commenting on micro-ethnographies, I began the ‘algorithmic play’ activity, adding new lifestream feeds including Vimeo and Deezer and, inspired by ‘Show and Tell: Algorithmic Culture’ (Sandvig 2014) and Noble’s (2018) Algorithms of Oppression, played with Google search algorithms including their autocomplete…

‘Is edtech…’ Google autocomplete

I also discovered (some of) what Google “knows” about me, collected ‘algorithmic play’ notes/screenshots and recorded algorithm ‘recommendations’ from…

I reflect on questions from Amoore (2019: 7)…

‘Do algorithms compute beyond the threshold of human perceptibility and consciousness? Can ‘cognizing’ and ‘learning’ digital devices reflect or engage the durational experience of time? Do digital forms of cognition radically transform workings of the human brain and what humans can perceive or decide? How do algorithms act upon other algorithms, and how might we understand their recursive learning from each other? What kind of sociality or associative life emerges from the human-machinic cognitive relations that we see with association rules and analytics?’

…and, as I explore these ‘human-machinic cognitive relations’, look beyond the polished “app” user interfaces and reflect on how algorithms (despite how they are presented) are far from objective or neutral (Kitchin 2017: 18). I turn my attention to investigating discrimination and bias (Noble 2018 and #AlgorithmsForHer)…

I also investigate the notion of ‘data colonialism’ (Knox 2016: 14), rethink the relation between algorithms and power (Beer 2017) and look to the future of what this might all mean in an educational context (Knox et al. 2020; Williamson 2017).


View references

What might be the effect of ordering and ranking our feeds or timelines?

While browsing through my Facebook and Twitter feeds today, and reflecting on the relation between power and algorithms (Beer 2017), I saw that a friend had posted on Facebook a Guardian article asking ‘Why don’t we treat the climate crisis with the same urgency as coronavirus?’

Immediately below, a post from another friend (from different circles) was displayed, linking to an article from Business Insider about the importance of hand washing and the coronavirus.

It’s difficult to tell why the algorithm placed these posts next to each other – whether there was a connection between the two, or whether both were deemed independently likely to elicit some kind of ‘positive reaction’ (or reaction of another sort) from me. However, it did make me reflect on how, by being placed in close proximity, these posts initially felt (at least to me) “pitched” against, or at odds with, one another, eliciting reactions that I might not otherwise have felt had I seen the posts independently.

Twitter’s algorithmic restructuring of timelines

Turning my attention to Twitter, this kind of algorithmic restructuring of timelines, at times using “deep learning”, initially caused controversy and a #RIPTwitter “campaign”.

@mjahr‎ talked about these early efforts on the Twitter blog…

‘…when you open Twitter after being away for a while, the Tweets you’re most likely to care about will appear at the top of your timeline…’

(@mjahr‎ on Twitter blog, 2016)


…and went on to praise their “success”…

‘We’ve already seen that people who use this new feature tend to Retweet and Tweet more, creating more live commentary and conversations, which is great for everyone.’

(@mjahr‎ on Twitter blog, 2016)


Furthermore, this HootSuite blog post somewhat downplayed criticism of the Twitter algorithm, stating that ‘the algorithm drove more engagement from users’.

Yet what does ‘engagement’ mean, how do they know what we ‘care about’, and how can they possibly prove that it is ‘great for everyone’? Driving traffic and increasing usage may be ‘great’ for both Twitter and HootSuite – companies who attempt to derive profit in one way or another from such usage – however, this kind of biased, uncritical language is perhaps all too common in some circles (such as those profiting through advertising or the sale of associated products).

Much has been written about the questions and issues that algorithms such as these raise, not least the relationship between political rhetoric and social media; for example, by Oliver (2020). I continue to read, explore and critically reflect, particularly pondering over what this might mean in an educational context (such as our own use of Twitter during this course)…

Week seven: Researching communities…interactions between entities or entangled intra-relations?

As we conclude our block on community cultures, and I post my micro-ethnography artefact Entangled Communities, many questions/issues have been raised.

Inspired by David Yeats’ artefact, grappling with a community apparently “present” but “hidden”, I pondered how/whether this might be tracked, and issues of surveillance that link to our next block on algorithmic cultures. His artefact also asks ‘what is community?’, and I wondered how we might define it…

  • a ‘creative “gathering”’ (Bayne 2015b: 456) around a ‘shared domain of interest’ (Wenger 1998; Lave and Wenger 1991)?
  • a feeling ‘produced by more-than-human assemblages’ (Hickey-Moody and Willcox 2019: 2)?

While researching, should we focus on a network of ‘connections between entities’ (Siemens 2005) or on agential relations and ‘intra-actions’ where agency is co-constituted (Barad 2007; Hickey-Moody and Willcox 2019: 4-5)?

As I constructed/traversed a network of connections (Downes 2017) in the connectivist-informed ds106, “I” and “my study” (including my field notes) became “entangled” in the course/community I was studying, and my artefact itself appeared increasingly like a tangled network map of connections. I noted the course/community boundaries blurring and the traditional MOOC form questioned.

Entangled Communities

Questioning my research methods, I explored various approaches including the speculative method (Ross 2017)

…rather than an “observer” collecting data about something “out there”, are researchers entangled with the “object” of research where data generated/collected ‘is co-created by the fieldwork assemblage’ (Hickey-Moody and Willcox 2019: 5)?

Finally, as I listened to ds106 radio: is sound a ‘vibrational event’, and listening an embodied experience (Ceraso 2018)?

On that note, I’m experimenting with a short audio snippet to conclude:


View references