Covid-19 Narratives and Content Governance in a Diverse Platform Ecosystem
By Brendan Scally, Research Assistant on the IRC-funded Platforming Harm project, and Eugenia Siapera, UCD Centre for Digital Policy Member & Head of School of Information & Communication Studies. May 2022.
Throughout the Covid-19 pandemic there has been a significant deterioration in trust between health authorities and certain sections of society as misinformation has proliferated online. The main policy response has been to stem this erosion of trust by preventing the circulation of Covid-19 misinformation. To do this, platforms have utilised a toolkit of interventions to moderate the spread of such content: labelling misinformation, handing out strikes, downranking borderline content, upranking authoritative sources, demonetisation, and finally the removal of accounts, either temporarily or permanently. These interventions have been effective in containing the bulk of Covid-19-related misinformation but have also triggered unintended side-effects, namely the ‘migration’ of these accounts and their contents to ‘Alternative Tech’ platforms, such as BitChute, Parler, Gab and Telegram (Innes and Innes, 2021). These platforms adopt a minimalist approach to moderation, allowing many types of potentially harmful information to circulate freely. Concerns have arisen about the potential of these sites to act as “spaces of withdrawal and regroupment” where, isolated from the mainstream, communities can become increasingly radicalised (Rauchfleisch & Kaiser, 2021). Given these concerns, it is pertinent to ask: what is the effect of this ‘migration’ and, more broadly, of the existence of a parallel digital public sphere where, it seems, anything goes? This question animates our research project Platforming Harm: Alt Tech Platforms and Harmful Health Narratives, funded under the IRC Coalesce scheme. We report here some initial findings drawing on a digital ethnography of Covid-19 protest groups and channels on Telegram.
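To make the escalating logic of this toolkit concrete, the following is a minimal sketch of how tiered interventions of this kind might fit together. It is purely illustrative: the classifier scores, thresholds, strike counts, and action names are our own assumptions, not any platform’s documented policy engine.

```python
# Illustrative only: a simplified model of the tiered interventions
# described above. Scores, thresholds, and action names are hypothetical.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    misinfo_score: float     # hypothetical classifier output in [0, 1]
    author_strikes: int = 0  # prior strikes against the posting account


def moderate(post: Post) -> list[str]:
    """Map a classifier score onto an escalating set of interventions."""
    actions = []
    if post.misinfo_score >= 0.9:        # clear-cut misinformation
        actions += ["remove_post", "issue_strike"]
        post.author_strikes += 1
        if post.author_strikes >= 3:     # repeat offenders lose the account
            actions.append("suspend_account")
    elif post.misinfo_score >= 0.6:      # 'borderline' content
        actions += ["apply_label", "downrank", "demonetise"]
    else:
        actions.append("no_action")      # benign content is left alone
    return actions


print(moderate(Post("5G causes Covid", misinfo_score=0.95, author_strikes=2)))
# ['remove_post', 'issue_strike', 'suspend_account']
```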
Telegram
Telegram is a messaging and broadcasting platform which claims to offer advanced security features alongside a light-touch approach to content moderation. Such features enable it to respond effectively to the extremists’ dilemma, as such actors attempt to navigate their dual need for publicity and privacy (Rogers, 2020). Throughout the Covid-19 pandemic, Telegram has hosted a variety of content ranging from mild reticence regarding public health measures to outright misinformation and conspiracy. Numerous Irish groups have emerged, ranging in membership from a couple of dozen to several thousand. Topics of discussion run the gamut of health misinformation, casting doubt on various public health interventions and even on the very existence of Covid-19 itself. Vaccination, masking, lockdowns, and testing are frequently decried as ineffective and even dangerous. To evidence these claims, users rely on a combination of emotive language and graphic imagery. In conjunction with this sensationalist content, interviews with controversial ‘experts’ are posted frequently, so that both an emotive and a logical case can be made. As well as acting as a repository of diverging discourses on public health, Telegram also appears to be used as a tool for mobilisation and activism, with many groups explicitly formed to coordinate protests and gatherings. This combination of archival, communicative, and organisational features makes Telegram a particularly versatile tool for its users and poses important questions regarding platform governance in an increasingly diverse platform ecosystem.
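A brief note on method: because many of the groups discussed here are public, their messages can be observed programmatically. The snippet below is a minimal sketch using the open-source Telethon library; the API credentials and channel name are placeholders, and any real collection of this kind requires ethical approval and careful anonymisation.

```python
# Sketch of collecting messages from a public Telegram group via Telethon.
# Credentials and the channel username below are placeholders.
from telethon.sync import TelegramClient

API_ID = 12345                    # placeholder: obtained from my.telegram.org
API_HASH = "0123456789abcdef"     # placeholder
CHANNEL = "example_public_group"  # hypothetical public group username

with TelegramClient("ethnography_session", API_ID, API_HASH) as client:
    for message in client.iter_messages(CHANNEL, limit=200):
        if message.text:  # skip media-only posts
            print(message.date, message.text[:80])
```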
A Conspiratorial Worldview
In our research, we focused on Covid-19 protest groups, conducting both a qualitative digital ethnography and a computer-assisted content analysis using topic modelling. We are still in the process of analysing our data, but we share here some preliminary observations. What we observed among some of these groups on Telegram can be thought of as the construction of a complete worldview, defined by distrust and conspiracy. Content tends to contextualise and explain relatively understandable or relatable concerns in terms of outlandish claims of micro-chipping and apocalyptic visions of population control. This is not entirely unexpected, as previous research has found that belief in one conspiracy theory tends to correlate with belief in multiple such theories (Douglas et al, 2019). Many of the groups formed ostensibly to discuss Covid-19 have devolved into generalised discussions of current events from a conspiratorial perspective. Mainstream institutions, such as the media and universities, are depicted as hotbeds of leftism. Doctors, journalists, and politicians are denounced as elitist gatekeepers of knowledge, condescending to the people. Meanwhile, the media figures promoted by users of Telegram are valorised as renegade actors, speaking defiantly against a corrupt system. This dynamic makes sense when one takes into consideration that belief in conspiracies correlates with low educational attainment, low income, and socio-economic marginalisation (Douglas et al, 2017). Many of the individuals caught up in these discourses appear to be employed in professions where remote working is not possible and will have seen their quality of life diminish. Meanwhile, those insulated from such troubles justify disruptive measures in jargonistic language impenetrable to those without advanced health literacy, and summarily dismiss narratives of scepticism or reticence. Given this unequal sharing of burden, it is understandable that grievances will be voiced in a forum which many users regard as their safe space, isolated from a mainstream where they would likely face ridicule and condemnation for voicing their frustrations.
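For readers unfamiliar with the topic-modelling method mentioned above, here is a minimal sketch of the general technique using scikit-learn’s latent Dirichlet allocation. The messages are toy stand-ins rather than our corpus, and the topic count is an arbitrary illustrative choice.

```python
# Sketch of topic modelling over short messages with LDA (toy data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "vaccines are dangerous and the trials were rushed",
    "masks do nothing, the mandates are about control",
    "protest this saturday against the lockdown",
    "the media never reports the real vaccine injuries",
]

# Bag-of-words representation, dropping common English stopwords
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(messages)

# Fit a small LDA model; n_components (the topic count) is a tuning choice
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the top words characterising each inferred topic
terms = vectorizer.get_feature_names_out()
for idx, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"topic {idx}: {', '.join(top)}")
```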
Some Telegram groups on Covid-19 also host extremist, far-right discourses. These are most visible in the predominantly American groups on Telegram, though there are notable examples in Irish groups as well. Ranging from libertarianism to ethnonationalism, these discourses repeat and promote white, Christian, and aggressively hypermasculine identities. Progressive views, and particularly globalism, are reviled as poisoning society, systematically removing individual rights under the guise of public health, and transferring power to large supranational bodies, such as the World Health Organization. Women, Jews, refugees, LGBT+ people, and other minoritised communities are frequent targets of vitriol. The siloed and self-selected nature of these groups, coupled with almost no moderation, makes them fertile ground for the proliferation of hate speech. Those who may have initially joined to hear alternative perspectives on Covid-19 are then exposed to extremist discourse with minimal counter-speech. While it is easy to sneer at the more outlandish claims on Telegram, this would be misguided. Belief in conspiracies can be indicative of profound confusion and alienation from reality. Alternative platforms such as Telegram provide their users with a space to articulate deep anxieties about the present and future, thereby meeting important psychological needs. Our initial observations of these Telegram channels echo research findings on the ‘incel’ (‘involuntary celibate’) online subculture, which is seen as providing an important source of identity, community and solidarity for its members, often alienated young men (Cottee, 2020).
Platform Policies
Whilst moderation practices such as deplatforming or removing content may be necessary to control the spread of misinformation, the increasingly diverse platform ecosystem raises doubts about the effectiveness of such policies. While research indicates that deplatforming from the main social media platforms diminishes the audience for far-right content, hate speech and misinformation (Rauchfleisch & Kaiser, 2021), the fact remains that such content still exists and circulates in other parts of the platform ecosystem. In this ecosystem, mainstream platform policies such as deplatforming accounts or ‘de-amplifying’ content may only be displacing content rather than actually eradicating it. This is because platform policies on content are concerned with two main issues: defining what counts as problematic content, and controlling its circulation. In doing so, they pay no attention to the contexts in which problematic content is produced and used. Effectively addressing misinformation requires examining how such content is produced, how it circulates, and how it is consumed, used and received. By understanding the framing of Covid-19 on Telegram, we hope that our research will shed light on what feeds into the production of relevant frames and narratives, how various users receive them, and how such content circulates across different platforms. Ultimately, we hope that our research can contribute to identifying policies that not only restore trust between health officials and the public, but also empower users to seek genuinely informative content and truly supportive connection.
Acknowledgement: This work has been supported by the IRC, project ID COALESCE/2021/39.