Too Far Right? The Role of Alt-Tech and Mainstream Social Media Platforms in Far-Right Political Violence

 Dr Sarah Anne Dunne

There is little doubting the sense of déjà vu that came with the recent riots across England and Northern Ireland and the riots that took place in Dublin city centre last November: from the unquestionable horror of the attacks that targeted children and women in both Southport and Dublin, to the use of digital technologies to accentuate and organise around xenophobic rumours, to the very rhetoric and violence employed. Some may question why such acts of political unrest are taking place now, particularly as acts of violence against women and children are (without trying to sound defeatist or uncaring) sadly routine.

What both attacks undoubtedly point to is not just the reality of political unrest being sown by far-right influencers and ideologues across Britain and Ireland, but the growing use of, and even reliance on, digital platforms not only to sow such discord but to organise it.

In their research on the Irish far-right, the Institute for Strategic Dialogue’s Aoife Gallagher and Ciarán O’Connor have highlighted the substantial use of Telegram as a space to both communicate and organise: “Compared to 2019, when a handful of Irish far-right channels posted just over 800 messages, 60,377 messages were posted by 34 channels in 2020” (Gallagher and O’Connor 2021, 4). A heavy reliance on alt-tech spaces such as Telegram is unsurprising following the mass deplatforming of myriad far-right influencers in the cull after January 6th: mainstream platforms reacted belatedly to the threat of insurrection and electoral disinformation being shared across the USA and globally by banning users who alleged electoral fraud, posted QAnon conspiracies or could otherwise be linked to the Capitol takeover.

Riots and Reactions

Online violence, disinformation and harmful narratives have played a key role in the riots that shook Britain and Ireland in the last year. Both riots sprang not only from consistent anti-immigration narratives and xenophobic/racist discourse but from rumours built directly on these frameworks. Following two separate violent attacks on children in Dublin (November 2023) and Southport (August 2024), the riots were fuelled by rumours, unverified at the time, of a foreign perpetrator in both cases.

The violence which occurred in both cases was encouraged by the sharing of xenophobic material across social media. In the Irish case, one Telegram user on a channel named “Kill All Immigrants” sent a voice note encouraging rioters to “bally up, tool up. And any fucking gypo, foreigner anyone, just kill them. Just fucking kill them” (Copestake, 2024). In the case of the UK riots, one Facebook user was sentenced to 22 months in prison for sharing incendiary content, including identifying properties which housed migrants (Tootill, 2024), while a user on X who called for both mass deportation and arson attacks on properties housing asylum seekers was sentenced to more than three years (Murray & Syal, 2024).

Images from the Dublin riots, O’Connell Street, 23 November 2023

Alt-Tech vs Mainstream

As these examples show, the riots relied heavily on digital networks to spread their messages. Evidently, their organisers also moved with versatility between Alt-Tech and mainstream platforms. The operation and governance of most Alt-Tech platforms make them ideal for extreme and far-right content: many operate with little to no enforced moderation policy, promote absolute free speech to the point of overlooking hate speech completely, and offer anonymity and privacy as standard (Buntain et al 2023; Rauchfleisch and Kaiser 2021; Rogers 2020). Alt-Tech sites have become so reliant on such content that they have been ‘considered synonymous to alt-right, far-right hate speech, and extremist spaces’ (Dehghan and Nagappa 2020, 2). It is on these very websites that recent acts of violence and riots have not only been advocated but verifiably organised.

While Alt-Tech spaces seemingly do much of the heavy lifting, mainstream platforms continue to contribute to the dissemination of the racist, xenophobic and anti-LGBTQ+ material synonymous with far-right ideology. Indeed, despite the massive deplatforming campaigns that came on the heels of January 6th, mainstream platforms are continually cited as essential spaces for the fuelling of racist political violence, with Pat de Brún, Deputy Director of Amnesty Tech, stating: “In light of the ongoing racist attacks and Islamophobic violence spreading across the UK, we must not overlook how Big Tech platforms offer the far-right a powerful venue to incite hate and organize” (Amnesty International, 2024). Perhaps as disconcerting has been the lack of response from these tech giants following the riots, despite X/Twitter being highlighted as a primary promoter of the conspiracies and vitriol that helped spark the violence (Fraser, 2024). Indeed, X/Twitter has been cited as the key space for the sharing of Irish anti-immigration sentiment, with Telegram coming in second (Gallagher et al., 2023).

The likes of Facebook, YouTube and Twitter/X have by no means removed all traces of extremist and far-right content. Indeed, so long as they profit from the traffic such content attracts, there is little incentive for them to do so. Furthermore, when deplatforming policies are enforced and content is removed, it does not simply disappear. Rather, researchers have argued that deplatforming and content removal only serve to move users to less moderated, more extremist spaces (Ali et al 2021; Gallagher and O’Connor 2021). Alt-Tech spaces thus become a fall-back for the deplatformed, though this by no means removes mainstream platforms from the spread of far-right content.

While X/Twitter has become ideologically and operationally more extreme following its buy-out by Elon Musk, other mainstream platforms continue to host extreme and far-right content that surreptitiously undermines hate speech policies and user agreements. Content moderation is already fraught with difficulty: there is the sheer volume of content, not to mention the risk involved in consuming such content, work that is often outsourced to workers in the Global South (Stackpole, 2022). Purveyors of extreme content further employ key tactics to avoid removal, ranging from a careful use of language that avoids directly targeting or naming vulnerable groups to the use of video content, which is more difficult to moderate and track due to time constraints and other variables (Mukhopadhyay, 2020; Freihse & Sieker, 2023).
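To illustrate why such language tactics are effective, consider a deliberately minimal sketch of keyword-based filtering, a common first pass in moderation pipelines. The blocklist and example posts below are hypothetical assumptions for illustration, not any platform’s actual system:

```python
# Minimal sketch of a naive keyword-based moderation filter.
# The blocked phrases and example posts are hypothetical.
BLOCKED_PHRASES = {"burn the hotel", "kill all immigrants"}

def naive_filter(post: str) -> bool:
    """Return True if the post should be flagged for review."""
    text = post.lower()
    return any(phrase in text for phrase in BLOCKED_PHRASES)

# Direct phrasing is caught:
print(naive_filter("Burn the hotel down tonight"))                  # True

# Coded spellings and indirect phrasing slip through unchanged:
print(naive_filter("B u r n the h0tel down tonight"))               # False
print(naive_filter("You all know what needs doing at that hotel"))  # False
```

It is exactly this gap between literal matching and intended meaning that coded language exploits, and video compounds the problem because there is often no text to match at all.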

Protesters and riot police in Liverpool, 3 August 2024

Where to from here?

The silence of Big Tech on its role in the riots has gone neither unnoticed nor unchallenged. On August 9, more than 240 civil society groups published an open letter to British and Irish political leaders, including PM Keir Starmer and Taoiseach Simon Harris, condemning the violence and the role of these platforms and explicitly calling on those leaders to hold the platforms to account.

In spite of best efforts, content moderation and deplatforming strategies continue to fail to truly curb the spread of such vitriol. How, then, can we begin to curtail the spread of hate-fuelled content and conspiracies that incite acts of violence? And how can we do so without placing further pressure and risk on already vulnerable content moderators?

Prior to the riots, Taoiseach Simon Harris had commented on the need for Big Tech accountability in relation to his personal experience of targeted digital abuse. Harris cited the Online Safety Code, which comes into effect at the end of 2024 and has the “ability to hold directors personally responsible because essentially, these social media companies aren’t actually faceless” (Finn, 2024). The legislation, alongside Ireland’s new media regulator, Coimisiún na Meán, is intended to protect citizens, particularly children, from online harms by prohibiting cyberbullying and incitement to hatred and violence, and specifically cites the harmful impact of recommender systems as part of this problem (Coimisiún na Meán, 2024).

Niamh McDonald, of Ireland’s Hope and Courage Collective, has relatedly argued for the complete shutdown of recommender systems that are known to promote harmful and incendiary content (Baker et al., 2024). McDonald has stated: “People should decide what they want to see, not Big Tech’s algorithms” (People vs. Big Tech, 2024). Removing recommender algorithms would reduce the likelihood of extreme and radicalising content reaching vulnerable audiences.
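The dynamic McDonald describes can be made concrete with a deliberately simplified sketch of engagement-based ranking. The scoring weights and example posts are illustrative assumptions, not any platform’s actual formula:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> int:
    # Hypothetical weighting: shares and comments, where outrage tends
    # to concentrate, count for more than passive likes.
    return post.likes + 3 * post.shares + 5 * post.comments

feed = [
    Post("Local charity bake sale this weekend", likes=120, shares=4, comments=9),
    Post("THEY are housing migrants in YOUR town. Share before it's deleted!",
         likes=80, shares=150, comments=400),
]

# Ranked by engagement, the inflammatory post tops the feed; a chronological
# or user-chosen feed would give it no such amplification.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(engagement_score(post), post.text)
```

Because the optimisation target is engagement rather than accuracy or safety, content engineered to provoke reliably outperforms everything else; this is the megaphone campaigners are asking platforms to switch off.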

Baker, Ging and Andreasen’s work on recommender systems has further argued for harsher sanctions against users who promote harmful and toxic content, alongside wider transparency from tech companies. Integrated into many of these recommendations is the need for greater media literacy and for public interventions into disinformation campaigns and digital harms.

While moves towards Big Tech accountability and media literacy are both welcome and necessary, there is a further need for regulation of Alt-Tech spaces. Indeed, public discussions of social network accountability and transparency largely fail to address the role of Alt-Tech in the spread of political polarisation and discord. Public interventions and governmental campaigns around disinformation and extreme content must go further in addressing the role and operation of Alt-Tech platforms.

References

Ali, S., Saeed, M. H., Aldreabi, E., Blackburn, J., De Cristofaro, E., Zannettou, S., & Stringhini, G. (2021, June). Understanding the effect of deplatforming on social networks. In Proceedings of the 13th ACM Web Science Conference 2021 (pp. 187-195).

Amnesty International. (2024, August 6). UK: Big Tech platforms play an active role in fuelling racist violence. Amnesty International. https://www.amnesty.org/en/latest/news/2024/08/uk-big-tech-platforms-play-an-active-role-in-fuelling-racist-violence/

Baker, C., Ging, D., & Andreasen, M. B. (2024). Recommending Toxicity: The role of algorithmic recommender functions on YouTube Shorts and TikTok in promoting male supremacist influencers. DCU Anti-Bullying Centre, Dublin City University. https://antibullyingcentre.ie/wp-content/uploads/2024/04/DCU-Toxicity-Full-Report.pdf

Buntain, C., Innes, M., Mitts, T., & Shapiro, J. (2023). Cross-Platform Reactions to the Post-January 6 Deplatforming. Journal of Quantitative Description: Digital Media, 3. https://doi.org/10.51685/jqd.2023.004

Coimisiún na Meán. (2024, May 27). Coimisiún na Meán to notify Online Safety Code to European Commission. Coimisiún Na Meán. https://www.cnam.ie/coimisiun-na-mean-to-notify-online-safety-code-to-european-commission/

Copestake, I. (2024, January 23). Inside The Telegram Groups Fuelling Anti-Immigrant Sentiment. District Magazine. https://districtmagazine.ie/features/inside-the-telegram-groups-fuelling-anti-immigrant-sentiment/

Finn, C. (2024, August 7). Harris promises to hit social media firms “where it hurts” through fines and holding owners liable. The Journal. https://www.thejournal.ie/taoiseach-fines-for-social-media-companies-6457192-Aug2024/

Fraser, G. (2024, August 10). A week of unrest in the UK – and a week of silence from big tech. BBC News. https://www.bbc.com/news/articles/c3ej9e4lqp5o

Freihse, C., & Sieker, F. (2023, July 11). Content moderation is still primarily conducted by humans – here’s a game that helps you empathize. Upgrade Democracy. https://upgradedemocracy.de/en/content-moderation-is-still-primarily-conducted-by-humans-heres-a-game-that-helps-you-empathize/

Gallagher, A., O’Connor, C., & Visser, F. (2023). Uisce Faoi Thalamh: An Investigation Into the Online Mis- and Disinformation Ecosystem in Ireland. Institute for Strategic Dialogue. https://www.isdglobal.org/wp-content/uploads/2023/11/Uisce-Faoi-Thalamh-Summary-Report.pdf

Gallagher, A., & O’Connor, C. (2021). Layers of Lies: A First Look at Irish Far-Right Activity on Telegram. Institute for Strategic Dialogue.

Mukhopadhyay, B. R. (2020, August 26). (Open) Secret Lives of Content Moderators. Cassandra Voices. https://cassandravoices.com/society/warning-the-open-secret-lives-of-content-moderators/

Murray, J., & Syal, R. (2024, August 9). Two men jailed for social media posts that stirred up far-right violence. The Guardian. https://www.theguardian.com/politics/article/2024/aug/09/two-men-jailed-for-social-media-posts-that-stirred-up-far-right-violence

People vs. Big Tech. (2024, August 12). Turn off the online Hate Megaphone. People vs. Big Tech. https://peoplevsbig.tech/press/turn-off-the-online-hate-megaphone/

Rauchfleisch, A., & Kaiser, J. (2021). Deplatforming the Far-right: An Analysis of YouTube and BitChute. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3867818

Rogers, R. (2020). Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media. European Journal of Communication, 35(3), 213–229. https://doi.org/10.1177/0267323120922066

Tootill, S. (2024, August 15). UK riots: “Bigot” jailed for Facebook racial hatred posts. BBC News. https://www.bbc.com/news/articles/c703e03w243o

Stackpole, T. (2022, November 9). Content Moderation Is Terrible by Design. Harvard Business Review. https://hbr.org/2022/11/content-moderation-is-terrible-by-design

——————————————————————————————————————————————————————————

Dr Sarah Anne Dunne is a post-doctoral research assistant and administrator at the UCD Centre for Digital Policy.

Her research interests include digital cultures and policies, feminism, gender and sexuality studies, and critical theories. She previously worked with Prof Eugenia Siapera on the IRC Platforming Harm Project, examining the circulation of harmful health narratives during the Covid-19 pandemic and subsequently analysing the spread of far-right material and anti-democratic (anti-LGBTQ and anti-immigrant) messages on Alt-Tech platforms. Her PhD thesis focused on manifestations of rape culture, victim-blaming mentalities, and feminist interventions that emerged on the microblogging platform Twitter during 2016-2017. She is currently involved in research on the growth of far-right political sentiment and activism emerging online in Ireland.

 

 
