Jennifer Waters
Exclusionary Microtargeting: A Political Practice that Slipped Through the Cracks of EU Regulation on Transparency in Political Advertising
Ireland’s upcoming General Election, its revisiting of Part 4 of its Electoral Reform Act, and the slow rollout of the EU Digital Services Act (DSA) invite discussion of a kind of microtargeting that has slipped through the regulatory cracks: exclusionary microtargeting. This article outlines the function of exclusionary microtargeting, identifies the gap in current EU policy, and examines whether the practice can be considered an impermissible political practice with reference to the Doctrine of Double Effect and the European Convention on Human Rights.
Microtargeting, as it is understood more widely and in policy circles, is the ability to leverage the personal data available on social media platforms to identify the shared interests of individuals and target advertisements to receptive audiences at a more narrow or granular level (Bennett, 2016). Microtargeting leverages geographic location, age, gender, interests, and existing engagement, including ‘page likes’, to target audiences, and also offers the ability to build a ‘Custom List’ in which individuals can be named as part of an audience. Discussion in the literature surrounding the microtargeting of political advertisements highlights the siloing of political information and the negative effects that microtargeting can have on numerous democratic principles such as voter autonomy, polarisation, and privacy (Bennett, 2016; Barbu, 2014; Schawel, Frener, & Trepte, 2021; Bennett, 2015; Haggerty & Samatas, 2010; Casagran & Vermeulen, 2021). These discussions, however, only consider microtargeting as a means of directing ads towards a relevant audience (Dobber, Fathaigh, & Borgesius, 2019), not the ability to use microtargeting to block or exclude audiences from receiving an ad (Meta Business Help Centre, 2024).
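To make these mechanics concrete, the sketch below shows a hypothetical targeting specification of the kind described above. It is purely illustrative: the field names are invented for this article and do not reflect Meta’s actual advertising interface, but they capture the point that the same tooling that includes an audience can also deliberately exclude one.

```python
# Hypothetical targeting specification (illustrative only; field names are
# invented and do not describe Meta's actual ad-targeting schema).
ad_targeting_spec = {
    "include": {
        "locations": ["Dublin", "Cork"],
        "age_range": (25, 45),
        "interests": ["public transport", "renewable energy"],
        "custom_list": ["alice@example.com"],  # named individuals added by the advertiser
    },
    "exclude": {
        # Exclusionary microtargeting: anyone matching these criteria is
        # deliberately prevented from ever being shown the advertisement.
        "interests": ["farming"],
        "custom_list": ["carol@example.com"],
    },
}
```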
Meta will be the platform of focus for this article, as it has taken the lead through its voluntary adherence to the 2019 Code of Practice on Disinformation, under which its Ad Library was piloted with the promise of ‘transparent, fair, and trustworthy online campaign advertising ahead of the European elections in spring 2019’ (Kirk & Teeling, 2022). Formerly the Facebook Ad Library, the Meta Ad Library was created as an archive of political advertisements, containing information on the targeting techniques used, the overall spend, and the impressions each political advertisement on the platform garnered.
Two main points must be considered regarding this form of microtargeting: the applicability of current regulation, and the distinction between non-targeted and excluded audiences. This article will conclude that:
- There is a meaningful distinction between non-targeted and excluded audiences, one that calls into question not just democratic principles but human rights; and
- The transparency and notification measures required by the DSA and the Electoral Reform Act, as they currently stand, are not applicable to exclusionary microtargeting.
Regarding the first point, audiences that are not targeted under the inclusionary form of microtargeting are just as isolated from exposure to the advertisement as excluded audiences. That is, on Meta, where targeted audiences are ‘locked in’ to advertisement exposure, non-targeted audiences are ‘locked out’ to the same extent as excluded audiences (Meta, 2023). Is there, then, a meaningful difference between non-targeted audiences in online political advertising and excluded ones?
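A minimal sketch, in the same spirit as the hypothetical specification above, makes this equivalence concrete: under a simple include/exclude model, a non-targeted user and a deliberately excluded user end up in exactly the same position, in that neither is ever served the ad. The matching logic here is invented for illustration and is not a description of any platform’s delivery system.

```python
def matches(user: dict, criteria: dict) -> bool:
    """Crude stand-in for audience matching: a user matches if they share any
    listed interest with the criteria (illustrative only)."""
    return bool(set(user["interests"]) & set(criteria["interests"]))

def is_served_ad(user: dict, spec: dict) -> bool:
    """A user is served the ad only if they match the inclusion criteria and
    do not match the exclusion criteria."""
    return matches(user, spec["include"]) and not matches(user, spec["exclude"])

spec = {
    "include": {"interests": ["public transport"]},
    "exclude": {"interests": ["farming"]},
}

targeted = {"interests": ["public transport"]}              # 'locked in'
non_targeted = {"interests": ["gardening"]}                 # simply outside the target audience
excluded = {"interests": ["public transport", "farming"]}   # deliberately 'locked out'

print(is_served_ad(targeted, spec))      # True
print(is_served_ad(non_targeted, spec))  # False
print(is_served_ad(excluded, spec))      # False: same effect as non-targeted, different intent
```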
The doctrine of double effect (DDE) is a useful tool for differentiating between the motivations underlying inclusionary and exclusionary microtargeting, despite the harmful effects of each being similar. Intuitively, it is more permissible in principle for politicians to include some citizens and thereby indirectly exclude others than to exclude citizens outright; the DDE can expand on this intuition. The DDE holds that it is permissible to cause harm if the harm is a side effect of bringing about a good effect, even if it would be impermissible to cause the same harm deliberately. While exercises referencing the DDE usually concern far more extreme cases (tackling questions of terrorism and abortion), the relevant theme is that the permissibility of an act that causes harm depends on whether the harm was intentional or merely foreseen (Quinn, 1989; McIntyre, 2019).
Despite being equal in effect (‘effect’ here meaning the extent to which individuals and groups are isolated under both forms of microtargeting), the intentions behind producing non-targeted audiences and excluded audiences are different. When politicians use inclusionary microtargeting, hand out brochures, hang signs, and so on, they intend to share political information with their target audience, despite the consequence that citizens outside the target audience are excluded. If Jack, John, and Jill live near an area where a candidate’s flyers are being distributed, their not receiving flyers is not the intended effect of the politicians. Additionally, if the wind blows flyers from the adjacent neighbourhood into their mailboxes, their receiving the information does not directly conflict with the candidate’s intentions. Despite Jack, John, and Jill being excluded, the overall aim of sharing political information makes their exclusion more permissible under the DDE.
When using exclusionary microtargeting, however, politicians intend to exclude individuals from being exposed to particular content, preventing the spread of political information. Jack, John, and Jill not receiving a flyer would be deliberate under exclusionary microtargeting, and if the wind blew an adjacent neighbour’s flyer into their mailboxes, the effect would directly conflict with the politicians’ intent. In this scenario, the politicians intended to exclude Jack, John, and Jill, and the act would therefore be considered less permissible, or even impermissible, even though the same exclusion is permissible under inclusionary microtargeting. Under the DDE, the explicit nature of exclusionary microtargeting conveys an intent distinct from that of inclusionary microtargeting: where exclusion under inclusionary microtargeting is merely foreseen, exclusion under exclusionary microtargeting is intentional.
Granted, the DDE is usually applied to far more severe, morally poignant scenarios. The ‘harm’ typically considered alongside the DDE is murder, terrorism, euthanasia, and the like, such that intentionally causing it automatically renders an act morally impermissible (Foot, 1978). The case of exclusionary microtargeting, however, supports only a weaker conclusion: the impermissibility of exclusionary political microtargeting does not follow straightforwardly from the fact that its use is intentional. Rather than appealing to moral principles, the permissibility or impermissibility of exclusionary microtargeting as a political practice can be assessed in relation to Articles 10 and 14 of the European Convention on Human Rights: the right to freedom of expression and the prohibition of discrimination, respectively. Human rights carry both moral and political ideals that make infringements upon them more likely to be considered impermissible in a substantive sense.
Article 10 holds that the right to freedom of expression ‘shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers’ (European Court, 2021, p. 12, emphasis added), and Article 14 holds that the rights and freedoms listed in the Convention ‘shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion’ (European Court, 2021, p. 13). Considering Articles 10 and 14, and the intentionality of exclusion associated with political exclusionary microtargeting, the practice can be considered an infringement on these rights. Exclusionary microtargeting directly interferes with citizens’ freedom to receive and impart information (their exclusion from an advertisement also excludes them from its comment section) and does so on the basis of several of the grounds listed in Article 14.
Regarding the second point, there is a glaring gap in EU and Irish policy relating to online advertising in general and to political advertising transparency in particular. For online advertising generally, Article 24 of the DSA requires that all platforms providing online advertisements ‘shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time:
- That the information displayed is an advertisement;
- The natural or legal person on whose behalf the advertisement is displayed;
- Meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.’ (European Commission, 2020, emphasis added)
Recital 58 of the EU Regulation on the Transparency and Targeting of Political Advertising states that ‘the requirement that the information about the transparency notice is to be, inter alia, clearly visible should entail that it features prominently in or with the advertisement’ (European Commission, 2024).
Both the DSA and the Regulation on the Transparency of Political Advertising require that the transparency notification for online advertisements appear on the ads themselves, implying that only microtargeting in its inclusionary sense was considered in the policy design. Both policies are inapplicable to exclusionary microtargeting because neither addresses the fact that, under exclusionary microtargeting, individuals can be singled out in the targeting process and yet never be served the ad. Excluded audiences are therefore never notified, and they cannot easily access targeting information about their exclusion if that information is only required to appear on the ad itself. In light of the functions and impact of exclusionary microtargeting, this article must disagree with the Regulation’s position that ‘the transparency of political advertising should enable individuals to understand that they are confronted with a political advertisement’; instead, individuals should be able to understand that they were included in a target audience for a political ad.
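The gap can be illustrated by extending the earlier sketch. If the transparency notice is only ever rendered together with the ad, then anyone who is not served the ad, whether merely non-targeted or deliberately excluded, never receives the notice or the targeting information behind it. This is a hypothetical illustration of the logic of the rules, not an account of how any platform actually implements the DSA.

```python
def serve_with_transparency_notice(user: dict, spec: dict, ad: dict):
    """Attach DSA-style transparency information to the ad itself, as the current
    rules envisage (sketch only; reuses is_served_ad and spec from the earlier sketch)."""
    if not is_served_ad(user, spec):
        # No ad is delivered, so no transparency notice is delivered either:
        # excluded users learn nothing about the targeting decision applied to them.
        return None
    return {
        "ad": ad,
        "notice": {
            "is_advertisement": True,
            "on_behalf_of": ad["sponsor"],
            "main_targeting_parameters": spec["include"],
        },
    }

ad = {"content": "Vote for Candidate X", "sponsor": "Party X"}
print(serve_with_transparency_notice(excluded, spec, ad))  # None: no ad, and no notice
```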
References
Barbu, O. (2014). Advertising, Microtargeting and Social Media. In Procedia: Social and Behavioural Sciences, 163, 44-49.
Bennett, C.J. (December 2016). Voter databases, micro-targeting and data protection law: can political parties campaign in Europe as they do in North America? In International Data Privacy Law, 6, 261-75.
Bennett, C.J. (2015). Voter Surveillance, MicroTargeting and Democratic Politics: Knowing How People Vote Before They Do. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2605183
Casagran, C.B., & Vermeulen, M. (2021). Reflection on the Murky Legal Practices of Political Micro-Targeting from a GDPR Perspective. In International Data Privacy Law, 11, 348-359.
Cotter, K., Medeiros, M., & Thorson, K. (2021). “Reach the Right People”: The Politics of “Interests” in Facebook’s Classification System for Ad Targeting. In Big Data & Society, 8.
Council of Europe, European Court. (August 1st, 2021). European Convention on Human Rights. Retrieved from https://www.echr.coe.int/documents/convention_eng.pdf
Dobber, T., Fathaigh, R.O., & Borgesius, F.J. (2019). The Regulation of Online Political Microtargeting In Europe. In Internet Policy Review, 8, 1-20.
Electoral Reform Act, Government of Ireland, (2022).
European Data Protection Supervisor. (20th January, 2022). Opinion 2/2022 on the Proposal for Regulation on the Transparency and Targeting of Political Advertising. Retrieved from https://edps.europa.eu/system/files/2022-01/edps_opinion_political_ads_en.pdf
European Union, European Commission. (November 25th, 2021). Proposal for a Regulation of the European Parliament and of the Council on the Transparency and Targeting of Political Advertising. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52021PC0731
European Union, European Commission. (November 24th, 2022). The Digital Services Act Package. Retrieved from https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
Facebook. (January 15th, 2019). Bringing More Transparency to Political Ads in 2019. Retrieved from https://www.facebook.com/business/news/bringin-more-transparency-to-political-ads-in-2019
Facebook. (2023, January). Towards Fairness in Personalised Ads. Retrieved from https://ai.facebook.com/blog/advertising-fairness-variance-reduction-system-vrs/
Haggerty, K.D., & Samatas, M. (2010). Surveillance and Democracy. New York: Routledge.
Kirk, N., & Teeling, L. (2022). A Review of Political Advertising Online During the 2019 European Elections and Establishing Future Regulatory Requirements in Ireland. In Irish Political Studies, 37, 85-102.
McIntyre, A. (2019). Doctrine of Double Effect. The Stanford Encyclopaedia of Philosophy. Retrieved February 28th, 2023 from https://plato.stanford.edu/archives/spr2019/entries/double-effect/
Meta. (2022). Set Up Exclusions for Your Meta Advantage+ Catalogue Ads Audience. Business Help Centre. Retrieved March 10th, 2023, from https://www.facebook.com/business/help/165516217407801?id=1913105122334058
Meta. (2023). Facebook Business Suite. Retrieved April 1st, 2023 from https://www.facebook.com/business/tools/meta-business-suite
Meta. (2023). Towards Fairness in Personal Advertising. Retrieved April 20th, 2023 from https://scontent-dub4-1.xx.fbcdn.net/v/t39.8562-6/323711583_5456919567751502_3452466074306090865_n.pdf?_nc_cat=102&ccb=1-7&_nc_sid=ae5e01&_nc_ohc=4fHDOCcg3X8AX-yQjh9&_nc_ht=scontent-dub4-1.xx&oh=00_AfBK8KK1U-2K4lwz1dZYPy0uR6je57KswQYNfPW8N4sApg&oe=64326E3A
Prummer, A. (2020). Micro-Targeting and Polarisation. In Journal of Public Economics, 188.
Schawel, J., Frener, R., & Trepte, S. (2021). Political Microtargeting and Online Privacy: A Theoretical Approach to Understanding Users’ Privacy Behaviours. In Media and Communication, 9, 158-169.
Spiekermann, K., & Goodin, R.E. (2018). An Epistemic Theory of Democracy. Oxford: Oxford University Press.
Sunstein, C.R. (2007). Ideological Amplification. In Constellations: An International Journal of Critical and Democratic Theory, 14, 273-279.
Bio:
Jennifer Waters is a PhD student at our Centre, researching NLP- and LLM-based legal analytics services. Jen completed our MSc in Digital Policy in 2023 and finished the Graduate Programme at Ireland’s Centre for Effective Services before joining us for her PhD. Her main research interests include AI implementation strategies and the impact of AI on the democratic and legal tenets of the rule of law.
She is also currently involved in research regarding the accuracy and ethical implications of the transparency mechanisms and funding options for political advertisements in the Meta Ad Library.