Guest blog: General applicability of the Digital Services Act

Jet Klokgieters

17 February 2024. On 17 February 2024, the Digital Services Act (DSA) becomes applicable to all online intermediary service providers (OISPs), such as online platforms, operating in the EU. A risk-oriented regulation, it seeks to address the new risks that can arise in an EU where the majority of people now use online information and transaction services on a daily basis. The following sections present an overview of the most important aspects of the DSA and examine the Irish context in tandem, before concluding with a critical reflection and suggestions for further exploration.

The DSA targets Very Large Service providers

The DSA outlines additional obligations for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) (Department of Enterprise, Trade and Employment, n.d.). VLOPs and VLOSEs are online platforms and search engines with more than 45 million average monthly active users in the EU (European Commission, 2023b). There are currently 22 designated services. VLOPs: Alibaba AliExpress, Amazon Store, Apple AppStore, Facebook, Google Maps, Google Play, Google Shopping, Instagram, LinkedIn, Pinterest, Pornhub, Snapchat, Stripchat, TikTok, X (Twitter), XVideos, Wikipedia, YouTube and Zalando. VLOSEs: Bing and Google Search (European Commission, 2023a).

VLOPs and VLOSEs are obligated to assess four systemic risks in relation to their services (Regulation 2022/2065): (1) the risk of the dissemination of illegal content; (2) the negative effect these services can have on fundamental rights; (3) the negative effect they can have on public security and democratic processes; and (4) the negative effect that the design or use of these services can have on public health, in particular on individuals' and children's physical or mental health, including gender-based violence.

VLOPs and VLOSEs are also required under the DSA to adopt appropriate mitigating measures (Regulation 2022/2065). They will be held accountable through yearly audits conducted by independent organisations, in which services must demonstrate DSA compliance, particularly in relation to the measures mitigating the four systemic risks (Regulation 2022/2065; European Commission, 2023b). A second accountability measure in the DSA is that VLOPs and VLOSEs are required to provide access to data upon request by the European Commission, allowing the Commission to check whether they are DSA-compliant (Beck & Worm, 2023).

Illegal content

One risk identified in the DSA against which all OISPs are required to take action is the existence of illegal content. The definition of illegal content is not set out in the DSA itself, because it is already articulated in other EU laws and in the laws of the Member States (Regulation 2022/2065); instead, the DSA outlines rules for the detection, flagging and removal of illegal content (European Commission, 2023b). To that end, the DSA provides that the detection and flagging of illegal content should mainly be done by 'trusted flaggers': individuals or organisations that have demonstrated expertise in the field (European Commission, 2023b). Trusted flaggers are to be appointed by each Member State and communicated to the European Commission by 17 February 2024 (European Commission, 2023b).

Disinformation and harmful content

Other content risks identified in the DSA are disinformation and harmful content. Tensions with freedom of expression have precluded a definition of these terms in the DSA (European Commission, 2023b). Instead, individual OISPs are still allowed to set out their own definitions of harmful content in their terms and conditions (TheLegal500, 2023). The European Commission has also established the '2022 Code of Practice on Disinformation' to offer standards for OISPs' self-regulation of disinformation, and OISPs can decide which commitments to sign up to (European Commission, 2022). However, when disinformation or harmful content falls under the four systemic risks relating to VLOPs and VLOSEs, those platforms are still required to actively adopt measures to mitigate the risk.

Protecting children

As EU Commissioner Thierry Breton has stated, "child protection will be an enforcement priority" in the DSA (The Journal, 2023), and the DSA is expected to become a regulatory model for other jurisdictions, particularly the US (Gill, 2022; Gain, 2023). Article 28 provides that online platforms that can be used by children must ensure a high level of security, safety and privacy for them (European Commission, n.d.a). Furthermore, as the potential negative effect of online intermediary services on children's physical and mental health is identified as one of the systemic risks for VLOPs and VLOSEs, these services are required to adopt appropriate mitigating measures, which could include parental control features and age verification tools (European Commission, n.d.a). One of the most profound child-protection measures in the DSA, applying to all OISPs, is the ban on advertising targeted at children (Regulation 2022/2065). In response to this requirement, Snapchat, Google, YouTube, Instagram and Facebook no longer allow advertisers to target their advertising at children (European Commission, n.d.a).


Transparency

Under Article 14, the DSA's transparency requirements mandate that OISPs communicate their terms and conditions in a way that can be understood by children. Moreover, users, including children, must be able to understand the measures and tools OISPs use for content moderation (Beck & Worm, 2023). OISPs must further provide Statements of Reasons (SORs), which are stored in a European transparency database. SORs detail the reasons behind all content moderation decisions, their territorial scope and duration, and the ways in which users can challenge them (Madden, 2023). As of now, over four billion SORs are stored in the database (European Commission, n.d.b). Platforms that use algorithmic recommender systems are also required to inform their users about how the systems' parameters influence the way information is displayed on their services (Beck & Worm, 2023). On top of that, VLOPs and VLOSEs have to inform their users about the options available to modify or influence those parameters (Regulation 2022/2065). To that end, TikTok has announced that it will allow users to turn off personalised content (McGowran, 2023).

Regulating the DSA

Under the country-of-origin principle, the EU Member State in which a service has its headquarters has the power to supervise and enforce the DSA (Department of Enterprise, Trade and Employment, n.d.), with the exception of VLOPs and VLOSEs: for these, the European Commission shares responsibility with the national authorities for all obligations under the DSA (European Commission, 2023c). When the European Commission suspects non-compliance with the DSA by a VLOP or VLOSE, it can decide to open an investigation. Investigatory tools include requesting information, conducting interviews, inspecting premises and requesting access to data (European Commission, 2023c). For example, Thierry Breton sent a letter to X owner Elon Musk in October 2023, stating that the EU had indications that X was being used to spread disinformation following the Hamas attacks against Israel, which falls under the scope of one of the four systemic risks (Breton, 2023). Similarly, Meta and Snap received a request from the EU to provide information by 1 December 2023 on how they protect children from illegal and harmful content (RTE, 2023a).

Member States are required to appoint a 'Digital Services Coordinator' (DSC) for the enforcement of the DSA in their country; the deadline for appointing DSCs is also 17 February 2024 (European Commission, 2023c). The DSC is responsible for all matters relating to the supervision and enforcement of the DSA, and for coordinating and cooperating with the national authorities in their own and in other Member States (Department of Enterprise, Trade and Employment, n.d.).

The Irish context

In Ireland, Coimisiún na Meán (the Media Commission) is the appointed DSC (Department of Enterprise, Trade and Employment, n.d.). Specifically, Digital Services Commissioner John Evans will act as Digital Services Coordinator, working together with Online Safety Commissioner Niamh Hodnett (Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media, 2023). Given that 11 VLOPs and VLOSEs have their European headquarters in Ireland, the country should be seen as an extraordinary source of regulatory power, with serious ramifications for all EU Member States (Bowers, 2023). These services are: Google Maps, Google Play, Google Shopping, Google Search, Facebook, Instagram, LinkedIn, X, Apple AppStore, Pinterest and Bing (European Commission, 2023c).

The Media Commission was established under the Irish 'Online Safety and Media Regulation Act' (OSMR) (Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media, 2020). Next to being the Irish DSC, the Media Commission has a broader media-regulation remit, of which online safety is one theme (Coimisiún na Meán, n.d.). The laws that apply to the regulation of online safety in Ireland are the DSA, the Irish OSMR and the EU Terrorist Content Online Regulation. It is worth noting that, unlike the DSA, the OSMR does provide a list of what is considered harmful content; one content category on this list, for example, is content that is likely to encourage or promote eating disorders (Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media, 2020). The Irish law is thus more precise than the EU's DSA, and according to the Irish government the OSMR therefore closes a legal gap in the definition of harmful content (Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media, 2020).

Criticism of the DSA

The DSA has received criticism in relation to enforcement capacity. Some critics believe that the European Commission's task force of around 80 officials might be insufficient (Gill, 2022). Long-running cross-border cases under the EU General Data Protection Regulation (GDPR), another ambitious regulatory exercise, remain stalled (Gill, 2022). Indeed, experts predict that platforms will fiercely defend their practices in court, especially where the DSA encroaches on their core business models (RTE, 2023b). For example, Amazon and Zalando are currently contesting their designation as VLOPs, arguing that their retailer status means they should not be in the same category as Facebook, Pinterest and Wikipedia (Vigliarolo, 2023; RTE, 2023b).

Moreover, the question remains whether regulatory oversight of content will have meaningful influence, as the speed and opacity of social media are bending the curve of public conversation in troubling ways (Howlin, 2023). Indeed, the harm stemming from social media does not come from content alone, but also from the design of the platforms themselves (McGrath, 2023).


Concluding reflections

The DSA is a good starting point for minimising the harms that are published on and spread via OISPs. The transparency obligations ensure that users are better informed about why certain content is removed, how they can challenge those decisions, and how they can modify the way recommender systems display content on VLOPs and VLOSEs. However, public information on the Regulation in both the Dutch and Irish media seems limited. Academic research on GDPR awareness has shown that inequalities exist in public awareness of that Regulation (Rughiniș et al., 2021). As the DSA gives users greater control, it is important that all individuals who use OISPs are equally aware of the DSA and of the options available to them.

Furthermore, although the DSA is mostly focused on content moderation, one of the systemic risks relating to VLOPs and VLOSEs is the negative effect that the design of these services can have on public health. As VLOPs and VLOSEs are required to take active measures to mitigate this risk, I advocate that academic research keep a close eye on the measures these services adopt. Research on these measures could inform the design of all OISPs, and perhaps even additional legislation in the future.


References

Beck, B., & Worm, E. (2023, March 27). EU Digital Services Act's Effects on Algorithmic Transparency and Accountability. Mayer Brown.

Bowers, S. (2023, June 20). New media watchdog has ‘real teeth’ to ensure online services comply on harmful content, commissioners say. The Irish Times. 

Breton, T. [@ThierryBreton]. (2023, October 10). Following the terrorist attacks by Hamas against Israel, we have indications of X/Twitter being used to disseminate illegal content & disinformation in the EU. [Tweet]. Twitter.

Coimisiún na Meán. (n.d.). Online Safety.

Department of Enterprise, Trade and Employment. (n.d.). Digital Services Act (DSA). 

Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media. (2020). Online Safety and Media Regulation Act 2022. 

Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media. (2023). Minister Martin announces forthcoming appointment of Executive Chairperson and Commissioners in Coimisiún na Meán.

European Commission. (2022, July 4). The 2022 Code of Practice on Disinformation. 

European Commission. (2023a). Supervision of the designated very large online platforms and search engines under DSA. 

European Commission. (2023b). Questions and Answers: Digital Services Act.

European Commission. (2023c). The cooperation framework under the Digital Services Act.

European Commission. (n.d.a). The Digital Service Act (DSA) explained: measures to protect children and young people online. 

European Commission. (n.d.b). Welcome to the DSA Transparency Database! 

Gain, V. (2023, August 25). The DSA is here and Big Tech needs to fall in line. Silicon Republic.

Gill, J. (2022, July 12). Can an EU law save children from harmful content online? Reuters.

Howlin, G. (2023, February 29). Ireland is on the frontline of the battle against misinformation. The Irish Times.

Madden, M. (2023). DSA Transparency Database to Apply to All Online Platforms. Mason Hayes & Curran.

McGowran, L. (2023, August 4). DSA: TikTok will let EU users turn off personalised content. Silicon Republic.

McGrath, C. (2023, September 29). What is online harm: And how do we define it? RTE.

Regulation 2022/2065. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (Digital Services Act).

RTE. (2023a, November 10). Meta and Snap must detail child protection measures by December 1 – EU.

RTE. (2023b, August 24). Big tech braces for roll-out of EU's Digital Services Act.

Rughiniș, R., Rughiniș, C., Vulpe, S. N., & Rosner, D. (2021). From social netizens to data citizens: Variations of GDPR awareness in 28 European countries. Computer Law & Security Review, 42, 105585.

The Journal. (2023). EU opens probes into YouTube and TikTok over child protection measures on platforms.

TheLegal500. (2023, November 6). User Content Moderation Under the Digital Services Act – 10 Key Takeaways.

Vigliarolo, B. (2023, August 25). Europe's tough new rules for Big Tech start today. Is anyone ready? The Register.

