The Digital Services Act Package: A Primer
Niamh Kirk, Elizabeth Farries, Kalpana Shankar, Eugenia Siapera.
On December 15th, 2020, the European Commission submitted its legislative proposal for digital services to the Council of the European Union and the European Parliament. The proposal has two components, the Digital Services Act (DSA) and the Digital Markets Act (DMA). Together, these constitute the DSA Package. Once the Package is approved, typically within a year and a half of its submission, it will apply to the whole of the European Union. But what is the DSA Package, and what will it mean for European citizens and for Ireland in particular? In our analysis at the UCD Centre for Digital Policy, we find that while it represents an important step forward, there are issues that require more clarity and perhaps a firmer steer by the EC.
The DSA is more comprehensive than any previous EU legislation governing the digital world and addresses a range of issues, such as content moderation, monetisation, competition and accountability. The European Commission under Ursula von der Leyen has made the digital world a priority with the ‘A Europe Fit for the Digital Age’ initiative and its dual purpose: to assert and strengthen Europe’s digital sovereignty, that is, its capacity to develop innovative new technologies; and to set its own standards, which derive from Europe’s commitment to fundamental rights for citizens and a competitive free market. The DSA Package constitutes an example of co-regulation, where the regulatory body (the EU in this instance) sets the framework for the operation of the tech industry, but the industry itself is responsible for developing implementation rules and enforcement mechanisms and for delivering self-assessment reports to regulators. It is, in this sense, a light-touch approach.
In particular, the Digital Services Act Package is the first proposed major overhaul of digital media regulation since the e-Commerce Directive was passed in 2000. The two main goals of the DSA Package are, first, to protect digital services users and ensure that their fundamental rights are respected; and second, to establish a level playing field that fosters innovation, growth, and competitiveness. The DSA Package is expected to affect a wide range of businesses, from global giants like Facebook to small start-ups in almost every sector. The DSA would replace the e-Commerce Directive, which focused on creating an environment that facilitates the growth of digital companies. To achieve this, the Directive contains a liability exemption for platforms hosting or transmitting third-party content. In recent years, experts have argued that platforms must take some responsibility for hosting problematic third-party content, which includes disinformation, hate speech, deceitful links (scams), and other types of harmful content. The Digital Services Act Package therefore aims to address the problems that have emerged from the growth of some companies to near-monopoly status and to ensure good governance.
The first part of the Package is the Digital Services Act (DSA), which addresses platform practices in terms of content management and distribution. The DSA requires companies to take a more active role in monitoring and responding to issues such as political disinformation campaigns or hate speech, and imposes financial penalties if platforms are in breach. These fines can be up to 6% of the company’s global revenue. The DSA also requires that platforms provide more transparency to users; for example, more information about advert microtargeting will be provided so users understand why a particular ad appears in their feeds. The DSA aims to introduce more accountability for platforms and their practices around content removal. This mainly concerns very large platforms, which are required to proactively mitigate systemic risks that enable disinformation or other harmful content to spread. In this, the DSA complements the updated Code of Practice on Disinformation, which is part of the European Democracy Action Plan.
The second part of the Package is the Digital Markets Act (DMA), which focuses on companies’ roles as ‘gatekeepers’ between businesses and consumers. Here the focus is on ‘levelling the playing field’ and countering the oligopolies set up by large platforms. This is accomplished mainly through stiff fines for anti-competitive practices, which can be up to 10% of the company’s global revenue. For example, a search engine like Google cannot prioritise its own services ahead of a third-party business in search results. In online marketplaces, ‘own brand’ items cannot be prioritised ahead of third-party products. A second important stipulation of the DMA is to counter illegal trade and increase business transparency. For example, new online businesses will be required to provide much more detailed information, which can help authorities identify and prevent sales of illegal goods.
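To make these penalty ceilings concrete, here is a minimal illustrative calculation. The 6% (DSA) and 10% (DMA) caps come from the proposals; the revenue figure and the function itself are hypothetical:

    # A minimal sketch of the fine ceilings in the DSA Package (illustrative only).
    def max_fine(global_annual_revenue_eur: float, cap_rate: float) -> float:
        # A fine may not exceed this share of the company's global annual revenue.
        return global_annual_revenue_eur * cap_rate

    revenue = 80e9  # hypothetical platform: EUR 80 billion in global annual revenue
    print(max_fine(revenue, 0.06))  # DSA ceiling: EUR 4.8 billion
    print(max_fine(revenue, 0.10))  # DMA ceiling: EUR 8.0 billion

These figures are upper bounds; the actual fine imposed in any given case would depend on the nature and gravity of the infringement.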
The DSA Package: Implications for Platforms
For platforms, the implementation of the DSA Package means two things. First, they will have to adjust their practices in ways that enhance rather than stifle competitiveness and innovation and that allow smaller companies to grow (DMA). Second, they will have to operate with clear and transparent rules and be accountable to their users (DSA). Additionally, the DSA Package aims to harmonise platforms’ responsibilities across the EU and improve transparency for users and researchers. The new rules apply differently to platforms of different sizes.
Very large platforms, defined as those with a user base that reaches at least 10% of the EU population, or 45 million people, are addressed as ‘gatekeepers’ because they have “a central role in facilitating the public debate and economic transactions”. Very large platforms are considered to pose a higher risk than smaller, more niche platforms and would be subject to specific obligations regarding risk management. This means Google, Facebook and Twitter will have to ramp up their reporting and open more windows into their operations. They will need to become more transparent and provide information on the recommender algorithms that select and present information to users in search results and social media feeds. Very large platforms will also have to arrange for independent investigators and auditors to access and examine their algorithms, recommender systems, and content moderation practices to verify compliance. They will also be required to appoint compliance officers and to cooperate with authorities in the event of a crisis. Further, the obligations under the DSA require enhanced measures to address illegal content, such as working with ‘trusted flaggers’ to identify and report content. Micro and small companies will still have some obligations under the DSA, but these will not be as extensive as those of big tech, with its greater resources; rather, obligations will be proportionate to a platform’s size and capacity.
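As a purely illustrative sketch of how the size threshold works (the 45 million figure appears in the proposal; the constant and function names here are our own):

    VLOP_THRESHOLD = 45_000_000  # roughly 10% of the EU population

    def is_very_large_platform(avg_monthly_eu_users: int) -> bool:
        # The DSA's heaviest risk-management obligations apply above this size.
        return avg_monthly_eu_users >= VLOP_THRESHOLD

    print(is_very_large_platform(300_000_000))  # True: a Facebook-scale platform
    print(is_very_large_platform(2_000_000))    # False: lighter, proportionate duties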
The DSA Package: Implications for Citizens
These enhanced obligations on digital service providers aim to improve the digital environment for users. The DSA attempts to crack down on illegal activities online and protect citizens from harm while protecting fundamental rights, including freedom of expression and the right to privacy. It is a challenging balancing act. Currently, platforms decide what types of content or accounts to take down. Companies such as Facebook or YouTube can remove communities and individuals without any accountability or any need to explain who was removed and why. The Act requires digital platforms to be more transparent about what they take down and why, and to allow users to challenge content moderation decisions such as takedowns.
But how does the DSA address illegal or harmful content such as hate speech and disinformation? It retains the exemption from liability for online platforms for content posted by users. However, platforms must adhere to certain obligations regarding risk management and due diligence. Under the DSA, users should have enhanced mechanisms to report illegal content on social media. Platforms will be required to respond within set timeframes and will be subject to penalties if they fail to meet targets. In this respect, the main provisions are to strengthen the Code of Practice on Disinformation and the Code of Conduct on illegal content. In other words, the DSA does not go so far as to define what illegal and harmful content is; those definitions are contained in other EU and national legislation.
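As a purely illustrative sketch of what such a notice-and-response obligation implies in practice (the field names and the one-day window below are hypothetical; the DSA leaves the exact timeframes to be specified elsewhere):

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class IllegalContentNotice:
        content_id: str     # the item a user has reported
        reason: str         # the alleged illegality, e.g. hate speech
        filed_at: datetime  # when the report was submitted

    RESPONSE_WINDOW = timedelta(days=1)  # hypothetical response deadline

    def response_deadline(notice: IllegalContentNotice) -> datetime:
        # The platform must act on the notice by this time or risk penalties.
        return notice.filed_at + RESPONSE_WINDOW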
The DSA Package and Ireland
While the DSA aims to harmonise the approach across the EU by providing a common standard throughout Europe, national-level legislation remains an important factor in the regulation of digital services. If national legislation contains definitions of illegal or harmful content, then these will apply. For example, in Ireland, the Harassment, Harmful Communications and Related Offences Act prohibits the sharing of intimate images without consent; platforms are required to comply with this law even if it is not covered specifically in the DSA. National authorities therefore have an important role to play. The Data Protection Commission will still be responsible for regulating digital platforms based in Ireland with respect to individuals’ data under the GDPR.
Once the DSA Package receives approval, the key task for Ireland will be to develop the appropriate offices, departments, roles and resources that centralise key responsibilities for digital services and that address the key objective of harmonisation across Europe under the DSA. It is not yet clear how this will be achieved. The DSA requires each state to designate a Digital Services Coordinator (DSC), an independent authority to supervise the compliance of online platforms established in the state. In Ireland, the recent Online Safety and Media Regulation Bill (OSMR) covers similar ground and makes provision for an Online Safety Commissioner. The OSMR includes plans for a new Media Commission concerned with the transposition of the Audiovisual Media Services Directive, which will regulate audiovisual media and on-demand services such as Netflix and YouTube. The OSMR will have to operate in tandem with the DSA. However, in the EU, national law has to comply with European law; where the two conflict, European legislation overrides national law. Since both pieces of legislation are currently under discussion, we will have to wait and see whether any conflicts arise.
Will it Work?
The DSA Package is a welcome and long-awaited regulatory framework for the digital world. In the twenty years or so since the e-Commerce Directive, the importance of platforms and other digital services has increased manifold. As we have seen in recent years, platforms increasingly shape economic, political, social and cultural life. Until now, they have been operating outside any clear regulatory framework. The time is ripe, therefore, for political and state authorities to assert some control over the ways in which platforms and other digital actors operate. The key goals of the DSA are to protect fundamental rights and to foster innovation, growth and competitiveness by levelling the digital playing field. But these two goals are already in tension: because the DSA moves to a model of co-regulation with the platforms, it may still privilege platforms in taking decisions that adversely affect citizens’ fundamental rights.
A second source of tension is between the emphasis on online safety and illegal content and the fundamental right to freedom of expression. Despite providing a regulatory framework, the DSA remains vague on what constitutes illegal and harmful speech, preferring to defer to national authorities to determine this. While some forms of illegal speech are clearly defined, for example through the 2008 EU Framework Decision on combating certain forms and expressions of racism and xenophobia, platforms themselves determine what constitutes harmful content. Moreover, while the DSA asks platforms to set up notice-and-takedown procedures, for the most part platforms still monitor the implementation themselves. Once again, regulation takes a light-touch approach that does not guarantee citizens’ fundamental rights both to freedom of expression and to protection from harmful content. The overall framework seems to cede too much to platforms, whose operational concerns and capacities will ultimately determine how content is treated. For example, as the freedom of expression organisation ARTICLE 19 notes, it may be easier for platforms simply to remove content in order to avoid any liability.
The DSA Package is due to be discussed by the Council of the European Union and the European Parliament before being approved. No doubt some of the proposals will be amended. Indeed, gaps in what abuses of digital services are addressed, and how, are already emerging, and national-level regulations can only go so far in filling them. Similarly, there is a lack of clarity over definitions and terms, which need to be refined to ensure that the DSA captures the wide-ranging activities that constitute harmful content. The period while the DSA is under discussion is important for civil society actors to ensure that their concerns over harmful content, disinformation and freedom of expression are taken into account. For example, organisations that represent communities targeted by hate speech may push for a firmer definition of harmful content beyond content deemed illegal, while civil liberties organisations may seek clear guarantees on freedom of expression.
Nonetheless, substantial change is on the horizon for the platforms headquartered in Dublin. Irish businesses that operate online will have a more equitable shot at visibility across the EU market, as well as new responsibilities to meet. Lastly, Irish users will enjoy a safer online environment and enhanced avenues for redress if and when something goes wrong.