The European Union has introduced new regulations to curb the spread of illegal and harmful content on large digital platforms. The rules, which came into force on Friday, aim to make online services more accountable and transparent.
What are the new rules?
The new rules are part of the EU’s Digital Services Act (DSA), which was adopted in 2022. The DSA applies to 19 very large online platforms and search engines (VLOPs) that have more than 45 million monthly users in the EU. These include social media platforms like Facebook, Instagram, TikTok, X (formerly Twitter), YouTube, Snapchat, LinkedIn and Pinterest; online marketplaces like Amazon, Booking, AliExpress, Zalando and Google Shopping; as well as Wikipedia, Google Maps, Google and Apple’s app stores, and Google Search and Microsoft’s Bing.
The DSA imposes five main obligations on these VLOPs:
- Remove illegal content: VLOPs have to “expeditiously” remove illegal content – as defined by European and national EU laws – when notified by national authorities or individuals, and have to provide clear and easy mechanisms for users to flag content they believe is illegal. Platforms also have to suspend users who frequently post illegal content, but only after giving them a warning. Online marketplaces have to make their “best efforts” to vet their online traders in a bid to stamp out illegal products – everything from fake luxury shoes to dangerous toys. If they learn that consumers have bought an illegal product, they have to warn them or, if they cannot reach them directly, make the information public on their website.
- Keep a lid on harmful content like disinformation and bullying: VLOPs have to assess and mitigate the major risks their platforms pose for society, such as the spread of disinformation, hate speech, cyberbullying and manipulation of public opinion. They have to report on the measures they take to address these risks every six months and submit them to an independent audit every year. They also have to cooperate with researchers and civil society organizations that monitor online harms.
- Empower their users: VLOPs have to give their users more control and choice over the content they see and share online. They have to let users opt out of personalized recommendations and advertisements, as well as provide them with tools to customize their online experience. They also have to inform users about the main parameters that determine the ranking and display of content on their platforms.
- End certain targeted advertisements: VLOPs are banned from using certain types of targeted advertising that rely on pervasive tracking of users’ online behavior, such as micro-targeting based on political opinions or health data. They also have to clearly label and distinguish advertisements from other content on their platforms.
- Reveal closely guarded information about how they work: VLOPs have to disclose important information about their business practices and algorithms to regulators, researchers and users. They have to provide access to data on the content they host, moderate and recommend, as well as explain how their algorithms work and what effects they have on users and society. They also have to notify regulators of any planned changes to their services that may affect users’ rights or public interests.
What are the consequences for non-compliance?
The European Commission will be in charge of enforcing the DSA, with the support of national watchdogs in countries where VLOPs have their European headquarters, such as Ireland. The Commission will be able to issue fines of up to 6 percent of a company’s annual global revenue for serious breaches of the rules. It could also temporarily ban a VLOP from operating within the bloc in exceptional cases of repeated or systemic non-compliance.
The Commission will also set up a European Board for Digital Services, composed of representatives from national authorities and experts, to advise and assist in the implementation of the DSA. The board will also facilitate cooperation and information-sharing among regulators across the EU.
The VLOPs will also have to pay a fee of up to 0.05 percent of their global revenues to fund the Commission’s enforcement work.
How have Big Tech companies reacted?
The DSA has been welcomed by some Big Tech companies as a way to harmonize online rules across the EU and create a level playing field for all digital players. For instance, Google said it supports “the DSA’s proportionate approach that puts responsibility where it matters most” and that it looks forward “to working with policymakers as they implement this important legislation”.
However, some Big Tech companies have also expressed concerns about the potential impact of the DSA on their business models and innovation. For example, Facebook said it is worried that “some aspects of the DSA could inadvertently undermine people’s ability to express themselves online” and that it hopes “the final legislation will strike a balance between protecting people’s rights and enabling innovation”.
The DSA has also faced criticism from some civil society groups and activists who argue that it does not go far enough in addressing the power and influence of Big Tech companies over online spaces. They claim that the DSA relies too much on self-regulation and voluntary measures by the platforms, and that it does not provide enough safeguards for users’ rights and freedoms.