EU’s Digital Services Act (DSA)
Context: The European Parliament and European Union (EU) Member States announced that they had reached a political agreement on the Digital Services Act (DSA), landmark legislation that would force big Internet companies to act against disinformation and against illegal and harmful content, and to protect internet users.
- The proposed Act will work in conjunction with the EU’s Digital Markets Act (DMA), which was approved in March 2022.
Key provisions of the DSA
Instead of letting platforms decide how to deal with abusive or illegal content, the DSA will lay down specific rules and obligations for intermediary companies to follow.
- Faster Removal: Online platforms and intermediaries such as Facebook, Google and YouTube will have to add “new procedures for faster removal” of content deemed illegal or harmful.
- Informed decisions: Further, these platforms will have to clearly explain their policy on taking down content; users will be able to challenge these takedowns as well.
- Flagging Illegal content: Platforms will need to have a clear mechanism to help users flag content that is illegal. Platforms will have to cooperate with “trusted flaggers”.
- Systemic Analysis: The DSA adds “an obligation for very large digital platforms and services to analyse systemic risks they create and to carry out risk reduction analysis”. Platforms like Google and Facebook will have to carry out this audit every year.
- Independent Audit: The Act proposes to allow independent vetted researchers to access public data from these platforms to carry out studies that help understand these risks better.
- Ban on Dark Patterns: The DSA proposes to ban ‘Dark Patterns’ or “misleading interfaces” that are designed to trick users into doing something that they would not agree to otherwise. This includes forcible pop-up pages, giving greater prominence to a particular choice, etc.
- Crisis Situation: The DSA incorporates a new crisis mechanism clause, drafted with reference to the Russia-Ukraine conflict, which can be activated by the Commission and will remain in force for three months, during which special measures will be imposed.
- Transparency: It also proposes “transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users”.
- Protection of Minors: The law proposes stronger protection for minors, and aims to ban targeted advertising for them based on their personal data.
- Consumer convenience: Finally, it says that cancelling a subscription should be as easy as subscribing.
- Penal Provisions: Penalties for breaching these rules could be huge — as high as 6% of the company’s global annual turnover.
Does this mean that social media platforms will now be liable for any unlawful content?
- It has been clarified that the platforms and other intermediaries will not be liable for the unlawful behaviour of users, so they still enjoy ‘safe harbour’ in some sense.
- However, if the platforms are “aware of illegal acts and fail to remove them,” they will be liable for this user behaviour.
Connecting the dots: