

    The European Parliament and Council Reach Agreement on Digital Services Act

    April 27, 2022

    On 23 April 2022, the European Parliament and the EU Member States reached a political agreement on the DSA proposal (put forward by the Commission in December 2020). The proposal is now in the final stage of the legislative process, subject to formal approval by the two co-legislators.

    The DSA sets out a framework for content moderation and a new standard for accountability of online platforms regarding illegal and harmful content.

    Once formally adopted by the European Parliament and the Council, the DSA will be directly applicable across the EU fifteen months after its entry into force, or from 1 January 2024, whichever is later. For very large online platforms and very large online search engines, however, the DSA will apply four months after entry into force. Non-compliance with the DSA’s requirements exposes online intermediaries to fines of up to 6% of global turnover and, in the case of repeated violations, a ban on operating in the EU.

    The DSA will require tech companies, and social platforms in particular, to remove illegal content and goods more quickly and proactively, to be transparent with users and explain to users and researchers how their algorithms work, and to take stricter action against the spread of misinformation.

    The DSA shouldn’t be confused with the Digital Markets Act (“DMA”), which was agreed upon in March. Both acts affect the tech world, but the DMA focuses on creating a level playing field between businesses while the DSA deals with how companies moderate and regulate content on their platforms. The DSA will, like the DMA, distinguish between tech companies of different sizes, placing greater obligations on bigger companies. The largest firms, those with at least 45 million users in the EU, will face the most scrutiny.

    The fundamental requirements of the DSA include the following:

    • Implementation of measures to counter illegal goods, services, or content online, such as a mechanism for users to easily flag such content and for platforms to cooperate with ‘trusted flaggers’. A trusted flagger can be either an authority or a user with proven expertise in detecting illegal content.
    • Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
    • Implementation of measures to empower users, including the possibility to challenge platforms’ content moderation decisions via ADR or judicial redress.
    • Targeted advertising based on an individual’s religion, sexual orientation, ethnicity, or other sensitive data is banned, and minors may not be subject to targeted advertising at all (note that the relevant age of minority in the EU is 16, as opposed to 13 in the US).
    • Implementation of measures to assess and mitigate risks, such as obligations for very large platforms and online search engines to take risk-based action and undergo independent audits of risk management systems and mechanisms to adapt in reaction to crises affecting public security or health.
    • Provision of key data of the largest platforms and search engines to researchers, to help understand how online risks evolve.

    We will post additional information as the effective date approaches.

    APM Technology and Regulation Team