Factual Background and Proceedings
In July 2024, the Amsterdam District Court delivered a landmark judgment addressing the practice of “shadowbanning” under the EU Digital Services Act (DSA). The case was brought by Danny Mekić, a privacy expert and academic, against the social media platform X.com (formerly known as Twitter). Mekić noticed that after posting critical comments about new EU child protection legislation, his profile was no longer discoverable via the platform’s search function. Followers alerted him that his account was missing from search results, yet X provided no notification or explanation.
After repeated requests, X eventually informed Mekić that his account had been subject to a temporary visibility restriction due to an automated moderation process. This restriction was later lifted, but Mekić pursued legal action. He sought, among other things, a declaratory judgment that X had breached its contract, compensation for failing to deliver the paid Premium service, and an order requiring X to be transparent about such moderation decisions in the future.
Legal Assessment
The court focused on two central questions:
Does restricting the discoverability of an account (via search delisting or shadowbanning) constitute a breach of X’s contractual obligations?
Does this measure fall under the transparency requirements of the DSA, particularly Article 17?
The court found that discoverability via the search function is an essential feature of the service X provides to its users. Unilateral restrictions on this feature, without valid reason and without clear communication, violate both consumer protection law (Unfair Terms Directive) and the contractual terms agreed with the user. The court referenced the DSA, which obliges platforms to clearly outline their moderation practices in their terms of service and to proactively inform users of any actions that limit the visibility of their content.
Article 17 DSA specifically requires platforms to provide users with a clear and specific statement of reasons for any restriction on visibility, including the nature of the measure, the legal basis, and available avenues for appeal. The court held that X failed to meet this obligation by not providing timely or substantive information to Mekić. Although the damages awarded were symbolic, the ruling is legally significant: it marks the first time a European court has awarded damages for a DSA violation related to shadowbanning.
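The informational elements Article 17 requires can be pictured as a simple data model. The sketch below is purely illustrative; the field names are assumptions for exposition and do not reflect an official DSA schema or any platform's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class StatementOfReasons:
    """Illustrative sketch of the elements a platform's Article 17 DSA
    statement of reasons should cover. Field names are assumptions,
    not an official schema."""
    measure: str                       # nature of the restriction, e.g. "search delisting"
    facts_and_circumstances: str       # what triggered the measure
    automated_detection: bool          # whether automated means were used
    legal_or_contractual_ground: str   # legal basis or terms-of-service clause relied on
    redress_options: list = field(default_factory=list)  # available avenues for appeal

    def is_complete(self) -> bool:
        # A notice missing any required element (or, as in the Mekić case,
        # no notice at all) would not satisfy Article 17.
        return all([self.measure, self.facts_and_circumstances,
                    self.legal_or_contractual_ground, self.redress_options])

notice = StatementOfReasons(
    measure="temporary search delisting",
    facts_and_circumstances="account flagged by an automated moderation process",
    automated_detection=True,
    legal_or_contractual_ground="terms of service (hypothetical clause for illustration)",
    redress_options=["internal complaint handling (Art. 20 DSA)",
                     "out-of-court dispute settlement (Art. 21 DSA)",
                     "judicial redress"],
)
print(notice.is_complete())  # → True
```

The point of the sketch is structural: each required element is a mandatory field, so an empty or after-the-fact explanation, like the one X initially failed to provide, is detectably incomplete.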
Implications for Very Large Online Platforms (VLOPs)
This judgment has far-reaching consequences for Very Large Online Platforms (VLOPs) subject to the DSA. Key takeaways:
Transparency Obligation: Platforms must inform users immediately and substantively about any visibility restriction (such as shadowbanning, algorithmic demotion, or search delisting). Explanations provided only upon request or after the fact are insufficient.
Contractual Protection: Essential features like discoverability cannot be restricted without clear, pre-established reasons. Terms of service that allow platforms to unilaterally alter core functions are void if they do not comply with EU consumer protection standards.
Legal Remedies and Compensation: Users can enforce DSA compliance in court and may claim damages for unjustified or non-transparent moderation.
Enforcement at Scale: The ruling highlights that not only individual users, but also regulators, auditors, and researchers, play a role in enforcing compliance among large platforms. The DSA explicitly provides tools for such oversight.
Conclusion
The Amsterdam case makes it unequivocally clear that shadowbanning without transparency, and without promptly notifying the affected user, is incompatible with the DSA. Very Large Online Platforms are now legally required to make their moderation practices fully transparent and to actively inform users of any restriction on the visibility of their content. Failure to comply can result in civil liability, damages, and regulatory scrutiny. The DSA thus drives a fundamental shift in how user rights and transparency are handled in the European digital sphere.
Martin Arduino
International Legal Affairs