Meta has been found in breach of European Union law for failing to provide simple and effective ways for users to report illegal content on Facebook and Instagram. The European Commission said the platforms created unnecessary steps that could confuse and dissuade users from submitting complaints.
In a preliminary finding released on Friday, the EU’s executive body said Meta’s reporting mechanisms used “dark patterns” that complicated the process of flagging illegal content, including child sexual abuse material and terrorist-related posts. The commission said this may render Meta’s content removal system ineffective. Meta denies any violation of the Digital Services Act (DSA).
“When it comes to Meta, neither Facebook nor Instagram appear to provide a user-friendly ‘notice and action’ mechanism for flagging illegal content,” the commission said.
A senior EU official noted that the case also touches on freedom of expression, since overly complex moderation can inadvertently limit speech. Previous concerns have included allegations of “shadow banning,” where platforms’ algorithms quietly demote content without notifying the user.
The commission found that Meta’s appeal system for blocked content or suspended accounts also lacks transparency. Users cannot adequately explain their case or provide evidence to support appeals, limiting the effectiveness of the mechanism.
The investigation highlighted safety concerns, particularly for young users. Last month, a Meta whistleblower claimed that most new safety tools on Instagram were ineffective, leaving children under 13 vulnerable. Meta said parents have robust tools, including mandatory teen accounts and a new PG-13-style control system to monitor usage.
Simplifying the reporting system could also help platforms combat disinformation. The commission cited an example of a deepfake video in Ireland falsely claiming a presidential candidate was withdrawing from an election.
The investigation is ongoing and coordinated with Ireland’s digital services regulator, Coimisiún na Meán, since Meta’s EU headquarters are in Dublin. The EU also preliminarily found that TikTok and Meta failed to provide researchers with sufficient access to public data, limiting oversight on minors’ exposure to harmful content.
“Allowing researchers access to platform data is an essential transparency obligation under the DSA,” the commission said, noting its importance for public scrutiny of social media’s impact on health and safety.
The platforms now have an opportunity to address the commission’s findings before facing fines of up to 6% of their total global annual turnover, along with potential periodic penalty payments to enforce compliance.
Henna Virkkunen, the EU’s commissioner for tech sovereignty, security and democracy, emphasized: “Our democracies depend on trust. Platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.”
Meta stated it disagrees with the EU’s preliminary findings and is negotiating with the commission. The company said it has improved reporting, appeals, and data access since the DSA came into effect.
TikTok said full data sharing could conflict with GDPR rules, but it has granted research access to nearly 1,000 teams and is reviewing the EU’s findings. The platform urged regulators to clarify how DSA and GDPR obligations should be reconciled.
The EU’s findings underscore the need for transparency, accountability, and effective user tools to report harmful or illegal content, aiming to protect both safety and freedom of expression across social media platforms.