The European Commission has issued a preliminary finding that Meta may be violating the Digital Services Act (DSA), citing serious shortcomings in how its platforms—Instagram and Facebook—protect children under the age of 13.
Although Meta’s own rules prohibit users younger than 13 from joining, regulators say the company’s safeguards are largely ineffective. Children can easily bypass age restrictions by entering false birth dates during sign-up, with no robust verification systems in place to detect or prevent this.
The Commission also flagged flaws in Meta’s reporting tools. Users attempting to report underage accounts must navigate a cumbersome process that can take several steps, and even then, reports often fail to trigger meaningful action. In many cases, accounts suspected to belong to minors remain active without further checks.
At the heart of the issue is what regulators describe as an inadequate risk assessment by Meta. The company’s internal analysis appears to underestimate the scale of the problem, despite evidence across the European Union suggesting that roughly 10–12% of children under 13 are using these platforms.
Experts have long warned that younger users are particularly vulnerable to online harms, including exposure to inappropriate content and privacy risks—concerns the Commission says Meta has not fully addressed.
Growing pressure under EU social media rules
This case highlights the broader impact of the Digital Services Act, one of the world’s strictest frameworks governing online platforms. The DSA requires companies to proactively assess and mitigate risks, especially those affecting minors, and to ensure high standards of privacy, safety, and transparency.
Across the EU, social media policy is increasingly focused on child protection. Platforms are expected to implement stronger age verification tools, limit targeted advertising to minors, and design safer default settings. Regulators are also pushing for clearer accountability, requiring companies to demonstrate how their systems reduce harm rather than simply reacting to complaints.
What happens next
Meta now has the opportunity to review the Commission’s findings and respond formally. It may also introduce corrective measures aligned with upcoming 2025 DSA guidelines on child safety. Meanwhile, the European Board for Digital Services will be consulted as part of the process.
If the Commission ultimately confirms its preliminary conclusions, Meta could face significant penalties. Under the DSA, fines can reach up to 6% of a company’s global annual turnover, along with additional daily penalties for continued non-compliance.
The investigation remains ongoing, and no final decision has been made.
