The European Commission has formally opened an investigation into Snapchat, examining whether the social media platform is adequately protecting children in line with the
Digital Services Act (DSA). The move signals growing scrutiny of how major platforms safeguard minors online.
Regulators suspect Snapchat may be falling short in several areas, potentially exposing young users to harmful interactions such as grooming and recruitment into criminal activity, as well as to content promoting illegal or age-restricted products such as drugs, alcohol, and vaping devices.
Key areas of investigation
1. Weak age verification measures
The Commission is questioning Snapchat’s reliance on self-declared age verification. While the platform requires users to be at least 13 years old, authorities believe this system fails to effectively block underage users or ensure appropriate protections for those under 17. Concerns also include the apparent lack of accessible tools for reporting underage accounts.
2. Risks of grooming and criminal exploitation
Officials fear that Snapchat does not sufficiently shield minors from harmful contact. Adults may exploit loopholes—such as falsifying their age—to pose as teenagers, potentially exposing children to sexual exploitation or criminal recruitment.
3. Default settings not child-safe
The platform’s default account settings are also under scrutiny. Features such as automatic friend recommendations and push notifications enabled by default may expose minors to unnecessary risk. Additionally, users reportedly receive little guidance on adjusting privacy and safety settings during account setup.
4. Spread of illegal product content
Under the DSA, very large online platforms must assess and mitigate systemic risks. The Commission suspects Snapchat’s moderation systems are not effectively preventing content related to illegal goods or restricted products from reaching users, including minors.
5. Flawed reporting systems
The investigation will also assess whether Snapchat’s reporting tools for illegal content are user-friendly. Authorities are concerned that the system may rely on “dark patterns” (design techniques that discourage or complicate reporting) and may fail to clearly inform users of their complaint and appeal options.
What happens next
The European Commission will now conduct a detailed investigation, gathering evidence through requests for information, interviews, and inspections. The process could lead to enforcement actions, including fines, compliance orders, or agreed commitments from Snapchat to address identified issues.
The probe also builds on earlier efforts by national regulators, including the Dutch Authority for Consumers and Markets, which previously examined the sale of vapes to minors via the platform.
Broader context
This investigation is part of the EU’s broader push to enforce the Digital Services Act, particularly its provisions aimed at protecting minors online. The Commission has relied on its 2025 guidelines, which emphasize stricter age verification, safer default settings, and reduced visibility of minors to potentially harmful actors.

Photo by Justraveling.com, Wikimedia Commons.
