The European Commission has opened a formal investigation into Snapchat for possible violations of the Digital Services Act (DSA) related to the online safety of minors. The probe follows concerns that the platform may expose children to grooming, recruitment for criminal activities, and content about illegal or age-restricted products such as drugs, vaping devices, and alcohol.
Five areas of concern
The investigation focuses on five key areas:
- Age verification: Snapchat requires users to be at least 13, but relies solely on self-declaration, which is considered insufficient to prevent children under 13 from accessing the platform and to identify users under 17.
- Protection against grooming and recruitment: The platform's safeguards may be inadequate to stop adults from building trust with minors for sexual exploitation or recruitment into criminal activities, often while concealing their age or posing as minors themselves.
- Default account settings: Children are automatically suggested to other users, and privacy tools are not clearly presented.
- Content on illegal products: Existing moderation tools appear ineffective at limiting minors' exposure to content promoting age-restricted or illegal products.
- Reporting mechanisms: Procedures for reporting illegal content are hard to access and may use “dark patterns” to hinder reporting.
Next steps
The investigation builds on a probe initiated by the Dutch Authority for Consumers and Markets (ACM) in September 2025. The European Commission may impose temporary measures, issue a non-compliance decision, or accept commitments from Snapchat to remedy identified violations.