The European Commission has announced preliminary findings that Meta is violating the Digital Services Act (DSA) by failing to effectively prevent children under 13 from accessing Instagram and Facebook. While Meta’s terms of service explicitly prohibit users under this age, the EU’s executive body argues that the company’s current safety measures are insufficient and easily bypassed.
Weak Safeguards and Easy Workarounds
According to the Commission, Meta’s primary method for age verification—requiring users to input a birth date during account creation—is fundamentally flawed. Because there are no additional checks to verify the accuracy of this information, minors can simply enter a false date to create an account. This loophole remains open even for “Teen Accounts,” which are designed for users under 16 and also require age input without independent verification.
The core issue is not just the existence of rules, but their enforcement. Without robust verification, self-reported age data is meaningless.
Ineffective Reporting Mechanisms
The Commission also criticized Meta’s user reporting tools as cumbersome and ineffective. To report a suspected user under 13, a person must click through as many as seven screens to reach the reporting form. Even when reports are successfully filed, the EU found that there is often no proper follow-up. Consequently, reported accounts frequently remain active without further scrutiny, allowing the alleged minors to continue using the platform.
Potential Penalties and Future Risks
If these preliminary findings are confirmed, Meta could face fines proportionate to the infringement. Under the DSA, penalties can reach up to 6% of a company’s total worldwide annual turnover. Given Meta’s reported revenue of roughly $200 billion in 2025, a maximum fine could amount to about $12 billion.
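The fine ceiling described above reduces to a one-line calculation; a minimal sketch, assuming the 6% cap is applied to the full revenue figure cited here (the function name is illustrative, and any actual penalty would be set by the Commission case by case):

```python
def dsa_max_fine(annual_turnover: float, cap_rate: float = 0.06) -> float:
    """Maximum DSA fine: up to 6% of total worldwide annual turnover."""
    return annual_turnover * cap_rate

# Using the $200 billion revenue figure cited above:
print(f"${dsa_max_fine(200e9) / 1e9:.0f} billion")  # → $12 billion
```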
While Meta introduced age-appropriate content filters for Teen Accounts in 2025, including default restrictions on content rated above PG-13, these measures do not address the root problem: the initial creation of accounts by underage users. The EU’s stance reflects a growing regulatory focus on proactive prevention rather than reactive content filtering.
Conclusion
The European Commission’s findings underscore a significant gap between Meta’s stated policies and its technical implementation. Until Meta develops more robust age verification systems and improves its reporting infrastructure, it remains at risk of substantial financial penalties and continued regulatory scrutiny.