Meta failed to implement necessary safeguards to block children under 13 from Instagram and Facebook, in violation of a European Union law on online safety, officials said Wednesday.
Meta does not have an adequate system to identify and remove the accounts of children who flout the social media giant’s age limits, the European Commission, the executive branch of the European Union, said in a preliminary ruling. Without changes, Meta could face fines and other penalties.
European regulators are aggressively cracking down on social media companies over child safety. Snap and TikTok have also been targeted by regulators in Brussels, while the governments of Spain, France and Denmark are among those considering new rules to stop young people from using social media.
Regulators said Meta appears to violate the Digital Services Act, a law passed in 2022 to force social media companies to police their platforms more aggressively. The company was found to lack effective controls to verify the accuracy of a person’s self-declared date of birth when creating an account, making it easy to circumvent rules intended to keep children under 13 away from social media sites.
Regulators said Meta’s tool for reporting minors is “difficult to use and inefficient,” with up to seven steps required to access the necessary form. When a minor is flagged as under 13, the company often doesn’t follow up and the user can continue using the service without any review, regulators said.
Across the European Union, regulators say, about 10 to 12 percent of children under 13 access Instagram and Facebook.
“Instagram and Facebook are doing very little to prevent children under this age from accessing their services,” Henna Virkkunen, the European Commission’s executive vice president for technological sovereignty, security and democracy, said in a statement. “Terms and conditions should not be mere written statements, but rather the basis for concrete actions aimed at protecting users, including children.”
The European Union, along with several countries in the 27-nation bloc, is exploring new online age verification tools to prevent young people from accessing certain content.
Meta said it disagreed with the commission’s findings, calling age verification an “industry-wide challenge.”
“We are clear that Instagram and Facebook are intended for people aged 13 and over and we have put measures in place to detect and remove the accounts of anyone under this age,” the company said in a statement. “We continue to invest in technologies to find and remove underage users and will have more to share next week on additional measures that will be rolled out soon.”
Europe has for more than a decade been the world’s strictest regulator of the technology industry on issues of privacy, anti-competitive business practices and illegal online content. Authorities continued their investigations into U.S. companies even as the Trump administration threatened retaliation.
The European Union is also investigating Meta over other issues, including whether Facebook and Instagram are designed to be addictive, as well as a case involving the company’s recommendation systems.
In the United States, Meta and other social media companies are also facing increasing scrutiny over child safety. In March, a California jury found Meta and YouTube liable for harming a young user’s mental health through addictive designs and other features.
The European investigation into Meta’s age verification tools began in 2024. Following the preliminary findings announced Wednesday, the company will have an opportunity to respond to regulators. A final decision on potential penalties could take more than a year.
The commission can impose fines of up to 6 percent of Meta’s global annual revenue, although a penalty of that size is extremely rare. The two sides could also reach a settlement to resolve the matter.
Jeanna Smialek contributed reporting from The Hague, Netherlands.