The European Commission accuses Instagram and Facebook of violating the Digital Services Act by failing to identify and mitigate the risks of minors under 13 accessing their services.

Brussels: Europe and the Arabs

The European Commission has tentatively concluded that Instagram and Facebook, both owned by Meta, have violated the Digital Services Act by failing to identify, assess, and mitigate the risks of minors under 13 accessing their services. According to a statement issued by the Commission's headquarters in Brussels on Wednesday, while Meta's own terms and conditions set 13 as the minimum age for using Instagram and Facebook, the measures the company has taken to enforce this restriction appear ineffective. These measures neither adequately prevent minors under 13 from accessing the services nor detect and promptly remove underage accounts once they have been created.

For example, when creating an account, minors under 13 can enter a false date of birth that makes them appear at least 13 years old, without effective controls in place to verify the declared date of birth. Furthermore, Meta’s tool for reporting minors under 13 on the platform is cumbersome and ineffective, requiring up to seven clicks to access the reporting form, which is not automatically populated with user information. Even when a minor under 13 is reported for being underage, proper follow-up is often not carried out, and the reported minor continues to use the service without any oversight.

These shortcomings stem from an incomplete and arbitrary risk assessment that fails to accurately identify the risks of minors under 13 accessing Instagram and Facebook and being exposed to age-inappropriate experiences. Meta's assessment contradicts substantial evidence from across the EU indicating that approximately 10-12% of children under 13 use Instagram and/or Facebook. Moreover, Meta appears to have disregarded available scientific evidence suggesting that younger children are more vulnerable to the potential harms of services like Facebook and Instagram. At this stage, the Commission believes that Instagram and Facebook should change their risk assessment methodology to evaluate the risks that arise on their platforms within the EU and how these risks manifest. Instagram and Facebook should also strengthen their measures to prevent, detect, and remove minors under the age of 13 from their services. Social media platforms, particularly Meta's, must effectively address and mitigate the risks to which minors under 13 may be exposed, while ensuring a high level of privacy, safety, and security for them.

In exercising their rights of defence, Instagram and Facebook now have access to the documents in the European Commission's investigation file and can respond in writing to the Commission's preliminary findings. The platforms may also take measures to remedy the violations, in line with the 2025 guidelines on the protection of minors adopted under the Digital Services Act. In parallel, the Commission will consult the European Board for Digital Services. If the Commission's preliminary view is ultimately confirmed, it may issue a non-compliance decision, which could entail a fine proportionate to the gravity of the violation, not exceeding 6% of the provider's total worldwide annual turnover. The Commission may also impose periodic penalty payments to compel compliance.

These preliminary findings do not prejudge the final outcome of the investigation.

These preliminary findings form part of the formal proceedings the Commission opened against Instagram and Facebook under the Digital Services Act on 16 May 2024. They are based on an in-depth investigation that included an analysis of risk assessment reports, data, and internal documents from both Instagram and Facebook, as well as the platforms' responses to requests for information. Throughout these investigations, the Commission's work was supported by numerous civil society organizations and child protection experts across the European Union.

The Commission used the 2025 guidelines on the protection of minors, adopted under the Digital Services Act (DSA), as a benchmark for assessing Instagram and Facebook's compliance with their obligation to ensure a high level of privacy, safety, and security for minors. The guidelines identify age estimation and verification as a suitable and proportionate means of ensuring a high level of privacy, safety, and security for minors. To be effective, age verification techniques must be accurate, reliable, robust, non-intrusive, and non-discriminatory.

The European Commission has developed a blueprint for implementing age verification at the EU level, which can serve as a framework for age verification using a user-friendly and privacy-preserving method.

The Commission is continuing its investigations into other potential violations that fall under these ongoing proceedings, including Meta's compliance with the DSA obligations to protect minors and the physical and psychological well-being of users of all ages. This investigation also includes assessing and mitigating the risks arising from the design of Facebook and Instagram's web interfaces, which may exploit minors' vulnerabilities and lack of experience, potentially leading to addictive behaviors.
