Analyzing critical legal trends and developments across data, cyber, AI and digital regulations from around the world and beyond borders

Introduction

The protection of minors in the digital environment has become a central concern. At the international level, while the 1989 United Nations Convention on the Rights of the Child established the principle of the “best interests of the child,” it is the 2021 General Comment No. 25 that specifically addresses children’s rights in the digital context. The OECD has also highlighted the need for a trusted and inclusive digital future, emphasizing the importance of safeguarding minors online, as reflected in its recent reports on age assurance practices and regulatory frameworks.

The EU is also strongly committed to protecting minors online, particularly through its Better Internet for Kids+ (BIK+) strategy, which aims to promote a safer and more empowering digital environment for children across Europe. In her 2025 State of the Union speech, European Commission President Ursula von der Leyen highlighted the urgency of protecting minors online, explicitly citing Australia as a pioneering example for having introduced a minimum age for social media access.

Against this backdrop, the European Parliament recently adopted a resolution containing a set of recommendations aimed at strengthening the protection of minors online. These recommendations notably call for reinforcing the Digital Services Act (DSA) in order to incentivize better compliance. The resolution nevertheless notes that the approach to age assurance measures remains fragmented across the EU, with some Member States having implemented more advanced measures to enhance the protection of minors online.

France is one of the pioneering Member States in the field of online age assurance. For several years, French law has required strict age verification for access to online gambling and betting services, obliging operators to systematically ensure that users are of legal age before granting access. Online age assurance has also been addressed through legislative measures in two other areas: access to social networks on the one hand, and access to pornographic content on the other.

In this article, we provide an overview of the state of these two French regimes on online age assurance, after first detailing the state of implementation of the DSA, which must itself comply with the General Data Protection Regulation (GDPR).

EU

GDPR: not imposing age assurance but supervising it

Recital 38 of the GDPR states that the Regulation aims to strengthen children’s protection. This objective is pursued through various measures, including increased transparency of information under Article 12(1) (and Recital 58) and the right to erasure under Article 17(1)(f) (and Recital 65). The choice of a legal basis under Article 6(1) may also require considering whether the data subject is a child. The legitimate interest legal basis explicitly requires it. The same requirement to consider whether the data subject is a child is mentioned in the recent Digital Omnibus Regulation proposal, which in Article 3 suggests adding an Article 88c to the GDPR to facilitate the use of legitimate interest for the development and operation of AI systems (see also Recital 44 of the proposal).

When the legal basis relies on the child’s consent and the processing involves information society services, parental involvement is required under Article 8, either through direct consent or authorization, until the child reaches the age set by national law. The GDPR sets this age at 16 by default but allows Member States to lower the threshold to a minimum of 13 years of age, which France has done by setting it at 15.

Data controllers are expected to make efforts to verify users’ age and, according to the 2020 European Data Protection Board (EDPB) guidelines on consent, the measures deployed should be “proportionate to the nature and risks of the processing activities”. As a result, the GDPR does not impose any strict or absolute obligation to verify age.

However, the GDPR does provide a framework for data processing activities aimed at age assurance. Accordingly, the EDPB issued a Statement on Age Assurance in February 2025 outlining ten key principles. The best interests of the child must be prioritized, and age assurance systems must be designed with a risk-based and proportionate approach, while also preventing data protection risks and ensuring effectiveness and security. The other principles reflect GDPR measures even more directly: purpose limitation and data minimization; lawfulness, fairness and transparency; data protection by design and by default; accountability; and measures on automated decision-making. Although the statement does not mark a paradigm shift, it explicitly addresses the application of the DSA. In this respect, the EDPB also published Guidelines 3/2025 on the interplay between the DSA and the GDPR for public consultation in September, notably addressing the DSA’s age assurance mechanisms.

DSA: age assurance mechanisms progressively specified

At the EU level, the DSA represents a landmark in the regulation of online platforms. Fully applicable since February 2024, the regulation aims to harmonize the rules governing digital services across the EU, with a particular focus on protecting fundamental rights and ensuring a safe online environment for all users, especially minors.

Article 35 of the DSA requires “very large online platforms” and “very large online search engines” to implement appropriate risk mitigation measures, listing age verification as a potential option. The Commission has shown that it is prepared to enforce these obligations: in May, it opened investigations into the practices of major pornographic platforms, focusing in particular on age verification tools as a risk mitigation measure under Article 35. In November, the Commission also published a report on the most prominent and recurrent systemic risks and the corresponding mitigation measures. Future editions of this report will aim to identify evolving best practices in the mitigation of systemic risks, of which age verification could potentially be a part.

Article 28(1) of the DSA requires “providers of online platforms” to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service”. Although this article does not mention age assurance, its application will indirectly make it mandatory in certain situations.

Under the powers conferred by Article 28(4) DSA, the European Commission published guidelines in July 2025 (officially adopted in October 2025) to assist providers of online platforms in the application of Article 28(1). These guidelines set out general principles applicable to all measures under this article, as well as specific requirements by type of measure.

Online age assurance is explicitly identified as one type of measure for protecting minors. The guidelines call for consideration of the aforementioned EDPB Statement on Age Assurance and list five key criteria for effective age assurance: accuracy, reliability, robustness, non-intrusiveness, and non-discrimination. Other measures are recommended, such as offering users a choice of age assurance methods, using independent third parties, and adopting double-blind solutions to safeguard user privacy.

A first link already existed between the DSA and the eIDAS 2.0 Regulation. Where providers of very large online platforms under the DSA require user authentication for access to online services, Article 5f(3) of the eIDAS 2.0 Regulation requires them to accept and facilitate the use of the Digital Identity Wallet (DIW) for such purposes upon the user’s request.

Another link has now been formalized in the other direction, as the Commission has also published an age-verification blueprint to ensure that the development of a European age verification solution under the DSA can be based on the architecture of the DIW. Pilot projects are underway in several Member States, including France, with the aim of creating interoperable, privacy-preserving systems that can be used across the EU. The resulting “mini-wallets” could theoretically be used for purposes other than accessing digital services under the DSA, such as purchasing alcohol.

FRANCE

SREN law and access to pornographic content: a framework specifying which mechanism to implement

The legal framework is based on Article 227-24 of the French Criminal Code, which ARCOM (the French Regulatory Authority for Audiovisual and Digital Communication) is tasked with enforcing in relation to “online public communication services” pursuant to Law No. 2020-936 of 2020. Substantial litigation has arisen from this framework and remains ongoing before the Paris Judicial Court, which stayed proceedings in 2023, as well as before the Conseil d’État (the highest administrative court in France), which also stayed proceedings in 2024 pending the response to the preliminary questions it referred to the CJEU, on which the Advocate General’s opinion was published in September.

Adopted on May 21, 2024, the SREN law (law aimed at securing and regulating the digital space) represents an important update to this framework. It grants ARCOM broader powers to enforce compliance, including the authority to impose administrative fines and block non-compliant sites without judicial intervention. The authority remains responsible for “online public communication services”, but now also supervises “video sharing platform service providers”. The SREN law, by default, applies only to services established in France or outside the EU. In February 2025, an order extended the law’s scope to include 17 services offering pornographic content and based in various EU Member States. This order was challenged before the courts, but all of the claims were ultimately unsuccessful, including before the Conseil d’État.

ARCOM is also required by law to publish a technical framework specifying the minimum technical requirements applicable to age assurance systems implemented to restrict minors’ access to pornographic content. Adopted in October 2024, this technical framework requires in particular the use of independent third-party providers and that at least one assurance method offered to users complies with the “double-blind” principle (implying that neither the website nor the age assurance service knows the user’s identity, and the website does not learn any personal data from the age assurance process). This concept is also used in parallel in the aforementioned DSA guidelines, which encourage providers of online platforms to adopt double-blind age assurance methods.
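The double-blind principle described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the ARCOM-specified protocol: real deployments rely on asymmetric signatures and accredited providers, whereas here a shared HMAC key stands in for the trust relationship, purely to show the separation of knowledge between the two parties.

```python
import hmac
import hashlib
import secrets

# Illustrative "double-blind" flow: the age-assurance provider checks the
# user's identity document locally and issues an anonymous proof of age;
# the website verifies the proof without learning who the user is, and the
# provider never learns which website the proof is presented to.
# SHARED_KEY is a simplification standing in for a real signature scheme.
SHARED_KEY = secrets.token_bytes(32)

def issue_age_token(user_is_over_18: bool):
    """Provider side: returns an anonymous token carrying only the yes/no
    result plus a random nonce; it contains no identity data."""
    if not user_is_over_18:
        return None
    nonce = secrets.token_bytes(16)
    tag = hmac.new(SHARED_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return nonce + tag

def website_accepts(token: bytes) -> bool:
    """Website side: checks the token's authenticity; all it learns is
    that the bearer is over 18."""
    nonce, tag = token[:16], token[16:]
    expected = hmac.new(SHARED_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

token = issue_age_token(user_is_over_18=True)
assert token is not None and website_accepts(token)
assert issue_age_token(user_is_over_18=False) is None
```

The design point is the separation of knowledge: identity data stays with the provider, while the website only ever sees an unlinkable yes/no token.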

ARCOM has exercised its enforcement powers by issuing formal notices to major websites, prompting most targeted services to comply. The authority has announced that it will continue its efforts, now extending its focus to smaller websites as well.

Digital majority law for access to social networks: a false (re)start?

The French “digital majority law” of July 7, 2023, follows the approach applicable to information society services set out in Article 8 of the GDPR and its French implementation. It seeks to restrict access to social media platforms for those under 15 without parental approval, reflecting a nuanced approach that balances protection with autonomy.

However, the implementation of this age assurance legal framework has been hampered by legal and procedural challenges. The absence of an implementing decree has prevented its entry into force. This situation stems from concerns raised by the European Commission regarding compliance with the country-of-origin principle of the E-Commerce Directive (Directive 2000/31/EC), which limits the ability of Member States to impose general obligations on services established in other EU countries.

French lawmakers have not stopped there and have closely followed European developments. In November, French MPs tabled a new legislative proposal, broadly interpreting the aforementioned DSA guidelines as marking “a major shift in the position of the European Commission, which now paves the way for national legislation on the age of access to social networks”. Whichever interpretation prevails, any national measure would in any case have to be compatible with the enduring constraints of the country-of-origin principle.

Putting these considerations aside, the proposal aims to prohibit access to social networks for those under 15 and to suspend existing accounts belonging to such minors, partially taking up the provisions of the earlier digital majority law. For consistency, the proposal also requires social network platforms to implement age assurance systems compliant with a technical framework set by ARCOM, as is currently the case for access to pornography. Another key measure is the introduction of a “digital curfew” for minors aged 15 to 18, intended as an effective way to limit screen use during rest hours and protect adolescents’ sleep. The proposal will now follow the legislative process in France.

Conclusion

Online age assurance is becoming a cornerstone of digital regulation in Europe. While the EU has set harmonized principles through the DSA, France is pushing ahead with a strict framework for access to pornography and continues its attempts to establish a framework for access to social networks. As these regimes continue to develop, other projects are also under discussion.

The CSAR (regulation to combat child sexual abuse) proposal published in 2022 is still in the legislative process. Commonly referred to as “Chat Control”, this proposal requires “providers of interpersonal communications services” in certain circumstances (Article 4(3)) and “providers of software application stores” (Article 6(1)(c)) to “take the necessary age verification and age assessment measures to reliably identify child users on their services”. The proposal has been strongly criticized over concerns about privacy, encryption, and mass surveillance. Despite modifications, controversial elements remain in the Council position of November, which retains the same age verification measures while adding some general safeguards.

The Audiovisual Media Services Directive of 2010, already revised in 2018, also includes measures aimed at protecting minors, which most Member States have implemented through tools such as content rating, parental controls, or age assurance. The directive is currently under evaluation, with a deadline set for 2026, and the European Commission has already committed in its 2026 work program to propose an update of the rules on audiovisual media services in Q3 2026.

Looking ahead, the European Parliament’s November resolution on the protection of minors can also be mentioned again, as it calls for the establishment of a harmonized European digital age limit for social media platforms, video-sharing platforms and AI companions that present risks to minors. The age limit proposed in the resolution would be 16 by default, unless parents or guardians have given permission, with a “harmonised European digital age” limit of 13 below which no minor could access social media platforms. While the resolution acknowledges the uneven level of protection of minors online resulting from different national approaches, it also recognizes the importance of ongoing discussions at both national and EU levels. It will therefore be essential to continue monitoring developments in national frameworks closely, starting with the compatibility of the new French legislative proposal on access to social networks with the European framework.

We acknowledge Alexandre Humain-Lescop from our Paris office for their contributions to this piece.


Author

Magalie Dansac Le Clerc is a partner in Baker McKenzie's Paris office. A member of the Firm's Information Technology and Communications Practice Group, she is a Certified Information Privacy Professional (CIPP).

Author

Marlyse Lissan joined Baker McKenzie in July 2021. Marlyse is a member of the Information Technology and Communications team and focuses on new technologies, computer technology, Internet and telecommunications.

Author

John is an associate in Baker McKenzie's Intellectual Property and Technology team, based in London. He joined the Firm in 2016 and was admitted as a solicitor in England and Wales in 2018. He is also admitted as an attorney in the state of New York. His practice encompasses aspects of commercial, technology and intellectual property law. He has a particular focus on data protection.