
Law No. 15.211 – the Digital Child and Adolescent Statute (“Digital ECA”) – was enacted on 17 September 2025. The Law, which in many aspects will still be subject to regulation by the Executive Branch, establishes a comprehensive regulatory framework for the protection of children and adolescents in digital environments. It applies to information technology products and services that are targeted at or likely to be accessed by this audience, regardless of the provider’s location or the nature of the operation.


Contents

  1. Scope of application
  2. Protective design by default
  3. Protective measures
  4. Requirements for app stores and operating systems
  5. Parental supervision
  6. Monitoring tools
  7. Loot boxes
  8. Advertising and monetization
  9. Mandatory reporting to authorities
  10. Procedural rights of users
  11. Transparency and accountability
  12. Adjustments of obligations according to company size and degree of influence
  13. Authority
  14. Sanctions
  15. Presidential vetoes
  16. Entry into force

Scope of application

The Law covers internet applications (with specific rules for social media), computer programs, electronic games, operating systems, and app stores, when targeted at or likely to be accessed by minors. Essential functionalities, such as open technical protocols and standards, are excluded.

Protective design by default

Products and services must, by default, adopt the most protective model of privacy, security, and data protection, based on the best interests of the minor.

Processing of personal data that infringes rights or poses risks to privacy is strictly prohibited.

The protection of a child or adolescent’s privacy, security, mental and physical health, access to information, freedom to participate in society, meaningful access to digital technologies, and overall well-being is considered an expression of their best interests.

Protective measures

Providers must prevent and mitigate risks related to exposure, recommendation, or contact with content involving sexual exploitation and abuse, violence, harassment, cyberbullying, self-harm, suicide, addiction to certain products, gambling, predatory and harmful advertising practices, and pornographic content. They must also assess the content made available to minors according to the applicable age group.

Providers are further required to operate with protective default settings, offer clear information to enable informed choices, conduct risk management, classify content by age group, prevent access to illegal content, and clearly indicate the recommended age rating.

For content, products, or services that are inappropriate or prohibited for individuals under 18, reliable age verification is required at each access attempt, and self-declaration is not permitted as a method. Data collected for age verification must not be used for any other purpose. Additionally, providers of pornographic content must prevent minors from creating accounts.

Requirements for app stores and operating systems

App stores and operating systems must assess users’ age or age group using proportionate and auditable methods, allow parental supervision settings, and signal age information to applications via API, observing data minimization.

The Law establishes that an Executive Branch regulation will define the minimum requirements for transparency, security, and interoperability in age verification and parental supervision mechanisms adopted by operating systems and app stores.

Service providers must implement technical and organizational measures to ensure receipt of age-related information and prevent unauthorized access by children and adolescents.

In addition to the information received from app stores and operating systems, providers must implement their own mechanisms to prevent minors from accessing inappropriate content according to the applicable age group.
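For illustration only, the sketch below (in TypeScript) shows one way an age signal could be kept minimal and consumed by a receiving provider: the app store or operating system shares only an age bracket and a supervision flag, and the provider gates content by its recommended age rating. The Law does not prescribe any schema or logic; the field names, brackets, and gating rule here are assumptions, and the actual technical requirements will be set by the Executive Branch regulation mentioned above.

```typescript
// Purely illustrative sketch — the Digital ECA does not define an API schema;
// all names and brackets below are assumptions pending the Executive Branch regulation.

type AgeBracket = "under13" | "13to15" | "16to17" | "18plus" | "unknown";

// A minimized age signal: an age bracket and a supervision flag only,
// with no birthdate or identity data, reflecting the data-minimization requirement.
interface AgeSignal {
  bracket: AgeBracket;
  parentalSupervision: boolean;
}

// Lowest age each bracket can contain — gating on it is the most protective reading.
const minAgeByBracket: Record<AgeBracket, number> = {
  under13: 0,
  "13to15": 13,
  "16to17": 16,
  "18plus": 18,
  unknown: 0,
};

// Hypothetical provider-side gate combining the signal received from the
// app store or operating system with the content's recommended age rating.
function mayShowContent(signal: AgeSignal | undefined, contentMinAge: number): boolean {
  // No usable signal: treat the user as unverified and show only all-ages
  // content until the provider's own verification mechanisms have run.
  if (!signal || signal.bracket === "unknown") {
    return contentMinAge === 0;
  }
  return minAgeByBracket[signal.bracket] >= contentMinAge;
}

// Example: a user signalled as 13-15 is blocked from content rated 16+.
console.log(mayShowContent({ bracket: "13to15", parentalSupervision: true }, 16));  // false
console.log(mayShowContent({ bracket: "16to17", parentalSupervision: false }, 16)); // true
```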

Downloads by minors require the free and informed consent of their legal guardians.

In cases of well-founded suspicion that a given account is being used improperly by a minor, measures must be taken to verify the situation and possibly suspend the account.

Parental supervision

Providers must offer accessible parental supervision tools, with clear notifications when activated, and features to limit and monitor usage time, communication, content, and transactions. When the processing of related personal data is not strictly necessary, a risk assessment and an impact report shareable with the relevant authority are required.

Among other measures, providers must: (i) make information about available parental supervision tools easily accessible to parents or legal guardians; (ii) display a clear and visible notice when parental supervision tools are active, indicating which settings or controls have been applied; and (iii) offer features that allow limiting and monitoring the time spent using the product or service.

Default settings for parental supervision mechanisms must be as protective as possible, including restrictions on communication between minors and unauthorized users, limitations on tools that artificially increase usage time, and control over personalized content recommendation systems, among others.

Monitoring tools

Child monitoring products and services must ensure the inviolability of information and include mechanisms that notify children and adolescents, in age-appropriate language, about the monitoring being carried out.

Loot boxes

Loot boxes are prohibited in electronic games that are targeted at or likely to be accessed by children and adolescents, according to their respective age group.

Advertising and monetization

It is prohibited to use profiling, emotional analysis, or augmented or virtual reality to target minors with commercial advertising. The monetization and boosting of content that portrays children in adult-like roles or contexts are also prohibited. On social media platforms, accounts of users under the age of 16 must be linked to a responsible adult. If the service is unsuitable for minors, this must be clearly indicated. Where there is evidence of irregular account usage, verification measures must be taken and the account promptly suspended.

Mandatory reporting to authorities

Providers must remove content identified in their products or services that appears to involve exploitation, sexual abuse, abduction, or grooming, and must report it to national and international authorities. They must also retain, for at least six months, the following data: user-generated or shared content related to the report, associated metadata, and the data and metadata of the responsible user. The reporting requirements are subject to future regulation.

Providers must also offer mechanisms for notification and removal of content that violates the rights of minors when reported by the victim, legal representatives, the Public Prosecutor’s Office, or relevant entities, without the need for a court order, provided certain formal requirements are met.

Additionally, they must take a series of precautions to prevent misuse of these channels by malicious third parties.

Procedural rights of users

The Law also establishes that, in cases of content removal following notices, procedural rights must be guaranteed to the users involved, including the right to be notified of the removal, along with the reasons and justifications for it.

Users must also be informed of whether the content was identified through human or automated means, of the possibility of appeal, and of the deadlines for submitting an appeal and receiving a response.

Transparency and accountability

Providers with more than 1,000,000 underage users in Brazil must publish semiannual reports on their websites, which must include, among other things:

  1. The available channels for receiving reports and the systems and processes used for investigation
  2. The number of reports received
  3. The number of content or account moderation actions, categorized by type
  4. Measures adopted to identify child accounts on social media platforms

Adjustments of obligations according to company size and degree of influence

According to the Law, various obligations related to digital protection for children and adolescents will be applied proportionally, considering the type of service, the level of provider intervention, the number of users, and the size of the company. Services with editorial control and licensed copyrighted content may be exempt from these obligations, provided they adopt measures such as age classification, transparency, parental mediation, and reporting channels. The application of obligations will be proportional to the provider’s capacity to influence the content.

Further regulation will define the criteria for this proportionality.

Authority

The new Law establishes that the Brazilian National Data Protection Authority (ANPD) will be responsible for implementing, overseeing, and regulating the Digital Statute for Children and Adolescents. This role was formalized through Decree No. 12.622/2025, which recognizes the ANPD as an autonomous administrative authority for this purpose. The legislation also requires providers to maintain a legal representative in Brazil, empowered to act on behalf of the foreign entity before Brazilian authorities.

Sanctions

The Law establishes a range of penalties, divided between those that may be imposed by the administrative authority and those by the Judiciary. Penalties include warnings, fines (capped at BRL 50 million per violation), temporary suspension of activities, and temporary prohibition from operating.

In determining and scaling potential sanctions, the procedure outlined in Brazil’s Child and Adolescent Statute must be followed, along with the principles of proportionality and reasonableness. This includes assessing the severity of the violation – considering its causes and the extent of harm at both individual and collective levels – any recurrence, the economic capacity of the offender in the case of fines, and the social purpose of the provider and its impact on the community.

Presidential vetoes

The President of the Republic issued three key vetoes: (i) removing the expansion of the powers of Anatel (Brazil’s National Telecommunications Agency) to block digital content, citing a violation of the rules on legislative initiative and preserving the agency’s traditional role; (ii) excluding the permanent allocation of fines to the National Fund for Children and Adolescents, in line with the time limitation required by the Budget Guidelines Law; and (iii) eliminating the one-year vacatio legis, with Provisional Measure No. 1,319/2025 instead establishing a 180-day period for digital platforms to comply with the new rules.

Entry into force

The Law enters into force 180 days after its official publication.



Author

Flavia is a partner at Trench Rossi Watanabe* and is based in São Paulo. She has more than 15 years of experience in the areas of intellectual property, franchise, technology transfer, social media and unfair competition. *Trench Rossi Watanabe and Baker McKenzie have executed a strategic cooperation agreement for consulting on foreign law.

Author

Renata Campetti Amaral joined the firm in 2002 and became a partner in 2013. As a Senior Partner, she is the Head of the Environment, Climate Change and Sustainability practice group in Brazil. She is Baker McKenzie's Global Climate Change Lead for Latin America and is the representative of the Consumer Goods & Retail Industry Group in the region. She coordinates the sustainability initiatives conducted by the offices in Brazil.

Author

Flavia Rebello is a partner at Trench Rossi Watanabe. Her practice includes data protection, licensing, sourcing and transactions, franchising and e-commerce and Internet. She is also a regional leader for the IPTech practice in Latin America.

Author

Marcela is a partner in the IPTech practice at Trench Rossi Watanabe.

Author

Felipe Zaltman is a Partner in Baker McKenzie's (Trench Rossi Watanabe) office in Rio de Janeiro, Brazil.

Author

Alexandre is a Senior Associate in Baker McKenzie's São Paulo office.