
I. Introduction
The German Federal Agency for Child and Youth Protection in the Media (Bundeszentrale für Kinder- und Jugendmedienschutz – “BzKJ”), which is also responsible for enforcing the minor protection rules of Art. 28 DSA in Germany, published its official quarterly journal, BzKJAktuell, on 27 February 2026. The main focus of this quarter’s issue is AI, which is discussed in various essays and articles. The publication also revealed, however, that as early as December 2025 the authority had blacklisted (‘indexed’) an unnamed GenAI tool which, among other things, allowed text-to-image generation. A blacklisting decision has severe consequences for the distribution of an online service, regardless of whether it can also be used for legal purposes. The BzKJ article summarises the reasons for the decision in detail, providing important background on the features, circumstances and circumvention possibilities of the existing safety measures that led to the ban. These details should be studied closely to prevent similar measures being taken against other tools.
II. The BzKJ and its powers
The BzKJ evolved from the former Federal Review Board for Media Harmful to Young People (“BPjM”), which was primarily responsible for indexing content harmful to minors until the reform of the German Youth Protection Act in 2021. In this former role, the BPjM focused almost exclusively on banning the public distribution and promotion of media such as films, video games, music, books and websites if they were considered harmful to the development of minors. With the reorganisation, the BzKJ continues this classic task of indexing in a specialised sub-department which is still called BPjM and continues to administer the list of media harmful to minors.
In addition, the authority now assumes comprehensive responsibilities for monitoring platform operators and their precautionary measures to protect minors. A central role is played by the Office for the Enforcement of Children’s Rights in Digital Services (“KidD”) set up at the BzKJ, which is responsible for enforcing the DSA’s minor protection rules as a specialised sub-department. This department operates independently and ensures that online platforms comply with EU-wide youth protection requirements, investigating and sanctioning violations as part of supervisory procedures.
III. Impact and sanctions in case of violations
As soon as a medium or online service is added to the list of media harmful to minors, it is subject to far-reaching distribution restrictions and a strict advertising ban. Indexed titles may not be publicly displayed or advertised, nor may they be offered by mail order or online without very strict age verification (including face-to-face checks). Anyone who violates these regulations can face criminal penalties ranging from fines to imprisonment for up to one year. In the case of administrative offences, fines of up to €500,000 may also be imposed.
In the digital space, the handling of online media, websites, apps and online tools differs significantly from that of physical media. While indexed physical titles such as DVDs, books and CDs are published in a public section of the list, online services are entered in a separate, non-public section of the list of media harmful to minors. This secrecy is considered necessary to avoid a so-called ‘booster effect’: if the URLs were listed publicly, the prohibition list would inadvertently function as a collection of links and a guide to precisely those contents that are actually supposed to be restricted. In order to achieve an effective protective effect nonetheless, the BzKJ works closely with search engine operators and providers of youth protection software. A key tool in this process is the so-called BPjM module. This interface transmits the indexed web addresses to search engines, which then filter them out of their results lists. Since this severely impairs the findability of the content on the internet and makes it almost impossible for potential customers or users to find it without knowing the exact URL, indexing is often referred to in Germany as the economic death of an online service or medium.
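The filtering mechanism described above can be illustrated with a short sketch. Note that this is a hypothetical simplification: the actual BPjM module's data format, normalisation rules and hashing scheme are not specified in the article. The underlying idea, however, is that a search engine can suppress indexed URLs without the blocklist itself ever disclosing them in readable form, by distributing only hashes:

```python
import hashlib

# Hypothetical sketch of hash-based URL filtering in the style of the
# BPjM module: the authority distributes only hashes of indexed URLs,
# so the list never functions as a readable link collection.
BLOCKED_HASHES = {
    # Hash received from the (hypothetical) module feed for an indexed
    # subdomain; the URL below is a placeholder, not a real service.
    hashlib.sha256(b"example-indexed-site.test/subdomain").hexdigest(),
}

def normalise(url: str) -> str:
    """Very simplified normalisation: strip scheme, trailing slash, case."""
    for prefix in ("https://", "http://"):
        if url.startswith(prefix):
            url = url[len(prefix):]
    return url.rstrip("/").lower()

def is_indexed(url: str) -> bool:
    """Return True if the normalised URL's hash appears in the blocked set."""
    digest = hashlib.sha256(normalise(url).encode()).hexdigest()
    return digest in BLOCKED_HASHES
```

A search engine integrating such a feed would call `is_indexed()` on each candidate result and drop matches from the results list, which is what makes the content effectively unfindable without the exact URL.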
IV. Key points of the decision
The subdomain of the GenAI tool has been added to the non-public section of the list of media harmful to minors. The BPjM based its decision primarily on the following points:
1. Generation of pornographic content, including CSAM
The BPjM justified the ban on the grounds that the tool was highly harmful to minors: in addition to numerous legal uses, it also enabled the creation of sexualised depictions of adults as well as child and youth pornographic content (“CSAM”). Moreover, previously created content of that kind could be found in the forum and gallery areas that the service also offered.
The English-language service could generate images, including photorealistic ones depending on the desired style, from any text input. In addition, the service offered further functions that assisted users in creating prompts. The generator could be used free of charge, without time or quantity restrictions and without prior registration.
The BPjM saw this as a considerable risk that children and young people would regard the depiction of virtual sexual acts involving minors as normal or even legitimate. Overall, the BPjM considered that low-threshold access to depictions of abuse of children and young people was highly ‘socially and ethically disorienting’.
2. Generation of AI chatbots
According to the BPjM, the images created could also be used as the basis for generating a chatbot with specific character traits. Depending on the orientation of those character traits or the subsequent dialogue, it was possible to manipulate the chatbot into outputting textual depictions of sexual violence against minors, which the BPjM considered a particularly serious aspect. According to the BPjM, a lack of adequate content moderation and protection mechanisms meant that the chatbots could also be used for sexting with a virtual minor. In particular, it was possible to overcome the chatbot’s initial reluctance to engage in sexual content through skilful prompting. The committee saw this as an opportunity for users to practise cybergrooming that could subsequently be directed at real minors.
3. Circumventability of the existing security architecture
The tool featured a security architecture in the form of rejecting certain prompts (known as a prompt blacklist) and preventing the output of harmful content (discriminators). However, the service also offered a forum area where, among other things, the circumvention of the platform’s security mechanisms through skilful prompting was discussed. Prompts were also shared there that enabled the creation of virtual depictions of abuse of children and young people, and references to gallery areas of the platform that showed such content were disseminated.
The BPjM assessed the possibility of such forum exchanges on prompts, which are suitable for circumventing the security mechanisms of the text-to-image generator in order to create depictions of abuse, as particularly problematic. By viewing the prompts used to create the corresponding images, minors could reuse them, identically or in individually modified form, in their own creations. In particular, children and adolescents could see the possibility of circumventing existing restrictions as a challenge. This would not only lead to the creation of further abuse material, but also encourage minors to find their own ways of circumventing the restrictions. In addition, minors could perceive the successful overcoming of security measures as an achievement.
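The fragility of a pure prompt blacklist, which is central to the BPjM's reasoning here, is easy to demonstrate. The following is a deliberately naive, hypothetical sketch (the blacklisted terms are placeholders, not the service's actual list) showing why filters that match listed terms are trivially circumvented by the kind of prompt variations shared in the forum:

```python
import re

# Hypothetical, deliberately simple prompt filter of the kind the
# decision describes (a "prompt blacklist"): it rejects any prompt
# containing a listed term. The terms below are placeholders.
BLACKLIST = {"forbidden_term", "blocked_phrase"}

def is_blocked(prompt: str) -> bool:
    """Normalise to lowercase, tokenise, and check tokens against the list."""
    tokens = re.findall(r"[a-z_]+", prompt.lower())
    return any(token in BLACKLIST for token in tokens)
```

A prompt containing `forbidden_term` is rejected, but a minimally altered variant such as `f0rbidden_term` passes unchecked. This is precisely why shared circumvention prompts are so effective against list-based filters, and why the BPjM treated their open discussion as an aggravating factor rather than a marginal one.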
4. Weighing of interests
In the opinion of the committee, the interest in protecting minors outweighed the freedom of expression and artistic freedom of the providers. Although the service also contained a large amount of unproblematic content and usage options, these were offset by virtual depictions of sexual violence and CSAM content. The BPjM saw an incalculable potential for creating such content with the help of the tool. In the opinion of the BPjM, the prompts distributed within the tool itself represented only a fraction of the potential ways of circumventing the implemented content moderation. There would be a risk that children and young people perceive the creation option as a low-threshold entry point into corresponding sexual tendencies, and that desensitisation and normalisation effects would occur as a result of the perception of sexualised violence against minors.
V. Outlook and further comments
- It should be noted that the BPjM put particular emphasis on the fact that the service’s users discussed circumvention techniques in the service’s own discussion boards. Similar services should therefore set up and enforce strict moderation guidelines which prohibit users from discussing circumvention techniques.
- In addition, providers are advised to monitor the wider internet for circumvention trends discussed on other platforms, so that these can be detected in time and the generation of unlawful content prevented.
- Prompt blacklists and the tool-internal prohibitions on generating certain content should be continuously developed further.
- The blacklisting procedure and the enforcement of the DSA are two separate processes, despite both taking place under the BzKJ umbrella.
- Services that are potentially subject to future blacklisting procedures should be aware that they have the right to submit written arguments in their defence prior to the BPjM hearing that decides on the blacklisting application. These arguments may be supported by expert opinions (e.g. regarding the implemented safety measures). If the procedure is held as a ‘regular procedure’ (which is most likely), the blacklisting application is typically decided by twelve committee members. A two-thirds majority is required to blacklist the service. During the regular procedure, the service provider is entitled to send representatives (including legal counsel) to attend the committee hearing (i.e., in addition to submitting written arguments). It is strongly recommended that representatives attend the oral hearing in order to respond to questions from the committee, present exonerating factors, and address potential misunderstandings or errors of use. The committee usually makes a decision on the same day.
- Service providers established in the EU/EEA should closely follow current developments on the country-of-origin defence, which has recently been significantly strengthened by the European Court of Justice, the EU Commission and (most recently) two German courts. Depending on the facts, there is a realistic chance that German digital and media regulatory laws can no longer be applied. This should be closely evaluated as an additional defence strategy.
- The same applies to service providers that are subject to the DSA, which includes many AI services. According to the EU Commission and a recent German court decision, the DSA takes full precedence over national youth protection laws, which form the legal basis for Germany’s blacklisting procedure, including its legal consequences and sanctions. While this does not protect against enforcement under the DSA or the consequent requirement to implement appropriate safety measures, it could be used as strong grounds for arguing that the German youth protection laws on which the blacklisting procedure is based no longer apply. This should be carefully evaluated.