The Swiss Federal Council intends to strengthen the rights of users in the digital space and to impose stricter obligations on major online services. At its meeting on 29 October 2025, the Federal Council launched a consultation on a draft law that would establish binding rules for very large communication platforms and search engines.
Motivation for the regulation
The Federal Council justifies the need for regulation by pointing out that communication platforms and search engines have become a key part of today’s digital infrastructure, yet no specific regulation applies to them in Switzerland.
While these platforms offer broad access to information and diverse viewpoints, they also pose risks: unlawful content can be disseminated, and user content may be removed based on opaque criteria. The proposed law seeks to balance user rights, public discourse and access to information.
Personal scope
The draft law focuses exclusively on very large platforms and search engines. These are defined as services that, on average over a six-month period, are used at least once per month by at least 10% of the Swiss population (currently around 900’000 users). In the following, we refer to these services simply as “Platforms.”
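To illustrate how this quantitative threshold works in practice, the following minimal sketch checks whether a hypothetical service meets it. The population figure and the monthly user counts are illustrative assumptions, not figures from the draft law, and the logic reflects our plain reading of the definition above.

```python
# Minimal sketch of the draft law's threshold test: a service qualifies
# if, on average over a six-month period, it is used at least once per
# month by at least 10% of the Swiss population. All figures below are
# illustrative assumptions, not values taken from the draft law.

SWISS_POPULATION = 9_000_000  # assumed figure; 10% is ~900'000 users
THRESHOLD_SHARE = 0.10

def is_covered_platform(monthly_users: list[int]) -> bool:
    """Return True if the six-month average of monthly users reaches
    10% of the Swiss population."""
    assert len(monthly_users) == 6, "measured over a six-month period"
    average = sum(monthly_users) / len(monthly_users)
    return average >= THRESHOLD_SHARE * SWISS_POPULATION

# Hypothetical monthly user counts for one service:
print(is_covered_platform([850_000, 910_000, 940_000, 880_000, 905_000, 915_000]))
# -> True (the six-month average of 900'000 meets the 10% threshold)
```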
Key provisions of the draft law
The draft legislation places several key obligations on the Platforms, including the following:
- Reporting mechanisms: Platforms must provide users with accessible procedures to report suspected unlawful content.
- Offences for which a reporting procedure must be available include: depictions of violence (art. 135 Swiss Criminal Code, “SCC”), defamation (art. 173 SCC), false accusation (art. 174 SCC), insult (art. 177 SCC), threat (art. 180 SCC), coercion (art. 181 SCC), sexual harassment (art. 198 SCC), public incitement to crime or violence (art. 259 SCC), and discrimination and incitement to hatred (art. 261bis SCC).
- The reporting procedure must be designed so that users can provide certain information specified in the draft law, such as the reasons why they are reporting the content.
- Content moderation transparency: Platforms must inform users when content is removed, accounts are blocked, or other restrictive measures are taken, and must explain the reasons for these actions.
- Dispute resolution: Platforms must implement an internal, free-of-charge complaints procedure. Complaints must be reviewed under the supervision of a qualified individual and not handled solely through automated processes. Moreover, the Platforms must participate in out-of-court dispute resolution mechanisms.
- General terms and conditions: Platforms must provide certain mandatory information in their general terms and conditions. If Platforms impose restrictive measures on content provided by users, they must include, amongst others, (i) information on which types of user-generated content are subject to restrictive measures and (ii) details on the nature and implementation of these measures. The general terms and conditions must also include information regarding the reporting mechanisms and the internal complaints procedure, and they must be written in plain and easily understandable language in German, French and Italian. Platforms must inform users about relevant changes to their general terms and conditions, and they must make both the general terms and conditions and a summary thereof publicly available and easily accessible. These requirements apply irrespective of the law governing the general terms and conditions. Moreover, if the Platforms use recommendation systems, certain specific information on these systems must be set out in the general terms and conditions.
- Advertising and recommendation systems: Transparency obligations will apply to the declaration and targeting of advertisements and the functioning of recommendation algorithms. A publicly accessible advertising archive must be maintained, and authorities and researchers must be granted access to relevant data. If Platforms use recommendation systems, they must also offer at least one option for each system that does not rely on profiling as defined in the Swiss Federal Act on Data Protection.
- Reporting & evaluation: Platforms must submit an annual transparency report to the Federal Office of Communications (“OFCOM”). Moreover, they must provide an annual report about certain risk assessments amongst others concerning negative consequences for public opinion formation for electoral and voting processes, for public safety and order and for public health.
- Additionally, Platforms must, at their own expense, subject their services to an annual evaluation of compliance with the obligations set out in the new law. Platforms must cooperate with the evaluation organizations to ensure that the evaluation can be carried out in a timely, effective and efficient manner. This includes, in particular, granting access to all information and premises relevant to the conduct of the evaluation.
- Designation and access: Platforms must designate a contact point through which users and OFCOM can reach them quickly by electronic means in a Swiss official language (i.e. German, French or Italian).
- Legal representative in Switzerland: Platforms headquartered abroad must appoint a legal representative in Switzerland to ensure enforceability of the law.
- Procedure: To identify the Platforms that fall under the scope of this law, OFCOM contacts providers it believes meet the criteria. The contacted providers must supply OFCOM with information on the number of users of their services. If the providers do not have a registered office in Switzerland, the request for information will be delivered through international administrative assistance. OFCOM publishes a list of the Platforms subject to this law. OFCOM collects an annual supervisory levy from these Platforms to cover the costs of its supervisory activities that are not covered by fees.
- Administrative measures and fines: OFCOM will have the right to take administrative measures, such as ordering a Platform to terminate an infringing activity. If these measures prove ineffective, or there is reason to believe they would be ineffective, OFCOM may instruct telecommunications service providers to restrict access to a Platform. For certain listed infringements, such as violations of the reporting mechanism requirements, OFCOM may impose a financial penalty of up to 6% of the average annual global turnover generated over the last three financial years. Moreover, OFCOM may impose a financial penalty of up to 10% of the average annual national turnover generated over the last three financial years on a legal person, or of up to CHF 100’000 on a natural person, for violations of the obligation to provide OFCOM with the information it needs to carry out its supervision (see the illustrative calculation after this list).
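To make the penalty caps tangible, here is a minimal sketch of the maximum fine calculation as we read it from the draft: the cap is a percentage of the average annual turnover over the last three financial years. All turnover figures are hypothetical assumptions, not figures from the draft law.

```python
# Simplified sketch of the penalty caps described above: the cap is a
# percentage of the average annual turnover over the last three
# financial years. All turnover figures are hypothetical assumptions.

def max_fine(turnover_last_three_years: list[float], rate: float) -> float:
    """Cap = rate x average annual turnover of the last three financial years."""
    assert len(turnover_last_three_years) == 3
    return rate * (sum(turnover_last_three_years) / 3)

# Listed infringements (e.g. violating the reporting mechanism
# requirements): up to 6% of average annual *global* turnover.
global_turnover = [1_200_000_000, 1_500_000_000, 1_800_000_000]  # CHF, assumed
print(max_fine(global_turnover, 0.06))  # -> 90000000.0, i.e. CHF 90 million

# Violating the duty to provide information to OFCOM (legal person):
# up to 10% of average annual *national* turnover.
national_turnover = [40_000_000, 50_000_000, 60_000_000]  # CHF, assumed
print(max_fine(national_turnover, 0.10))  # -> 5000000.0, i.e. CHF 5 million
```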
What is next?
The draft law is currently in the consultation process. Stakeholders are invited to submit their views by 16 February 2026. In addition to providing general feedback, the Federal Council explicitly asks them to include input regarding the protection of minors and the reporting mechanisms.
Once the consultation procedure has been completed and the draft revised, the Federal Council will adopt a dispatch (Botschaft des Bundesrates). The dispatch contains the Federal Council’s final draft bill, a detailed justification, comments on the individual provisions, and references to specific points, and it serves as the basis for discussion and decision-making in the federal chambers. After both chambers have adopted the draft, the final vote takes place. Subsequently, the law is generally subject to an optional referendum: if no referendum is requested, or if the law is approved in a popular vote, it enters into force.
Resources
- Further information on this draft law can be found here (including a link to the draft legislation): https://www.news.admin.ch/en/newnsb/6TmEAde4htulaWG9CWYtK