
As covered in our previous client alert, Singapore’s Prime Minister Lawrence Wong announced on 1 October 2024 the establishment of a new government agency dedicated to assisting victims of online harms, in particular harms related to cyberbullying, deepfakes and the non-consensual sharing of intimate images.

As an update, on 7 March 2025, Minister for Digital Development and Information Josephine Teo announced that the new agency will be named the Online Safety Commission and will begin operations in the first half of 2026. The Online Safety Commission will be established under a proposed new law, the Online Safety (Relief & Accountability) Bill, to be introduced later this year.


In more detail

The Online Safety Commission and the Online Safety (Relief & Accountability) Bill aim to reduce the time taken for victims of online harms to receive help. In particular, the Online Safety Commission is intended as a “one-stop shop” for victims, who will be able to request that it issue directions to platforms to take down offensive content. Importantly, the Online Safety Commission will also be empowered to require the removal of any existing identical copies of such content.

Victims will also be able to request information about perpetrators from the Online Safety Commission should they wish to take legal action against these wrongdoers.

These developments add to the avenues of recourse already available to victims in Singapore, including under the amended Broadcasting Act and the Protection from Harassment Act. The amended Broadcasting Act enables the Government to order app stores and social media services to remove specified harmful content, while the Protection from Harassment Act enables victims to take known perpetrators to court and seek compensation.

Key takeaways

The Online Safety Commission and the Online Safety (Relief & Accountability) Bill are the latest efforts by the Singapore Government to protect victims of online harms. Results from a recent public consultation reveal strong support for the proposed measures, driven primarily by demands to hold perpetrators accountable and deter harmful behaviour, the need to compensate victims, and the importance of platform accountability to users.

Though questions remain about the balance to be struck between the responsibility of content creators and that of platforms, and about how these developments will affect free speech and content creation, these measures provide victims with additional means of recourse and will help improve the safety of online spaces.


For further information and to discuss what this development might mean for you, please get in touch with your usual Baker McKenzie contact.

Authors

Andy Leck is the head of the Intellectual Property (IP) Practice Group and a member of the Dispute Resolution Practice Group in Singapore. He is a core member of Baker McKenzie's regional IP practice and leads the Myanmar IP Steering Committee.

Ren Jun Lim represents local and international clients in both contentious and non-contentious intellectual property matters. He also advises on a full range of healthcare and consumer goods-related legal and regulatory issues.

Ken Chia is a member of the Firm’s IP Tech, International Commercial & Trade and Competition Practice Groups. He is an IAPP Certified International Privacy Professional (FIP, CIPP(A), CIPT, CIPM) and a fellow of the Chartered Institute of Arbitrators and the Singapore Institute of Arbitrators. His practice focuses on IT, telecommunications, intellectual property, trade and commerce, and competition law matters.

Sanil is a local principal in the Intellectual Property & Technology Practice Group at Baker McKenzie Wong & Leow.

Daryl Seetoh is a local principal in the Intellectual Property & Technology (IPTech) Practice Group at Baker McKenzie Wong & Leow.