As covered in our previous client alert, Singapore’s Prime Minister Lawrence Wong announced on 1 October 2024 the establishment of a new government agency dedicated to assisting victims of online harms, in particular cyberbullying, deepfakes and the non-consensual sharing of intimate images.
As an update, on 7 March 2025, Minister for Digital Development and Information Josephine Teo announced that the new agency will be named the Online Safety Commission and will begin operations in the first half of 2026. The Online Safety Commission will be established under a proposed new law, the Online Safety (Relief & Accountability) Bill, to be introduced later this year.
In more detail
The Online Safety Commission and the Online Safety (Relief & Accountability) Bill aim to reduce the time taken for victims to receive help for harms encountered online. In particular, the Online Safety Commission is intended as a “one-stop shop” for victims, who will be able to request that the Online Safety Commission issue directions to platforms to take down harmful content. Importantly, the Online Safety Commission will also be empowered to require the removal of existing identical copies of such content.
Victims may also request information about perpetrators from the Online Safety Commission should they wish to take legal action against them.
These developments add to the avenues already available to victims in Singapore, including under the amended Broadcasting Act and the Protection from Harassment Act. The amended Broadcasting Act enables the Government to order app stores and social media services to remove specified harmful content, while the Protection from Harassment Act enables victims to take known perpetrators to court and seek compensation.
Key takeaways
The Online Safety Commission and the Online Safety (Relief & Accountability) Bill are the latest efforts by the Singapore Government to protect victims of online harms. Results from a recent public consultation reveal strong support for the proposed measures, apparently driven primarily by demands to hold perpetrators accountable and deter harmful behaviour, the need to compensate victims, and the importance of platform accountability to users.
Though questions remain about the balance to be struck between the responsibility of content creators and that of platforms, and about how these developments will affect free speech and content creation, the measures provide additional avenues of recourse for victims and will help improve the safety of online spaces.
For further information and to discuss what this development might mean for you, please get in touch with your usual Baker McKenzie contact.