Singapore has introduced a new bill called the Online Safety (Miscellaneous Amendments) Bill, which aims to hold online content and communication platforms accountable for egregious or harmful content disseminated to users in Singapore.
To be added to the existing Broadcasting Act, which regulates the dealing in, operation of and ownership in broadcasting services in Singapore, the measures aim to:
- provide a safe online environment for Singapore users;
- promote responsible online behaviour;
- deter objectionable online activity;
- prevent access to harmful content; and
- place priority on the protection of Singapore users, in particular young children.
When passed, the bill will, in part, give Singapore’s Infocomm Media Development Authority (IMDA), a statutory board under the Singapore Ministry of Communications and Information, the power to block or remove posts that are considered egregious or harmful.
Egregious and harmful content includes information that advocates suicide, self-harm, child sexual exploitation or terrorism, as well as material that may incite racial or religious tensions or pose a risk to public health. This, however, does not apply to communications between two or more users in a private or domestic setting.
The implications are most significant for major social media platforms. The new bill will empower regulators to require online content and communication platforms with significant reach in Singapore to comply with Codes of Practice.
Social content and communication platforms, including Meta (which owns and operates popular social sites and apps such as Facebook and Instagram), TikTok and Twitter, will be regulated in a manner that enables public interest considerations to be addressed. These platforms will be expected to take reasonable steps to comply with the Codes of Practice and will be liable to financial penalties for non-compliance.
Additionally, the proposed bill will allow the IMDA to order online content and communication platforms to:
- disable access by users in Singapore to the content on its service;
- ensure that specific accounts (such as a social media account or channel) communicating egregious content to Singapore users are prohibited; and
- block access by Singapore users to the non-compliant service provider.
Parliament will debate the bill at its second reading in November.
While the new measures do not specify liabilities for corporate businesses that disseminate content to their Singapore users via social media platforms, a level of self-regulation is required. It is important to note that existing legislation already protects Singapore users, namely:
- The Protection from Online Falsehoods and Manipulation Act 2019; and
- The Foreign Interference (Countermeasures) Act 2021.
For an individual business, self-regulation ranges from self-monitoring for regulatory violations to proactive corporate social responsibility initiatives.
Smaller social media services with the capacity to scale may also wish to establish or review their policies and processes to ensure they are adequate to comply with the new bill.
More broadly, there is some concern that the new bill could weigh down small content creation and marketing businesses with new costs and additional administrative burdens. Content regulation, in any form, tends to take a toll on brands’ social media strategy and engagement.
Content regulation has become front of mind for jurisdictions all around the world. If you are operating a social content platform and have communications being disseminated to Singapore users, reach out to Nicola Loh for guidance on Codes of Practice.
For further information, please contact:
JTJB Singapore Office
T: 6223 3477 / 9170 7925