1. Mr Speaker, I thank Members for their interest in the Bill. All 16 Members who spoke have given their support, reflecting the broad consensus on the need for, and timeliness of, the proposals.
2. Members raised many useful points which I will address.
Who the Bill will/will not cover
3. Let me start with clarifications on the types of services that the Bill will cover.
4. Ms Tin Pei Ling, Mr Louis Ng, and Mr Saktiandi Supaat asked what other types of services, besides Social Media Services, may be specified in the Schedule of Online Communication Services (OCS).
5. Dr Shahira Abdullah wanted to know how IMDA will decide which service providers to designate. She and Ms Tin also asked about updating our regulations to keep in step with new technologies. Like many Singaporeans we engaged, Members acknowledged the fast pace of change in the online landscape. We are therefore committed to updating our laws and regulations as frequently as necessary to keep them relevant and effective.
6. In terms of the “type” of services, we will prioritise those that are more widely used in Singapore, and where the safety risks have become or are becoming apparent.
a. IMDA will use various data sources on user trends in Singapore to aid these assessments.
b. The Government is actively studying several areas, but I seek Members’ understanding that it can be counter-productive to discuss them prematurely.
c. Let us instead better understand and characterise the issues, taking reference from regulatory attempts elsewhere, before moving to design a suitable set of interventions for Singapore.
d. For example, Mr Melvin Yong, Mr Gerald Giam and Mr Mark Chay asked about online gaming, whereas the Bill only covers social media services currently. We share their concerns about online gaming. We have been thinking about it and we will share more details when ready.
7. Within each specified OCS, which entities to designate will depend on how much reach or impact they have with Singapore viewers.
8. Mr Zhulkarnain Rahim, Mr Saktiandi Supaat, and Mr Leon Perera asked about the consultation process. IMDA will consult services before designating them under the Bill, to ensure that designated services are clear on the requirements and have the opportunity to provide input on IMDA’s proposals. Details of how IMDA will work closely with designated services will be set out. Having built constructive relationships with many of these services over the years, we are confident the processes will be robust. IMDA will publish the list of services that are eventually designated.
9. Ms Nadia Samdin, Mr Alex Yam and Ms Tin Pei Ling asked why private communications have been excluded. The short answer is that there are legitimate privacy concerns, which Mr Gerald Giam also shares. But users are not without recourse. IMDA’s draft Code of Practice for Online Safety will require designated Social Media Services (SMSs) to provide easily accessible user reporting mechanisms throughout their services. If individuals encounter harmful messages or unwanted interactions in private messages on these social media services, they can block the sender or report the sender to the service.
10. While we do not intend to police private communications, we are also aware that there are groups with very large memberships, which could be used to propagate egregious content, making them no different from non-private communication on a social media service. In such instances IMDA will be empowered to take the same actions against them.
a. Mr Louis Ng and Mr Zhulkarnain Rahim asked about the specific factors for determining whether communications are private, and could therefore shield such services from complying with IMDA’s protective measures.
b. Labelling a group or communications as private does not make it so.
c. The Bill sets out a list of factors that must be considered collectively. For example, it may be possible to conclude that a social media group is public, even if that social media group has been set to “private” and requires the owner to grant permission before one can access the content, but the owner is indiscriminate in granting that access.
d. We will continue to study this issue closely with other agencies, industry, and international partners.
Types of content the Bill will/will not cover
11. Next, on what type of content the Bill will, or will not, address.
12. Mr Zhulkarnain Rahim asked whether drug abuse and other illegal activities will be covered. Mr Saktiandi Supaat highlighted a particular area in the online domain that is of growing concern to many users. Under “egregious content” as defined under the Bill, content that may cause risk to public health will be covered. Depending on the facts of the case, this may include drug-related content. IMDA’s Code of Practice for Online Safety also requires services to apply content moderation systems to vice and organised crime, including fraud and scam content.
13. Mr Gerald Giam and Mr Leon Perera asked whether the Bill will cover non-consensual sharing of intimate images, and Mr Louis Ng asked why content “likely to cause feelings of enmity, hatred, ill will or hostility” is applied only to racial and religious groups, and not other demographic segments such as gender.
14. To a large extent, the kinds of problematic content they have in mind will already be covered within the Bill.
a. Content that advocates or instructs on violence, including sexual violence, against individuals will be covered.
b. IMDA’s draft Code of Practice for Online Safety will require services to assess and act on cyberbullying, including content that is likely to cause harassment, alarm or distress to the user, a need that Mr Leon Perera, Mr Darryl David and Mr Melvin Yong also emphasised.
c. For cases of harassment, there may also be recourse under laws such as the Protection from Harassment Act (POHA).
15. To Mr Louis Ng’s question on providing more details of “harmful content” under the Code – IMDA has issued a set of draft Guidelines giving examples of the content covered, which will be finalised together with the Code.
16. Mr Leon Perera spoke at length about the problem of loot boxes in online games, and Mr Mark Chay also raised this issue. It falls under the Gambling Control Act, but since Members raised it, I will briefly address it. The Government recognises the potential harms of loot boxes. This is why we made significant updates to the Gambling Control Act earlier this year to ensure that our laws can address emerging trends and products such as in-game loot boxes, which are monitored by the Gambling Regulatory Authority. I invite Members to file a Parliamentary Question if they wish to discuss this issue in greater detail.
17. Mr Saktiandi also raised queries on content such as lifestyles that go against traditional norms of society, participation in foreign armed conflicts, animal cruelty, and commercialised nudity.
18. Should we go beyond concerns over the safety of individuals and communities, to cover other types of content at this juncture? Doing so would pose the same problem as attempting to cover other types of services prematurely: the Bill would become unwieldy, our proposals would lack focus, and the results would likely be ineffective.
19. Ms Nadia Samdin asked if we had considered streamlining all online related harms into the Online Safety Bill.
20. Our approach has been to identify and address specific areas of harm in a targeted manner. As to whether the laws will be consolidated later, that remains to be seen. At this time, it is more important that we put in place legislation that effectively addresses and combats the respective harms. For example, at the Committee of Supply Debates this year, the Ministry of Home Affairs (MHA) announced that it was studying potential levers to deal with criminal offences committed online. Work is in progress. These levers are envisioned to complement the provisions under the Online Safety (Miscellaneous Amendments) Bill.
Who decides what content is covered
21. This leads me to questions raised by quite a few MPs on how the types or thresholds of harmful or egregious content are determined, and whether a committee or deliberative body could be set up to formulate or review these thresholds.
22. The Government consulted various stakeholders, including parents, community groups and industry representatives, in arriving at the proposals in this Bill.
23. Egregious content can take many forms and exist in grey areas which can be difficult to define clearly. A case in point is Ms Nadia’s example of online forums for users to share their experiences with each other to deal with depression and anxiety, and to provide mutual support.
24. When assessing whether a piece of content is harmful or egregious, IMDA will take an objective approach, considering the context in which it is presented. If such content is educational in nature, or helps users to overcome these harms, naturally it will not be considered harmful or egregious. On the other hand, social media trends or challenges may sometimes appear innocuous – such as the “Milk Crate” challenge Mr Darryl David mentioned. But if they result in harm to users, such as by advocating or providing instructions on self-harm or suicide, they would be considered harmful.
25. If the concern is whether individual Social Media Services have done enough to curb exposure to harmful content, the Government will continue to consult widely across society and share the feedback with the companies.
26. When urgent action is needed, such as to remove offensive content that advocates violence towards certain communities or could cause serious injuries to them, IMDA must be able to act fast. In such situations, consultations with stakeholders are better done as part of after-action reviews.
27. To Mr Gerald Giam’s question, if services are aggrieved by IMDA’s regulatory decisions, they can appeal to the Minister. The Minister’s decision can be challenged on judicial review.
28. Mr Giam and Mr Perera sought assurances that the Bill will not be used to curtail democratic rights or freedom of expression. I stated in my opening speech that IMDA does not have unfettered ability to issue new Codes. The Bill clearly sets out the purposes for which IMDA can issue these Codes, and this is recorded in Hansard. I would also like to remind Members of the overarching purpose of the Bill: to provide a safe environment and conditions that protect online users, while respecting freedom of speech and expression as enshrined in Article 14 of the Constitution.
29. Let me also address a specific area that Mr Giam raised – the provisions on journalistic content in the UK’s draft Online Safety Bill.
a. I thank Mr Giam for his support of the Singapore Bill and his suggestion for Singapore to mirror the UK proposal. We are always watching developments internationally, and considering what would be useful in our context. I will make three brief points on Mr Giam’s suggestion.
b. First, the draft Bill in the UK has not been passed into law. The draft provisions have been through several revisions and are far from final.
c. Second, without going into detail, there have been criticisms that the provisions on journalistic content may be exploited by bad actors. They could inadvertently allow anyone, under the guise of being a “citizen journalist”, to communicate egregious content and expose users to harm.
d. Third, this Bill is about online safety. It has no interest in curbing legitimate journalistic content.
How will the provisions under the Bill be enforced
30. This brings me to my next point on enforcement, which several Members have raised.
31. I will explain the enforcement measures that the Bill provides for at each stage, and how these relate to the online service providers. IMDA will first assess if there are instances of non-compliance, either with the Code’s requirements, or with directions issued by IMDA. It does not matter whether there are management changes within the companies. Accountability resides with the legal entities. Where there is non-compliance, in general, IMDA will engage the Services to understand their reasons. This includes Services that do not have a corporate presence in Singapore.
32. Thereafter, if there is no meaningful response or mutually acceptable solution, and IMDA finds the Services to still be in breach of their obligations, measures such as financial penalties will be considered. Mr Desmond Choo asked if the penalties for non-compliance are too low to have sufficient impact or deterrence. The financial penalty quantum is comparable with other local legislation that covers social media services, such as the Foreign Interference (Countermeasures) Act (FICA) and the Protection from Online Falsehoods and Manipulation Act (POFMA). Services will also face reputational damage.
33. In the event that these still fail to address our serious concerns, IMDA may then issue a blocking direction to Internet Access Service Providers, to stop Singapore users from accessing these Services. But to Mr Zhulkarnain’s question, the purpose of section 45H(2)(b) is to ensure that this happens only if the platform had refused to comply with IMDA’s direction. This reflects our proportionate approach towards regulating content.
34. To Dr Shahira Abdullah’s question regarding the details of a blocking direction such as duration, this will depend on the individual case. Suffice to say it is a measure that IMDA will not take lightly. But IMDA’s resolve in protecting Singaporeans’ interests should not be tested.
35. Let me also address various technical questions from Members.
36. Mr Gan Thiam Poh asked how “Singapore users” will be determined. The OCS service providers will typically have geolocation data on whether a user accesses the Service from Singapore. This is common practice. Mr Gan and Mr Melvin Yong also asked how the Government would ensure that Singapore users are not exposed to harmful content given the use of VPNs. Just like fire codes cannot prevent people from playing with fire, neither can we shield people completely if they intentionally seek out harmful content online. Parents have a role to play, as do individuals themselves and our wider society, in staying aware and vigilant.
37. Mr Zhulkarnain Rahim asked what we mean by “reasonably practicable” steps taken by the OCS to comply with IMDA’s direction. This requires the balancing of various considerations, such as the technology that is available to implement the direction.
38. Mr Zhulkarnain also asked about the proposed section 45J(2).
a. This provision ensures that compliance with IMDA’s directions does not cause a service provider to incur liability in Singapore, if for example the content creator takes issue with it.
b. IMDA’s concern is to protect users in Singapore and this Bill only requires action against content accessible in Singapore. Thus, this provision naturally only insulates against liability under Singapore law.
c. Since our measures are also proportionate to the harm and consistent with leading jurisdictions, it is unlikely that service providers will attract liability elsewhere for complying with IMDA’s directions in Singapore.
d. But we will monitor international developments and keep in mind his suggestions on reciprocal immunity.
39. Mr Zhulkarnain Rahim, Mr Alex Yam, Mr Saktiandi Supaat, as well as Mr Gerald Giam asked who will enforce the Bill, whether a dedicated new body such as an eSafety Commissioner will be set up, and whether the respective Government teams are sufficiently resourced.
40. I thank them for looking out for the teams working behind the scenes on online safety.
41. As I mentioned above, compliance assessments will be undertaken by the IMDA, which has both the experience and expertise in performing this role. If egregious content is flagged to IMDA, and IMDA assesses there is a need to act, action will be taken. All this will be a lot of work, but we will periodically review our resourcing to ensure that the team is able to carry out its responsibilities fully and effectively.
Empowerment of individual users
42. Members asked how individual users can provide feedback about problematic content or non-compliance.
43. I agree with Mr Saktiandi that users are effectively a wider pool of eyes who can help to identify and flag problematic content. Mr Alex Yam is also right to remind us that users must play a role in policing harms they may come across.
44. Users are indeed our first line of defence. This is why we expect social media services to take user reports seriously, and to ensure that their systems and processes are sufficiently robust.
45. Under IMDA’s draft Code of Practice for Online Safety, designated services will be required to provide effective, transparent, easy-to-access, and easy-to-use reporting mechanisms to all individuals.
46. This is a more effective way to tackle voluminous online content at source.
a. In turn, users expect that social media services assess their reports and take appropriate action in a timely and diligent manner.
b. Services will be required to include information on these actions in their annual reports.
c. With this information, IMDA will be able to assess the adequacy of the Service’s measures.
d. Audits may also be undertaken to ensure compliance.
47. Mr Gerald Giam and Mr Saktiandi Supaat asked for the social media services to submit reports at a higher frequency than annually to establish the services’ effectiveness in acting on user reports. As a start, IMDA intends for the reports to be submitted annually and this can be reviewed later on.
Timelines to act on directions and user reports
48. Given the speed at which harmful or egregious content can be amplified and spread online, the speed of action must be proportionate to the potential harm of the content identified.
49. Members asked about the timelines for Services to act on directions issued by IMDA or to respond to user reports.
50. IMDA’s directions will stipulate a specific timeline for disabling access. For egregious content that could cause serious harm, the timeline would generally be within hours.
51. IMDA will also require social media services to act on user reports in a timely and diligent manner that is proportionate to the severity of the potential harm. In particular, timelines must be expedited for content and activity related to terrorism.
Protection for young users
52. Members have expressed concerns about the impact of harmful online content on young users. Ms Janet Ang and Mr Gerald Giam raised the need to leverage technology to combat harmful online content, including through setting default content restriction settings for young users. We understand and share these concerns.
53. Therefore, IMDA’s draft Codes will put in place additional safeguards to protect young users, including minimising their exposure to inappropriate content, and providing tools for children or their parents to manage their safety online.
a. The Code also requires that services provide differentiated accounts to children, whereby safety settings are robust and set to more restrictive levels that are age-appropriate by default.
b. Children and their parents or guardians must be provided clear warnings of implications if they opt out of the default settings.
c. We will continue working with industry players to see how such measures can be strengthened.
54. In practice, users might try to circumvent these measures.
a. Mr Desmond Choo, Mr Melvin Yong, Mr Gerald Giam, as well as some respondents to MCI’s Public Consultation in July, have asked about the possibility of requiring age verification systems.
b. Mr Saktiandi Supaat, Mr Alex Yam, Mr Mark Chay and Mr Melvin Yong asked about measures to better protect the young, including age-specific provisions or mandating screen time restrictions.
55. Most major social media services already require users to be at least 13 years old to register for an account.
a. Users have to declare their date of birth at the point of registration.
b. This way, services will be able to apply age-appropriate policies to their respective users, including content moderation.
c. In line with this, PDPC will be clarifying that personal data may be used to implement such age-appropriate policies on social media services.
56. To guard against false age declarations:
a. Some social media services use a combination of Artificial Intelligence, machine learning technology, and facial recognition algorithms to proactively detect and remove underage accounts.
b. Some also allow users to report accounts suspected to be underage; these accounts will be investigated and suspended if the reports are accurate.
57. However, there is currently no international consensus on standards for effective and reliable age verification by social media services that Singapore can reference.
a. Instead, we will continue to closely monitor and extensively consult on the latest developments in age verification technology, taking into account data protection safeguards, and consider viable regulatory options.
b. In addition, we will continue to work with Social Media Services, educators, and other stakeholders, to help parents guide young users navigating online spaces, and make young users better aware of the safety tools that are available to them.
Protection for individual users and victims
58. Members have also raised the importance of providing support to victims or users affected by online harms.
59. We recognise that while laws provide victims with the necessary legal tools, legal processes can often be daunting and difficult to navigate.
a. Members would be glad to know that organisations such as SG Her Empowerment, or SHE, have stepped up to augment Defence Guild’s efforts in providing legal support to victims of online abuse.
b. Continuing the work of the Sunlight AFA, which concluded its tenure in July this year, SHE is working with the Singapore Council of Women’s Organisations (SCWO) to launch a support centre for victims of online harm. Those in need may seek support and legal advice from counsellors and pro bono lawyers from the centre.
60. As I mentioned in my opening speech, online harassment, cyberbullying and doxxing are dealt with under the Protection from Harassment Act 2014 (POHA). Victims of gender-based online harms, of which a commonly known example is image-based sexual abuse, will be able to seek recourse under POHA where the online harm amounts to harassment. The Protection from Harassment Court has served many victims since it was established last year. The Ministry of Law is also looking into how victims can be better empowered to put a stop to such online harms generally, and to seek redress against, and hold accountable those responsible. This includes cyberbullying, and more novel forms of online hurt, such as cancel campaigns, which Minister Shanmugam has spoken about before. MinLaw’s efforts will complement MCI’s efforts to enhance the Government’s regulatory toolkit, as well as MHA’s efforts to address criminal offences committed online. More details will be announced at an appropriate juncture.
Public Education/Equipping parents and educators
61. Which leads me to my final point – that public education must come hand in hand with legislation. Ms Nadia Samdin, Mr Alex Yam, Mr Mark Chay, Mr Leon Perera, and Mr Zhulkarnain Rahim spoke about this.
62. Members also called for more collaboration with service providers in this area. For example, Mr Melvin Yong asked whether the Government would consider setting up a self-regulatory taskforce with key OCS providers. We can explore this suggestion when we engage further with the industry.
63. Speaker, I seek your permission to distribute a handout to the Members, which contains a list of safety measures on social media services, and public education programmes organised in collaboration with various technology companies and community partners.
64. To highlight a few examples:
a. Google held its Online Safety Park at the Digital for Life (DfL) Festival earlier this year, and is partnering the Media Literacy Council (MLC) to bring its “Be Internet Awesome” programme to primary schools to train 50,000 parents and children on online safety measures.
b. Meta collaborated with the National Crime Prevention Council and MLC on a campaign to educate users on top scam typologies and tips to keep safe. This campaign reached over 2 million users, and a second campaign has been launched on e-commerce scams.
c. There are many others, and we will continue to build on these efforts.
65. Speaker, may I make a few comments in Mandarin.
68. The purpose of the Online Safety Bill is precisely to enable Singapore to take timely and effective “firefighting” action.
69. While laws can play a part, they cannot stamp out online harms entirely. Singapore has therefore adopted a flexible, step-by-step approach to respond to a fast-changing online world.
70. More importantly, the Government knows that we must bring partners from all quarters together to find solutions. One key partner is parents. But many parents only began using social media in adulthood, and the ever-changing online world has caught many of them off guard.
71. We are therefore working with various parties to strengthen parents’ awareness of online safety and their ability to respond, for example by familiarising them with the safety settings on social media services, so that they can better guide their children.
72. The challenges are considerable, but if we stand united and work together, we can still build a safer and brighter online space for Singaporeans.
73. Speaker, I have tried to respond to as many of the questions and suggestions as I can.
74. The Bill before us today seeks to create a safer online environment for Singapore users.
a. Users will be empowered with the tools to manage their own safety, and equipped with the information needed to make informed decisions about how they wish to use online services.
b. In turn, online services will be held accountable for their systems, processes, and actions.
c. And where there is egregious content, such as content that undermines racial and religious harmony, the Government will step in to protect users.
75. Ultimately, we must recognise that there is no single measure that will assure us of online safety. We will need laws, codes, education, user reporting and a whole range of interventions. We will also need to keep updating our measures to deal with new risks. I am heartened that Members are united on this, and I thank the House for its unanimous support.
76. Shared responsibility, parental guidance and active individual involvement will play a key role in ensuring that even in the face of harmful online content, users, including children, can stay safe online. This Bill is a first step. We will continue to work with all of you, and our various partners, to keep our people safe online.
77. Speaker, I beg to move.