Cyberbullying on the rise in Singapore: MDDI survey

A recent survey by the Ministry of Digital Development and Information (MDDI) found that about three-quarters (74 percent) of respondents encountered harmful content, such as cyberbullying and sexual content, on social media services designated by the Infocomm Media Development Authority (IMDA) under the Code of Practice for Online Safety, according to an official news release.

Representational image. Photo courtesy: Unsplash

Six in 10 (61 percent) of those who encountered such content ignored it. About one-third (35 percent) blocked the offending account or user, and only about a quarter (27 percent) reported it to the platform.

The annual Online Safety Poll was conducted in April 2024 with 2,098 Singapore respondents aged 15 and above. It aimed to understand Singapore users' experiences with harmful online content and the actions they took to address it.

Prevalence of harmful online content on social media services

Overall, the proportion of respondents who encountered harmful online content rose to 74 percent, up from 65 percent in 2023.

In terms of platforms, two-thirds (66 percent) of respondents encountered harmful content on designated social media services. This is up from 57 percent in 2023. In comparison, 28 percent of respondents encountered harmful content on other platforms, similar to last year’s level.

Among respondents who encountered harmful content on designated social media services, close to 60 percent cited encounters on Facebook, and 45 percent cited encounters on Instagram. While the prevalence of harmful content on these platforms may be partly explained by their larger user bases compared to other platforms, it is also a reminder of the greater responsibility these platforms bear.

Encounters with harmful online content across social media services. Photo courtesy: MDDI

Types of harmful online content on social media services

Cyberbullying (45 percent) and sexual content (45 percent) remained the most common types of harmful content encountered on designated social media services. However, there was a notable increase in encounters with content that incites racial or religious tension (+13 percent) and violent content (+19 percent), compared to last year.

Most common types of harmful online content on social media services. Photo courtesy: MDDI

Reporting of harmful online content to social media services

Among those who reported harmful content to the platforms, about 8 in 10 (78 to 86 percent across the designated services) experienced issues with the reporting process. The top issues cited were that the platform:

a) Did not take down the harmful content or disable the account responsible;

b) Did not provide an update on the outcome; and

c) Allowed the removed content to be reposted.

For respondents who did not report harmful content to the platforms, the most commonly cited reasons were that they:

a) Did not see the need to take action (28 to 51 percent across the designated social media services);

b) Were unconcerned about the issue (29 to 45 percent across the designated social media services); and

c) Believed that making a report would not make a difference (26 to 37 percent across the designated social media services).

Whole-of-society efforts to tackle online harms

Given the complex, dynamic, and multi-faceted nature of online harms, the Government, industry, and the public must work together to build a safer online environment.

As part of this holistic approach, the Government has taken several legislative steps to protect Singaporeans from online harms. For example:

a) In February 2023, amendments to the Broadcasting Act took effect, enabling the Government to swiftly disable access to egregious content on the designated social media services.

b) In July 2023, the Code of Practice for Online Safety came into effect. Among other things, it requires designated social media services to have in place systems and processes to minimise children’s exposure to inappropriate content and provide tools for children and their parents to manage their safety.

c) Earlier this month, Minister for Digital Development and Information Josephine Teo announced that a new Code of Practice for App Distribution Services (commonly referred to as app stores) will be introduced, requiring designated app stores to implement age assurance measures. This is to minimise Singapore users' exposure to harmful content and protect children from inappropriate content. More details will be shared in due course.

Beyond the Government’s legislative moves, the survey findings showed that there is room for all stakeholders, especially designated social media services, to do more to reduce harmful online content and to make the reporting process easier and more effective.

As part of the requirements under the Code of Practice for Online Safety, designated social media services are due to submit their first online safety compliance reports by end-July 2024. These reports will provide greater transparency, helping users understand how effectively each platform addresses online harms. IMDA will evaluate the platforms' compliance and assess whether any requirements need to be tightened.

Users also need to do their part by proactively reporting harmful online content to the platforms. Workshops, webinars, and family activities are organised in support of IMDA's Digital for Life movement to equip users with the knowledge and tools to keep themselves and their children safe online. These resources and the schedule of relevant activities are available at https://www.digitalforlife.gov.sg.