
Suicide Streamed on Facebook Appears in TikTok Videos - What Now?

Australia update.


For those of us who cannot keep up with whatever new app sensation is taking over the internet, put simply: TikTok is a social video-sharing app. Users post 15- to 60-second videos edited with animation, filters, music and special effects. With more than 500 million users, TikTok is one of the world’s fastest-growing social media apps.


As you swipe up, a magical billion-dollar algorithm determines which video will appear next on your “For You Page” (a rough sketch in code follows this list), based on:

  • user interactions (videos previously liked/shared/commented on)

  • video information (sounds/hashtags/captions)

  • device settings (language/country/gender).
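
TikTok has never published how these signals are combined, but conceptually the ranking can be imagined as a weighted score over them. The toy Python sketch below is purely illustrative; the field names, weights and scoring rule are all assumptions, not TikTok’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    """Hypothetical candidate clip carrying the signals listed above."""
    video_id: str
    hashtags: set = field(default_factory=set)   # video information
    sound_id: str = ""                           # video information
    language: str = "en"                         # device/account setting

@dataclass
class UserProfile:
    """Hypothetical record of a user's past interactions and settings."""
    liked_hashtags: set
    liked_sounds: set
    language: str

def for_you_score(user: UserProfile, video: Video) -> float:
    """Toy ranking score; the weights are invented for illustration."""
    score = 0.0
    # User interactions: overlap with hashtags/sounds previously liked.
    score += 2.0 * len(user.liked_hashtags & video.hashtags)
    score += 1.5 * (video.sound_id in user.liked_sounds)
    # Device settings: prefer content matching the user's language.
    score += 1.0 * (video.language == user.language)
    return score

def next_video(user: UserProfile, candidates: list) -> Video:
    """The next clip served is simply the highest-scoring candidate."""
    return max(candidates, key=lambda v: for_you_score(user, v))

# Example: a cat-video fan is served the cat clip, not the news clip.
viewer = UserProfile({"#cats"}, {"sound42"}, "en")
feed = [Video("clip-a", {"#news"}), Video("clip-b", {"#cats"}, "sound42")]
print(next_video(viewer, feed).video_id)  # clip-b
```

The point of the sketch is that nothing in such a loop asks whether a clip is safe to show; it only asks whether the user is likely to engage with it.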


Ronnie McNutt suicide


In September 2020, a graphic video of a man dying by suicide spread across TikTok after first being streamed on Facebook Live. Disturbingly, users were being shown the video without ever intending to view it. For parents this is particularly concerning, as the clip was hidden amongst otherwise harmless content.



Theo Bertram, TikTok’s European director of public policy, gave evidence to the British House of Commons Digital, Culture, Media and Sport (DCMS) Committee, saying the video was used in a “coordinated attack” on the social video app a week after it was originally recorded.

“We learned that groups operating on the dark web made plans to raid social media platforms, including TikTok, in order to spread the video across the internet,” Bertram said.

So what can we do to stop our children from viewing such abhorrent material online?

Julie Inman Grant, Australia’s eSafety Commissioner, advised social media apps to use their existing tools to detect and remove such content far more quickly. The Commissioner even suggested a 30-second delay on live content to give platforms’ algorithms time to detect harmful material before it reaches viewers. Unfortunately, this solution is neither helpful nor practical.
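
To see what the Commissioner’s proposal would involve in practice, here is a minimal Python sketch of a 30-second hold-back buffer, in which the hypothetical `looks_harmful` function stands in for real automated detection; none of this reflects any platform’s actual pipeline.

```python
import time
from collections import deque

DELAY_SECONDS = 30  # the Commissioner's suggested hold-back window

def looks_harmful(frame) -> bool:
    """Placeholder for a real automated content classifier (assumed)."""
    return False

def delayed_broadcast(incoming_frames, publish):
    """Hold each live frame for DELAY_SECONDS, scanning before release."""
    buffer = deque()  # (arrival_time, frame) pairs, oldest first
    for frame in incoming_frames:
        buffer.append((time.monotonic(), frame))
        # Release only frames that have aged past the delay window.
        while buffer and time.monotonic() - buffer[0][0] >= DELAY_SECONDS:
            _, ready = buffer.popleft()
            if looks_harmful(ready):
                return  # cut the stream rather than publish the frame
            publish(ready)
    # When the stream ends, drain whatever is left in the buffer.
    while buffer:
        age = time.monotonic() - buffer[0][0]
        if age < DELAY_SECONDS:
            time.sleep(DELAY_SECONDS - age)
        _, ready = buffer.popleft()
        if looks_harmful(ready):
            return
        publish(ready)
```

Even this toy version hints at why the idea falls short: every live stream on the platform would need classifier-grade compute inside the 30-second window, and a classifier that misses the content simply lets it through after the delay.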


There are two sets of actors to consider here: corporate and government. The Australian Government passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 after the Christchurch mosque shootings were live-streamed in 2019. At the time, the bill seemed more reactive than considered, as it was passed with virtually no debate or consultation. Under this legislation, a person engages in “abhorrent violent conduct” when the person:

  • engages in a terrorist act;

  • murders another person;

  • attempts to murder another person;

  • tortures another person;

  • rapes another person; or

  • kidnaps another person.

The Criminal Code Amendment does not contemplate suicide or self-harm as abhorrent violent conduct.


Facebook, for its part, has Community Standards that treat suicide and self-injury content as violations of its guidelines. In this case, however, the platform failed to stop the spread of the distressing footage of Ronnie McNutt’s suicide.

At this juncture, the best advice is probably that of US non-profit organisation Common Sense Media, which advises users to, among other things, enable Restricted Mode (which helps block mature content) and to turn off the “allow others to find me” setting (which prevents the account from showing up in searches).

By Nabeeha Mohammed

© ISMA - 2020
