Violent TikTok videos: How to protect your kids
A violent viral video on TikTok sparked alarm among Australian schools and parents amid warnings that children could be exposed to serious harm online.
But the distressing video is one of several potential risks for young users on TikTok, and experts say parents should pay close attention to how their children are using the app, what challenges they're participating in, and who is using it to talk to them.
This is what you need to know about potential hazards on the popular video platform, and what you can do to protect young fans.
While the problem is not unique to TikTok, the spread of a graphic suicide video on the social network was particularly damaging.
TikTok's algorithm is praised for its ability to send anyone's content viral, no matter how many or how few followers they have.
But the spread of this violent video clip showed how that asset could also be a problem, as the algorithm shared the distressing footage across the platform.
TikTok also failed to find an effective way to remove all instances of the video on its site, with users reporting they had seen it many hours after it was first shared.
Australian eSafety Commissioner Julie Inman Grant said TikTok and other social media companies needed to learn how "to detect and remove this content much more quickly".
Unlike other social networks, including Facebook and Twitter, TikTok will not allow users to turn off autoplay for videos.
Parents can, however, enable the app's Restricted Mode for their children, which promises to block "mature content".
TikTok is renowned for hosting "challenges" for users.
Some involve dances, and others involve potentially deadly behaviour.
Cyber safety expert Susan McLean said days before the latest violent video went viral on TikTok, young users had been fuelling the "Benadryl Challenge": a dangerous competition to swallow huge amounts of allergy medication.
Doctors warn consuming the pills in large doses can cause seizures and heart problems, and the 'competition' was last month blamed for the death of a 15-year-old girl in Oklahoma.
Other dangerous TikTok challenges have involved putting condoms over people's heads to form a tight seal, and placing a coin between a phone charger and a power outlet, creating a fire hazard.
Ms McLean says parents should ensure their child does not use TikTok before the recommended age of 13 years.
"If we got all the under 13s off the platform, we would be so far ahead," she said. "We would have fewer traumatised kids."
Just as parents were working out how to approach TikTok this week, the Australian Strategic Policy Institute (ASPI) released a new report into the Chinese-owned social network.
The report found TikTok had amassed a huge audience of nearly 700 million users, and classified it as a "powerful political actor with a global reach".
But the report claimed TikTok was using this power to censor discussion "on a range of political and social topics, while also demoting and suppressing content".
Subjects censored on TikTok included gay, lesbian and transgender issues, the report found, as well as the Uyghur-Chinese conflict, mentions of the Tiananmen Square massacre, Tibetan independence, and some religious groups.
TikTok last year admitted it had suppressed videos shared by users who were disabled, overweight, or unattractive in what it said was a misguided attempt to prevent harm to people who were "susceptible to bullying or harassment".
The ASPI report recommended governments introduce a set of guidelines to apply to all social networks, regulating censorship, data privacy, and transparency.
One of the most common questions about TikTok is whether it leaks users' personal information.
Concerns about the platform's security are so high that US President Donald Trump ordered its Chinese owner ByteDance to sell its American arm to a US company by November 12 or face a nationwide ban.
The Indian Government has already banned access to the platform in its country, and Labor Senator Jenny McAllister, chair of the Senate's social media foreign interference inquiry, warned of "credible reports that TikTok takes more data than its users would expect".
The TikTok app currently requests access to a user's phone number, email address, and contacts, location, camera, microphone and files on their smartphone.
The app was also found to have accessed the clipboard on smartphones, a problem uncovered after a recent Apple software update.
But TikTok Australian general manager Lee Hunter said the company's Chinese owner "does not share information on our users in Australia with any foreign government".
Long before TikTok was named as a potential security risk, child safety groups raised concerns about pornography, swearing, and adult songs shared on the platform.
Arguably, more have been shared since the recent release of Cardi B's sexually suggestive song WAP, which has inspired adult dance challenges on the platform.
Sexual predators have also been found sending explicit messages to very young users on TikTok despite its community guidelines against using "public posts or private messages to harass underage users".
Common Sense Media recommends the app be used only by children aged 15 years and over, "mainly due to the privacy issues and mature content".
Originally published as Violent TikTok videos: How to protect your kids