We posed as a TikTok teen… and suicide posts appeared within minutes: How vulnerable teenagers are being bombarded with a torrent of self-harm and suicide content within minutes of joining the platform

  • Vulnerable teens being bombarded with self-harm and suicide TikTok content
  • Daily Mail, posing as 14-year-old, shown posts on suicidal thoughts in minutes 
  • Harmful recommended content was near identical to that shown to Molly Russell
  • Within 24 hours, account was shown more than 1,000 videos on harmful topics
  • You can call The Samaritans for free on 116 123 or email them at jo@samaritans.org

Vulnerable teenagers are being bombarded with a torrent of self-harm and suicide content on TikTok within minutes of joining the platform, a Daily Mail investigation found.

An account set up by the Daily Mail as 14-year-old Emily was shown posts about suicidal thoughts within five minutes of expressing an interest in depression content.

After 15 minutes of scrolling, its algorithm was offering advice on the disposal of razor blades after self-harming, and demonstrating how to hide content from parents.

Within 24 hours, the account had been bombarded with more than 1,000 videos about depression, self-harm, suicide and eating disorders. Some videos had millions of views.

Much of the harmful content recommended by TikTok was almost identical to that which Molly Russell viewed on other platforms, such as Instagram and Pinterest.

TikTok does not allow self-harm and suicide content that ‘promotes, glorifies or normalises’ such behaviour.

But it does allow such material if users are sharing their experience or raising awareness, which the platform admits is undoubtedly a ‘difficult balance’.

Instagram followed a similar policy until 2019 when it banned all such content in the wake of 14-year-old Molly’s tragic death.

Last night, the NSPCC said it was an ‘insult to the families of every child who has suffered the most horrendous harm’ that such content was available on the day Molly’s inquest ended.

Richard Collard, NSPCC policy and regulatory manager for child safety online, added: ‘All platforms need to ensure content being targeted at children is consistent with what parents rightly expect to be safe.’

TikTok has become the most popular site for young people. Nearly half of eight to 12-year-olds in the UK use the video-sharing platform – despite being below its age limit of 13.

The platform says its algorithm – based on ‘user interactions’ – becomes more accurate the longer you use the app, as the extra data allows it to home in on your exact interests.

It took the Mail five minutes to set up a profile as 14-year-old Emily – with no checks to ensure the user had been honest. 

A reporter in control of the account then searched for terms such as ‘depression’ and ‘pain’, and followed any related accounts – including those that said 18+ in their description.

Within 24 hours, the account was being recommended shocking self-harm and suicide content.

One video, seen by more than 200,000 people, offered the following advice to viewers contemplating suicide: ‘People… won’t understand why you did it. So leave them a note and tell them it’s not their fault.’

In the comments on another video, titled How To Attempt Sewer Slide [slang for suicide, used to avoid detection by moderators] With No Pain, users shared tips on killing themselves.

A coroner has concluded schoolgirl Molly Russell (pictured) died after suffering from ‘negative effects of online content’

TikTok’s internal search function also used Emily’s viewing history to predict what she was searching for.

Typing ‘How many para’ resulted in ‘How many paracetamol to die for teens’.

TikTok said an initial investigation found the ‘majority of content this account was served’ did not violate its guidelines, and that the videos which did were removed – before users had reported them or they received any views – within 24 hours of being posted.

A spokesman added: ‘While this experiment does not reflect the experience most people have on TikTok, and we are still investigating the allegations made to us by the Daily Mail this afternoon, the safety of our community is a priority. We do not allow content that promotes or glorifies suicide or self-harm.

‘These are incredibly nuanced and complex topics, which is why we work with partners including the International Association for Suicide Prevention and Samaritans (UK).’

Additional reporting by Isabelle Stanley and Niamh Lynch

Molly Russell Q&A: What did we learn from the inquest and what did the social media giants say?

The five-year wait for answers endured by Molly Russell’s family has finally ended, as an inquest heard how the teenager viewed suicide and self-harm content from the ‘ghetto of the online world’ before her death in November 2017.

During the proceedings, the head of health and wellbeing at Instagram’s parent company Meta and the head of community operations at Pinterest both apologised for content Molly viewed on the platforms.

Here, the PA news agency looks at what we have learned during the 14-year-old’s inquest.

– Who gave evidence from the witness box at the inquest?

Molly Russell’s father, Ian, delivered a pen portrait of his daughter, before giving evidence.

The head of health and wellbeing at Meta, Elizabeth Lagone, and Pinterest’s head of community operations, Judson Hoffman, also appeared in person at the inquest.

Other witnesses included child psychiatrist Dr Navin Venugopal, Molly’s headteacher Sue Maguire and deputy headteacher Rebecca Cozens.

– What did Ian Russell say during his evidence?

Ian Russell said the content his daughter had been exposed to was ‘hideous’, adding that Molly had accessed material from the ‘ghetto of the online world’.

Mr Russell also said ‘whatever steps have been taken (by social media companies), it’s clearly not enough’, adding: ‘I believe social media helped kill my daughter.’

– What did Pinterest’s senior executive tell the inquest?

Pinterest’s head of community operations, Judson Hoffman, conceded the platform was ‘not safe’ when Molly Russell used it, adding that he ‘deeply regrets’ posts the teenager viewed.

Mr Hoffman said Pinterest is ‘safe but imperfect’ as he admitted harmful content still ‘likely exists’ on the site.

– What did Meta’s head of health and wellbeing tell the inquest?

Elizabeth Lagone, a senior executive at Meta, defended Instagram and said posts described by the Russell family as ‘encouraging’ suicide or self-harm were safe.

The senior executive told the inquest she thought it was ‘safe for people to be able to express themselves’, but conceded a number of posts shown to the court would have violated Instagram’s policies.

– What did the child psychiatrist say?

Dr Navin Venugopal said he was ‘not able to sleep well’ after viewing Instagram content viewed by Molly Russell.

The witness told the inquest he saw no ‘positive benefit’ to the material viewed by the teenager before she died.

– What did the headteacher of Molly Russell’s school tell the inquest?

Sue Maguire, headteacher at Hatch End High School, said social media causes ‘no end of issues’ as it is ‘almost impossible to keep track’ of it.

She told the inquest social media creates ‘challenges… we simply didn’t have 10 years ago or 15 years ago’.

– What was said about Molly Russell’s activity on Instagram?

The inquest was told out of the 16,300 posts Molly saved, shared or liked on Instagram in the six-month period before her death, 2,100 were depression, self-harm or suicide-related.

The court was played 17 clips the teenager viewed on Instagram – prompting ‘the greatest of warning’ from Coroner Andrew Walker to those present.

In the last six months of her life, she was engaging with Instagram posts about 130 times a day on average.

This included 3,500 shares during that timeframe, as well as 11,000 likes and 5,000 saves.

– What was said about Molly Russell’s activity on Pinterest?

The court was told Molly created two ‘boards’ on Pinterest of interest to the inquest – one called Stay Strong, which tended to ‘have more positive’ material pinned to it, and another, called Nothing to Worry About, with ‘much more downbeat, negative content’.

Molly saved 469 pins to the Nothing to Worry About board and 155 pins to the Stay Strong board.

The inquest was told Pinterest sent emails to the 14-year-old such as ’10 depression pins you might like’ and ‘new ideas for you in depression’.

– What was said about Molly Russell’s activity on Twitter?

The inquest heard Molly reached out to celebrities for help on Twitter, including YouTube star Salice Rose and actress Lili Reinhart.

The court was told the teenager used an anonymous account to send tweets to celebrities.

 


Source: Daily Mail
