Molly Russell, 14, killed herself while suffering ‘negative effects of social media content’ that a child ‘shouldn’t have been able to see’, coroner rules in landmark inquest
- For help or support, visit samaritans.org or call Samaritans for free on 116 123
Social media posts about depression and suicide contributed to Molly Russell killing herself and ‘shouldn’t have been available for a child to see’, a coroner ruled today.
In a landmark conclusion that could have major implications for Big Tech, senior coroner Andrew Walker slammed social media giants for allowing ‘unsafe’ content promoting suicide and self-harm that fuelled the 14-year-old’s depression.
The inquest painted a grim portrait of the lonely and toxic nature of the ‘ghetto of the online world’, including the heartbreaking revelation that a desperate Molly turned to celebrities for help not realising there was little prospect of a reply.
After weeks of hearings exposed the utter failure of social media giants Meta and Pinterest to clean up the sewer of toxic content they have allowed to flourish, they were today told to ‘get a moral compass and step up’.
Molly, from Harrow in north west London, died in 2017 after hiding her ‘demons’ from her family while viewing thousands of harrowing posts and videos. She saved and ‘liked’ 16,300 images on her Instagram account, 2,100 of which related to depression, self-harm and suicide in the last six months of her life.
In a conclusion at North London Coroner’s Court, senior coroner Andrew Walker said: ‘Molly was at a transition period in her young life which made certain elements of communication difficult.’
He added that the schoolgirl was ‘exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness’.
Mr Walker said he did not ‘think it would be safe’ to leave suicide as a conclusion for himself to consider, instead finding Molly ‘died from an act of self-harm while suffering depression and the negative effects of online content’.
The coroner said some of the content Molly viewed was ‘particularly graphic’ and ‘normalised her condition,’ focusing on a ‘limited’ view without any counter-balance.
In his conclusion, he noted how Molly had turned to celebrities for help while she was suffering from depression, not realising there was little prospect of a reply.
Some of the content she viewed online ‘romanticised acts of self-harm’ and, due to the use of algorithms, she was exposed to some images, video clips and text that she had not requested, he said.
Since her death, Molly’s family (including her father, Ian – pictured today) have campaigned for better internet safety and stronger rules over suicide and self-harm-related content
Mr Walker said the content on Pinterest and Instagram that Molly viewed before her death was ‘not safe’.
‘They allowed access to adult content that should not have been available for a 14-year-old child to see,’ he said.
‘The way that the platforms operated meant that Molly had access to images, video clips and text concerning or concerned with self-harm, suicide or that were otherwise negative or depressing in nature.
‘The platform operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text – some of which were selected and provided without Molly requesting them.
‘These binge periods, if involving this content, are likely to have had a negative effect on Molly.’
Since her death, Molly’s family have campaigned for better internet safety and stronger rules over suicide and self-harm-related content.
Ahead of his verdict today, Mr Walker said the ‘risk’ the internet had brought to homes across Britain must be recognised – and ‘kept away from children completely’.
He said he would not be allowed to make recommendations – but he did want to ‘raise concerns’ over the use of social media by children. This included algorithms that push harmful content on children and the lack of separation between under-18s and adults on the platforms.
His comments came as the children’s commissioner warned children were still being bombarded on social media with content promoting self-harm.
Warning it could lead to a repeat of the Molly tragedy, Dame Rachel de Souza published research showing 45 per cent of children had seen harmful content online.
However, only half of the respondents aged eight to 17 who saw such material – which included self-harm and suicide – reported it.

After hearing six days of evidence, Mr Walker said: ‘It used to be the case that when the child came through the front door of their home it was to a place of safety.
‘With the arrival of the internet, we brought into our homes a source of risk and we did so without appreciating the extent of that risk.
‘If there is one benefit that can come from this inquest it must be to recognise that risk and make sure that that risk we have so embraced in our homes is kept away from children completely.’
He added: ‘This is an opportunity to make this part of the internet safe and we must not let it slip away. We must do it.’
Oliver Sanders KC, the family’s lawyer, yesterday accused Instagram and Pinterest of being ‘ignorantly blind’ as to their part in Molly’s death.
Sir Peter Wanless, NSPCC Chief Executive, said today: ‘Finally Molly’s family have the answers they deserve thanks to their determination to see Meta and Pinterest questioned under oath about the part they played in their daughter and sister’s tragic death.
‘The ruling should send shockwaves through Silicon Valley – tech companies must expect to be held to account when they put the safety of children second to commercial decisions. The magnitude of this moment for children everywhere cannot be understated.
‘Molly’s family will forever pay the price of Meta and Pinterest’s abject failure to protect her from content no child should see, but the Online Safety Bill is a once in a generation opportunity to reverse this imbalance between families and Big Tech.
‘This must be a turning point and further delay or watering down of the legislation that addresses preventable abuse of our children would be inconceivable to parents across the UK.’
For help or support, visit samaritans.org or call Samaritans for free on 116 123.
Social media giants are told to ‘get a moral compass and step up’ following damning inquest conclusion
The Children’s Commissioner has blasted social media giants following the inquest into the death of Molly Russell – telling them to ‘get a moral compass and step up’.
Dame Rachel de Souza told the PA news agency that platforms such as Instagram and Pinterest need to ‘do more and be better’, describing some of the evidence seen at the inquest as ‘harrowing’.
Molly, from Harrow in north-west London, died in November 2017, prompting her family to campaign for better internet safety.
Ms de Souza said she wanted the Online Safety Bill to be implemented to ‘enshrine children’s safety in law’.
The commissioner said: ‘What the court has been hearing and what has been reported is absolutely harrowing.
‘We’re talking about thousands and thousands of posts we know she liked – the most awful pro-suicide and self-harm (content).
‘We have to think about a little girl who is depressed and upset, seeing those images – it’s just horrendous.’
Ms de Souza continued: ‘If it was just Molly and it was just a one-off it would be bad enough, but we know children are seeing these things across the country, many of them underage on those social media sites, and it’s just not good enough.
‘I’ve looked at lots of it and it’s deeply, deeply concerning.
‘As someone who advocates for children, as a parent myself, as someone who has been a headteacher and teacher of thousands of children for 30 years, it’s just horrendous that they are seeing this stuff.’
Commenting on Meta’s defence of some of the material viewed by Molly, where a senior executive said posts were safe because they were from people issuing a ‘cry for help’, Ms de Souza said: ‘It’s just not acceptable.
‘It’s not acceptable to hear this from social media companies – and I’m not surprised to hear that argument put forward.
‘Frankly, although they are attempting to make improvements they are not doing enough, and arguments like that I think are morally reprehensible and I want to see these companies step up and do more and be better.’
The commissioner added: ‘Everyone has responsibility here, parents have responsibility, children know when they shouldn’t be online, but frankly – tech firms have huge agency here and we need them to do more.
‘They should remove self-harm and suicide content.
‘I think they should be eradicating graphic content depicting self-harm and suicide – there’s no defence at all for that material being accessible to children.
‘They’ve had since 2017, it’s still as bad and that’s why I think we need the Online Safety Bill back to enshrine children’s safety in law.
‘Self-regulations have just not been good enough.’
Ms de Souza said too many underage children were using social media.
She said: ‘I’m not convinced they are researching this, nor am I convinced their screening tools are working – so they need to remove that content.
‘Many of them are making moves with thinking about the Online Safety Bill coming – but platforms cannot protect children if they don’t know who the children are.
‘There are an enormous number of underage children using social media and platforms must work to develop effective age-assurance technologies that prevent children from accessing inappropriate and harmful content.’
She continued: ‘The tools they’ve got… they need to make sure (children) are not seeing images they shouldn’t see – and if they are too young they shouldn’t be on their sites at all.
‘There have been years of conversations about this, there is an immense amount of evidence about these harms that children are experiencing, so I think it really needs to happen now.’
Commenting on the attitude of social media companies towards the safety of children, the commissioner said: ‘I think I have seen improvements but it’s not good enough. It’s just not good enough.
‘The Molly Russell case really brings that into sharp relief and, as I say, I think they need to get a moral compass and step up.’
Source: Daily Mail