Should you be the parent of a teenage girl, as I am, it’s likely that Covid isn’t the only pandemic you’re worried about. You’re probably equally concerned by the spread of eating disorders and body dysmorphia among young people, now that the numbers seeking help from the NHS have hit record highs.

It’s likely you’ve been heartbroken to hear your daughter turning against her own body, calling herself fat or ugly. Maybe you’ve tried to comfort other parents at their wits’ end because their child won’t eat or is obsessed with the gym.

This week it was confirmed that it's no accident the soaring number of young people with disordered eating and warped body images coincides with just over a decade of Instagram. While it's not the only contributing factor, a range of research shows that this type of social media, in which teens compare themselves to unrealistic, filtered images of influencers and of each other, makes these problems worse.

And it turns out Instagram bosses knew their platform could be toxic for teenage girls, despite publicly playing down the effect it had on them. According to leaked internal documents, studies by the photo-sharing app repeatedly found content was causing serious damage to many young users, sending some spiralling into eating disorders or depression. In other words, this devastating epidemic has been created by greed in the boardrooms of Silicon Valley.

‘We make body image issues worse for one in three teen girls,’ read one slide from an internal presentation in 2019, leaked to the Wall Street Journal.

‘Teens blame Instagram for increases in the rate of anxiety and depression,’ announced another, said to have been seen by the company’s executives.

One of the most concerning findings of the company's own internal research is that, among UK users who reported suicidal thoughts, 13 per cent traced them back to Instagram.

Just as the tobacco industry knew the harm it was doing but played it down, Instagram and its parent company Facebook have knowingly done the same for the sake of profits.

The revelations show that Instagram, a platform built around visuals rewarded with likes, knew full well it was shaping the way young girls believe their vulnerable, growing bodies should look.

Tanith Carey says Instagram turned a blind eye as girls waged war on their looks (stock photo)

It turned a blind eye while our daughters waged war on their own flesh as slimming challenges such as 'thigh gap', 'skinny b****' and the 'A4 challenge', in which girls pictured themselves with waists no wider than a sheet of paper, trended.

Despite knowing full well that a quarter of teens who felt 'not good enough' traced that feeling back to the network, Instagram hired expensive PR firms to imply it was mainly up to users to moderate themselves and flag upsetting content, even though little happened when they did.

Over the years, as a parenting expert and author, I have done regular audits of the sort of material our daughters can see on the platform. For the Mail, I have also checked to see if Instagram has honoured its more recent promises to remove hashtags in which vulnerable young people incite each other to harm, starve and kill themselves.

For example, in 2019, I checked on the popularity of a hashtag which leads anorexic girls to pictures of each other's emaciated and skeletal bodies (we won't repeat it here). Back then, it brought up 3,714 results. This week, despite repeated promises that Instagram would clear out these dark and dangerous corners, it brought up 4,496.

It is exactly the sort of hashtag Instagram vowed to remove back in 2019 when I last looked. Yet the images it leads to have been left untouched, in plain sight, without so much as a sensitivity screen.

Instagram has been an accomplice not only to young people starving themselves but to them committing suicide, too. Girls like 14-year-old north London pupil Molly Russell. In 2019, her father Ian said Instagram had 'helped to kill' Molly after it emerged she had been looking at suicide and self-harm images on the social network in the run-up to taking her own life.

The British writer says Instagram knew the harm it was doing but played it down (stock photo)

Yet this week many of those self-harm images were still in plain view when I looked. And I found still more images of nooses, dangling feet and razor blades on feeds posted by young people telling others that no one cares, adults don’t understand and suicide is the only way out. So why has Instagram still not done enough despite knowing the harms? Only the company knows for sure.

One clue may be that 40 per cent of Instagram's users are 22 or younger and are vital to the company's billions of dollars in annual revenue, according to The Wall Street Journal. If it admitted it had to take more responsibility for the content posted on the platform, it would have to work harder, and pay more people, to remove it, eating into its profits.

When asked this week, Instagram told the Mail it has 15,000 content reviewers across the world, the same number it gave two years ago, to cover a social network used by a billion people.

Judging by the fact that I found content that should have been blocked and removed within moments for violating its own terms and conditions, that’s still not enough.

Meanwhile, Instagram claims the research ‘focuses on a limited set of findings’ and, in fact, shows their ‘commitment to understanding complex and difficult issues young people may struggle with’ and informs their ‘extensive work around bullying, suicide, self-injury, and eating disorders’.

Of course, it’s true that Instagram is just a platform. The tech itself isn’t inherently good or bad. As with any social media platform, it’s how people use it that distorts it.

Andy Phippen, professor of IT ethics and digital rights at Bournemouth University, told me this week: ‘Instagram itself isn’t harmful. It’s the content young girls see on Instagram that is. But if your echo chamber is full of peers sharing similar upsetting material, and no one unfollows or reports the content, it can be a downward spiral of negativity.’

Its lack of transparency led Molly's father this week to call for MPs to hold social media bosses criminally liable if they do not protect children from suicide and self-harm content.

Now we’re aware Instagram has long known about the damage it has been doing, the question is surely: should that charge be manslaughter — or murder?

Tanith Carey is the author of What's My Teenager Thinking? Practical Child Psychology For Modern Parents, written with Dr Angharad Rudkin and published by DK.

For confidential support, call Samaritans on 116 123 or go to samaritans.org

