The algorithm TikTok uses to decide what videos users will see is promoting sexual content, drugs and alcohol to children as young as 13, an investigation revealed.
As part of the Wall Street Journal investigation, a ’13-year-old user’ searched for ‘onlyfans’ and watched a handful of videos, including two selling pornography, on the Chinese-owned social media app.
When viewing the For You page – TikTok’s equivalent of the Twitter feed – the same teenage user was shown a series of sexually oriented videos.
Content shown on the For You page is selected based on previous searches, as well as the types of videos a user views most often or spends the most time watching.
The more the user lingered on sexual content, the more sexual content was shown on the For You page – despite the user’s age being set in their profile.
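The dynamic described is a classic engagement feedback loop. As a rough illustration only – the Python sketch below is hypothetical, with invented topic names and weights, and is not TikTok’s proprietary ranking system – a recommender that scores topics by accumulated watch time keeps amplifying whatever a viewer lingers on:

    import random
    from collections import defaultdict

    # Hypothetical sketch of a dwell-time feedback loop; TikTok's real
    # recommender is proprietary and far more complex than this.
    TOPICS = ["pets", "cooking", "sports", "adult"]

    def pick_topic(watch_seconds):
        # Sample the next video's topic, weighted by accumulated watch time.
        weights = [1 + watch_seconds[t] for t in TOPICS]  # +1 avoids zero weights
        return random.choices(TOPICS, weights=weights, k=1)[0]

    watch_seconds = defaultdict(float)
    for _ in range(500):
        topic = pick_topic(watch_seconds)
        # A viewer who lingers on one topic feeds more weight back into it.
        watch_seconds[topic] += 30.0 if topic == "adult" else 3.0

    print(dict(watch_seconds))  # the lingered-on topic quickly dominates

Notably, nothing in such a loop consults the account’s stated age – which matches the behaviour TikTok acknowledged.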
TikTok said it does not currently differentiate between videos served to adult and child accounts, but is working on a new filtering tool for younger users.
The investigation did not involve a real 13-year-old; instead, a series of bots was created to better understand what TikTok shows its younger viewers.
All of the automated accounts were set to be between the ages of 13 and 15 to determine whether younger viewers get a different feed to older TikTok users.
Through constant curation by the AI-powered algorithm, the reporters found the feeds became more narrowly focused, delving into increasingly inappropriate content.
One of the accounts, claiming to belong to a 13-year-old, was shown 569 videos about drug use, including references to cocaine and meth addiction, as well as promotional videos for the online sale of drug products.
The WSJ accounts were also shown over 100 videos promoting pornography sites and sex shops from accounts labelled as adults only.
There were even videos encouraging eating disorders, promoting alcohol consumption and other adult-orientated content.
The WSJ reporters sent TikTok almost 1,000 videos showing drugs, porn and other adult-related content that had been shown to their 13- to 15-year-old bot accounts.
Of those videos, 255 were removed soon after they were sent to the Chinese-owned platform, including a dozen showing adults entering relationships with people calling themselves ‘littles’ – adults of legal age pretending to be children.
One of the people featured in a ‘role-playing’ video aimed at adults, but shown to the teenage ‘bot’ accounts, said her bio clearly states that her content is for viewers aged 18 and above.
‘I do have in my bio that is 18+ but I have no real way to police this,’ she told the Wall Street Journal in a message, adding that she doesn’t agree with her content being shown to someone so young.
A TikTok spokesperson told the Journal that the firm removed some of the reported videos, and restricted distribution of others to stop them being recommended to other young users in the future – but it isn’t clear how many.
Pictured: a screenshot of drug-themed content shown to the bot accounts on TikTok
Under TikTok’s terms of service, users have to be at least 13, and those under 18 need parental permission – with parents able to manage screen time and privacy settings for their children within the app.
‘Protecting minors is vitally important, and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens,’ TikTok said.
There were a total of 31 ‘bot’ accounts used in the investigation, each given a date of birth and IP address, with no other information revealed to TikTok apart from what was shared by viewing various videos for different amounts of time.
A dozen of the 31 accounts quickly became dominated by a particular theme, as the selective algorithm tailored the For You page to previously expressed interests.
In one example, an account given a series of interests such as Japanese film and television cartoons was served a streak of 150 videos, 146 of which were Japanese animations – including many with a sexual theme.
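The probing methodology can be sketched in similarly hypothetical terms: a bot that ‘expresses interest’ purely through dwell time, lingering on one theme and skipping everything else, then measuring how much of a streak the feed devotes to that theme. The toy feed and figures below are invented stand-ins, not the real app:

    import random

    def toy_feed(interest_seconds):
        # Toy stand-in for the For You feed: sample a theme in proportion
        # to how long the account has previously watched that theme.
        themes = list(interest_seconds)
        weights = [1 + interest_seconds[t] for t in themes]
        return random.choices(themes, weights=weights, k=1)[0]

    def probe(target="anime", streak=150):
        interest_seconds = {"anime": 0.0, "news": 0.0, "music": 0.0}
        hits = 0
        for _ in range(streak):
            theme = toy_feed(interest_seconds)
            # The bot signals interest only through dwell time:
            # linger on the target theme, skip everything else quickly.
            interest_seconds[theme] += 30.0 if theme == target else 1.0
            hits += theme == target
        return hits / streak

    print(f"{probe():.0%} of a 150-video streak matched the target theme")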
TikTok told the Journal that the behaviour of the bots ‘in no way represents the behaviour and viewing experience of a real person.’
That doesn’t change the fact that adult-themed videos were shown to the child accounts, with 40 per cent of one stretch of 200 videos coming from accounts labelled as adults only.
TikTok has to rely on algorithms and an army of 10,000 human moderators to police the millions of videos shared to the service, with tens of thousands uploaded every minute.
With about 100 million users in the US alone, moderators are struggling to keep pace, so they have to focus on the most popular content, leaving videos with lower view counts unchecked.
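That triage is essentially a priority queue ordered by popularity. A minimal sketch, assuming review capacity is simply a fixed number of videos per cycle – the figures and field names below are invented for illustration:

    import heapq

    videos = [
        {"id": "a", "views": 2_400_000},
        {"id": "b", "views": 310},
        {"id": "c", "views": 88_000},
        {"id": "d", "views": 12},
    ]

    # heapq is a min-heap, so negate view counts to review the most popular first.
    queue = [(-v["views"], v["id"]) for v in videos]
    heapq.heapify(queue)

    REVIEWS_PER_CYCLE = 2  # invented capacity: far fewer reviews than uploads
    for _ in range(REVIEWS_PER_CYCLE):
        neg_views, vid = heapq.heappop(queue)
        print(f"reviewed {vid} ({-neg_views:,} views)")

    # Whatever remains in the queue goes unchecked this cycle.
    print("unreviewed:", [vid for _, vid in queue])

Under any such scheme, videos that never accumulate views also never reach a human reviewer.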
The firm is slowly rolling out a range of security features, including giving 16- and 17-year-olds the ability to determine who can see their posts.
TikTok said its content policies evolve in response to industry standards, as well as changing user behaviour.
It said that as users of the platform become older and more diverse, both the content shown and the policies over acceptable use will change and adapt.
‘While the activity and resulting experience of these bots in no way represents the behaviour and viewing experience of a real person, we continually work to improve our systems and we’re reviewing how to help prevent even highly unusual viewing habits from creating negative cycles, particularly for our younger users,’ a TikTok spokesperson told MailOnline.
‘We care deeply about the safety of minors, which is why we build youth well-being into our policies, limit features by age, empower parents with tools and resources, and continue to invest in new ways to enjoy content based on age-appropriateness or family comfort.’