The dad of a teen who took her own life after being exposed to dangerous social media posts has voiced his horror that similar content is still available “on an industrial scale”.
Ian Russell said it was “staggering” as a new study accused TikTok and Instagram of actively putting young lives at risk. Campaigner Ian, whose 14-year-old daughter Molly died in 2017, called on Keir Starmer to beef up online safety laws.
Research by the Molly Rose Foundation – set up in her memory – found posts about depression, suicide and self-harm were being recommended to accounts opened as a 15-year-old girl. The report claims teens who engage with suicide, self-harm and depression posts face being targeted with a “tsunami of harmful content”.
It said 97% of Instagram reels and 96% of recommended TikTok videos for these youngsters were found to be harmful. Mr Russell said regulator Ofcom must take stronger action to prevent children accessing dangerous material.
He said: “It is staggering that eight years after Molly’s death incredibly harmful suicide, self-harm and depression content like she saw is still pervasive across social media. Ofcom’s recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users and ultimately do little to prevent more deaths like Molly’s.”
And in a message to the PM, he said: “For over a year, this entirely preventable harm has been happening on the Prime Minister’s watch and where Ofcom have been timid it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.”
The foundation’s research found 55% of recommended harmful posts on TikTok’s For You Page contained references to suicide and self-harm ideation, while 16% referenced suicide methods.
Its report said one in ten harmful videos on TikTok’s For You Page had been liked at least a million times. And on Instagram Reels one in five harmful recommended videos had been liked more than 250,000 times.
In 2022 a coroner found Molly died from “an act of self-harm while suffering from depression and the negative effects of online content”. Ian told an inquest that his daughter had found herself in “the bleakest of worlds” on Instagram and Pinterest.
Andy Burrows, chief executive of the Molly Rose Foundation, said: “Harmful algorithms continue to bombard teenagers with shocking levels of harmful content, and on the most popular platforms for young people this can happen at an industrial scale.
“It is shocking that in the two years since we last conducted this research the scale of harm has still not been properly addressed, and on TikTok the risks have actively got worse.”
It comes as a report by Children’s Commissioner Dame Rachel de Souza found the proportion of children saying they have seen pornography online has risen in the past two years.
Dame Rachel said her research shows that harmful content is reaching children through dangerous algorithms, rather than them seeking it out. She described the content young people are seeing as “violent, extreme and degrading” and often illegal, and said her office’s findings must be seen as a “snapshot of what rock bottom looks like”.
More than half (58%) of respondents to the survey said that, as children, they had seen pornography involving strangulation, while 44% reported seeing a depiction of rape. The report, based on responses from 1,020 people aged between 16 and 21 years old, found four in 10 respondents felt girls can be “persuaded” to have sex even if they say no at first.
Young people who had watched pornography were more likely to think this way, it said. Dame Rachel said: “This report must act as a line in the sand. The findings set out the extent to which the technology industry will need to change for their platforms to ever keep children safe.
“Take, for example, the vast number of children seeing pornography by accident. This tells us how much of the problem is about the design of platforms, algorithms and recommendation systems that put harmful content in front of children who never sought it out.”
TikTok and Meta, which owns Instagram, have been contacted for comment.