Children have been bombarded with upsetting content on social media including the death of Liam Payne and the murder of Charlie Kirk, a study shows.
Research by Internet Matters found six in 10 kids who consume news on social media saw a story in June that worried or upset them, including content relating to war, conflict, violence, death and crisis events.
The non-profit organisation said algorithms have been flooding children’s feeds with graphic content they don’t want to see.
It also raised concerns about AI-generated content, with more than a quarter (27%) of kids saying they had believed a fake story.
The proportion is greater among vulnerable children – defined in this report as those who qualify for special educational needs (SEN) support, have an education, health and care plan (EHCP), or have a physical or mental health condition.
Around four in 10 (43%) in this group believed a fake or AI-generated story, compared with 23% of children not deemed vulnerable for this research.
One 17-year-old girl told researchers: “Today I’ve seen about three videos of natural disasters like hurricanes and floods and they’ve all been fake but I believed every single one of them, the AI fooled me.”
The report warned that the spread of misinformation online “can deepen social and political divides and even trigger real-world harm” such as the riots in the aftermath of the Southport murders.
Researchers added: “The spread of AI-generated content, including deepfakes, on social media is also increasing the risk of mis- and disinformation and making it more difficult for users to verify and trust news.”
Almost half (48%) of children said they feel social media companies should take proactive steps to remove fake news, while 40% said AI-generated content should be clearly flagged or labelled.
According to polling, 76% of children and young people consume news weekly, with 68% of those getting their news from social media.
Algorithms were found to play a significant role in showing youngsters news in their recommender feeds, with 40% of children who get their news from social media not following news-focused accounts.
The report called on social media companies to embed media literacy into their platform designs, including features to actively help children evaluate, question and contextualise the information they see.
Rachel Huggins, Internet Matters co-chief executive, said: “While social media can offer immediate access to news which keeps children and young people informed and connected with the world around them, the volume of information, which is often negative, poses a risk to their wellbeing.”
Jess Asato, Labour MP for Lowestoft and a member of the Education Select Committee, said: “Too often, children are exposed to harmful or misleading content online with little support to make sense of it.
“While the Online Safety Act will help to make platforms more accountable, we also need to ensure every child has the skills to navigate the fast-changing digital world safely and critically.”
Internet Matters surveyed 1,000 UK children aged 11-17 in July. Ofcom’s children’s codes came into force at the end of July and are expected to tame toxic algorithms.
A government spokesman said: “This research pre-dates enforcement of new child safety requirements. We now expect young people to be protected from harmful content, including violent material, and illegal mis- and disinformation, as platforms comply with the legal requirements of the Act. That means safer algorithms and less toxic feeds.
“We’re also working to help families build stronger online safety skills so parents and children can make informed choices online. We will not hesitate to act where evidence shows further intervention is needed to protect children.”