Social media giants have been sent a final list of measures they must put in place to protect kids online by July – or risk being fined 10% of their global turnover.
A legal responsibility for online services to make their sites safe for children will finally come into force in the summer after years of dithering by media regulator Ofcom. Technology Secretary Peter Kyle described today as a “watershed moment” in turning the tide on “toxic experiences” on social media.
Under the Online Safety Act, social media firms will be ordered to ensure they tame toxic algorithms, take faster action on removing harmful content and introduce proper age checks on their platforms.
Tech companies will now be expected to begin assessing the risk of harm to children on their platforms after Ofcom’s final children’s safety codes were published today. From July, these protections will be fully enforceable, and services that don’t comply could face serious enforcement action from Ofcom.
This could include fines of up to £18 million or 10% of their global revenue, or other business disruption measures, such as requiring payment providers or advertising services to withdraw from a site.
Under the legally binding guidance, firms must protect children from content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, as well as pornography, bullying and violent material. This includes instructing platforms on how to tame toxic algorithms, which are known to recommend harmful content to children without them seeking it out.
Steps platforms can take will vary depending on the risk of harmful content, but include introducing age checks such as photo ID matching, facial age estimation or credit card checks, and filtering harmful content out of algorithmic feeds. Social media sites will also need more robust content moderation systems so they can take swift action against harmful content when they become aware of it.
But it comes amid fears that Donald Trump’s return to the White House marks a new era of influence for the tech giants. The US President – who is close pals with X/Twitter owner Elon Musk – has voiced fierce support for Silicon Valley, particularly in defence of free speech and lighter-touch moderation. Meta, which owns Facebook and Instagram, abandoned its use of independent fact checkers in the US earlier this year after Mr Trump criticised them.
Concerns have been raised that the Government, which is scrambling to maintain good relations with the Trump administration, could pander to the US on tech issues.
While the Online Safety Act became law in October 2023, Ofcom has yet to use its powers, as it has been carrying out painstakingly long consultations on its new guidance. In January, tech chief Mr Kyle admitted the laws are “very uneven [and] unsatisfactory”. He said MPs need to get into a better cycle of “updating” current laws, given the extremely fast pace at which technology develops.
On the publication of Ofcom’s codes this morning, Mr Kyle said: “Growing up in the digital age should mean children can reap the immense benefits of the online world safely but in recent years too many young people have been exposed to lawless, poisonous environments online which we know can lead to real and sometimes fatal consequences. This cannot continue.
“The Children’s Safety codes should be a watershed moment – turning the tide on toxic experiences on these platforms – with the largest social media companies now having to prioritise children’s safety by law. This means age checks to stop children being exposed to the most extreme harmful content, as well as changes to platform design including algorithms to stop young users being served up harmful content they often aren’t even seeking.
“Like parents across the country I expect to see these laws help create a safer online world, so we set every child up for the best start in life. But we won’t hesitate to go further to protect our children; they are the foundation not the limit when it comes to children’s safety online.”