Politics

Evil predators who make sexual deepfakes of two-year-olds to face new law

By staff | 12 November 2025

Under new legislation, AI developers and child protection organisations will be able to test artificial intelligence (AI) models to prevent the creation of indecent images

Evil predators who make sexual deepfakes of children, including infants under two years old, will face a fresh crackdown under the law.

Under the current UK law – which criminalises the possession and generation of child sexual abuse material – developers cannot carry out safety testing on AI models, meaning images can only be removed after they have been created and shared online.

Reports of AI-generated child sexual abuse material have more than doubled in the past year, rising from 199 in 2024 to 426 in 2025, according to data published by the Internet Watch Foundation (IWF) today. There has also been a disturbing rise in depictions of infants, with images of 0–2-year-olds surging from five in 2024 to 92 in 2025.

The IWF said the severity of the material has also intensified, with Category A content – images involving penetrative sexual activity, sexual activity with an animal, or sadism – rising from 2,621 to 3,086 items. Girls have been overwhelmingly targeted, making up 94% of illegal AI images in 2025.

In what is being described as one of the first measures of its kind in the world, the change to the law will ensure AI systems’ safeguards can be “robustly tested from the start”, the Department for Science, Innovation and Technology (DSIT) said. It will also enable organisations to check that models have protections against extreme pornography and non-consensual intimate images.

The changes will be tabled today as an amendment to the Crime and Policing Bill. The Government said it will bring together a group of experts in AI and child safety to ensure testing is “carried out safely and securely”.

The NSPCC said the new law must make it compulsory for AI models to be tested in this way.

Rani Govender, policy manager for child safety online at the charity, said: “To make a real difference for children, this cannot be optional. Government must ensure that there is a mandatory duty for AI developers to use this provision so that safeguarding against child sexual abuse is an essential part of product design.”

Kerry Smith, chief executive of the IWF, said: “AI tools have made it so survivors can be victimised all over again with just a few clicks, giving criminals the ability to make potentially limitless amounts of sophisticated, photorealistic child sexual abuse material. Safety needs to be baked into new technology by design. Today’s announcement could be a vital step to make sure AI products are safe before they are released.”

Technology Secretary Liz Kendall said: “We will not allow technological advancement to outpace our ability to keep children safe.

“These new laws will ensure AI systems can be made safe at the source, preventing vulnerabilities that could put children at risk. By empowering trusted organisations to scrutinise their AI models, we are ensuring child safety is designed into AI systems, not bolted on as an afterthought.”
