Tech firms could soon be legally required to remove, within 48 hours, intimate images shared without permission, under new proposals announced by the government. Ministers say intimate image abuse should be treated with the same seriousness as child sexual abuse material and terrorist content, with tough penalties for companies that fail to act.
The measures, being introduced through an amendment to the Crime and Policing Bill currently in the House of Lords, would allow regulators to fine platforms up to 10% of global turnover. In the most severe cases, services could also be blocked in the UK. Prime Minister Sir Keir Starmer said victims should not be forced into a “whack-a-mole” battle, reporting the same image repeatedly as it spreads across different sites.
Under the plans, someone affected would only need to report the content once, rather than contacting multiple platforms separately. Companies would also have a duty to prevent removed images from being uploaded again. Guidance is expected for internet service providers on blocking access to sites hosting illegal content, aimed at tackling rogue websites beyond the current reach of the Online Safety Act.
Campaigners welcomed the shift in responsibility. The End Violence Against Women Coalition said it correctly places the burden on technology companies to protect victims. Government and parliamentary reports have shown intimate image abuse disproportionately affects women, girls and LGBT people, while young men and boys are often targeted through financial “sextortion”. The announcement follows recent action over AI-generated sexualised images and a new UK law banning non-consensual deepfake pornography.