Social media companies are being urged to introduce stronger measures in the UK to prevent children under the age of 13 from accessing their platforms. Regulators say current safeguards are insufficient and that technology firms must take greater responsibility for ensuring younger users cannot easily sign up to services intended for older audiences.
Major platforms including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X have been contacted by communications regulator Ofcom and the Information Commissioner’s Office (ICO). Both organisations say many platforms still rely heavily on self-declared ages entered at sign-up, a check children can easily bypass.
Research suggests many children aged 10 to 12 already have social media accounts, despite most platforms setting a minimum age of 13. Regulators want companies to explore stronger verification methods, similar to those used by some online services that restrict access to adult content. The ICO has also warned that allowing under-age children to register raises concerns about how their personal data is collected and processed.
Technology firms say they already have safeguards in place. Some companies claim they are using artificial intelligence and facial age-estimation tools to identify younger users, while others are testing new verification systems. Experts, however, argue that stricter age checks are only one part of the solution and that platforms must also design their services with children’s safety as a priority.