Apple and Google will soon be “encouraged” to build nudity-detection algorithms into their software by default, as part of the UK government’s strategy to tackle violence against women and girls, reports the Financial Times.

According to the report, Home Office officials want device operating systems to prevent any nudity from being displayed unless users can verify that they’re adults through biometric checks or official ID.
The proposal is said to target mobile devices initially, but it could extend to desktops. The government reportedly explored making the controls mandatory for devices sold in the UK, but it has apparently decided against that approach for now.
Apple currently offers Communication Safety tools that parents can activate and which detect nude photos and videos in apps like Messages, AirDrop, and FaceTime. However, teenagers can still view flagged images after dismissing an alert, while under-13s must enter a passcode.
Google also provides parental controls through its Family Link feature and includes “sensitive content warnings” in Google Messages. But neither company offers system-wide nudity blocking that extends to third-party apps like WhatsApp.
The proposal is sure to face objections from privacy and civil liberties groups, as well as questions about how effective any such measures would be. When the UK instituted age checks for porn websites earlier this year as part of the Online Safety Act, users got around restrictions using fake photos and VPN services.
The proposals are expected to be officially unveiled in the coming days, according to people familiar with the matter who spoke to FT.
This article, “UK Wants All iPhones to Block Explicit Images Unless You Prove Age,” first appeared on MacRumors.com.
