Apple on Thursday unveiled changes to iPhones designed to catch nude photos that are of children or are sent to children, a move that will likely please parents and the police but that is already worrying privacy watchdogs.
Apple said that later this year, iPhones will begin using complex technology to spot images of child sexual abuse, commonly known as child pornography, that users upload to iCloud, Apple's cloud-storage service. Apple also said it would soon let parents turn on a feature that can flag when their children send or receive any nude photos in a text message.
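Apple did not spell out the mechanics in its announcement, but detection systems of this kind generally work by computing a compact "fingerprint" of each photo and comparing it against a database of fingerprints of known abuse images. The sketch below is only an illustration of that general idea: it uses a simple "average hash" rather than Apple's proprietary NeuralHash, assumes the Pillow imaging library, and every name in it is hypothetical.

```python
from PIL import Image

# Hypothetical set of fingerprints of known abuse images, of the kind
# maintained by child-safety organizations. Apple's real system uses its
# proprietary NeuralHash and cryptographic matching; this is a stand-in.
KNOWN_BAD_HASHES: set[int] = set()


def average_hash(path: str, size: int = 8) -> int:
    """Compute a simple 64-bit perceptual "average hash" of an image.

    The image is shrunk to an 8x8 grayscale grid; each pixel contributes
    one bit indicating whether it is brighter than the mean. Visually
    similar images tend to produce similar bit patterns, so near-duplicates
    of a known image can still be matched.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def matches_known_image(path: str) -> bool:
    """Check an uploaded photo's fingerprint against the known-hash set."""
    return average_hash(path) in KNOWN_BAD_HASHES
```

A simple hash like this one is easy to evade by cropping or recompressing an image, which is why production systems rely on more robust perceptual hashes; the comparison step, however, works the same way.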
Apple said it had designed the new features in a way that protects the privacy of users, including by ensuring that Apple never sees or finds out about any nude images exchanged in a child's text messages. The scanning is done on the child's device, and notifications are sent only to parents' devices. Apple provided statements from cybersecurity experts and child-safety groups praising its approach.
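The sketch below illustrates the flow Apple described for Messages, under the stated constraints: the nudity check runs entirely on the child's device, and any alert travels only to the parents' devices. The classifier and notification functions here are stand-ins, not real Apple interfaces.

```python
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.9  # hypothetical confidence cutoff


@dataclass
class IncomingPhoto:
    data: bytes


def nudity_score(photo: IncomingPhoto) -> float:
    """Stand-in for an on-device machine-learning classifier.

    A real model would return a confidence in [0, 1]; nothing is
    uploaded anywhere to compute it.
    """
    return 0.0


def blur_and_warn_child(photo: IncomingPhoto) -> None:
    print("Photo blurred; warning shown on the child's device.")


def notify_parent_devices(photo: IncomingPhoto) -> None:
    print("Alert sent to the parents' devices; Apple is not notified.")


def handle_incoming_photo(photo: IncomingPhoto, parental_controls_on: bool) -> None:
    if not parental_controls_on:
        return  # the feature is opt-in; nothing is scanned otherwise
    if nudity_score(photo) >= NUDITY_THRESHOLD:
        blur_and_warn_child(photo)
        notify_parent_devices(photo)
```

The design choice worth noting is where the computation happens: because both the scoring and the alerting occur on the family's own devices, no image or verdict has to pass through Apple's servers.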
But other cybersecurity experts were still concerned. Matthew D. Green, a cryptography professor at Johns Hopkins University, said Apple’s new features set a dangerous precedent by creating surveillance technology that could be exploited by law enforcement or governments.
“They’ve been selling privacy to the world and making people trust their devices,” Mr. Green said. “But now they’re basically capitulating to the worst possible demands of every government. I don’t see how they’re going to say no from here on out.”
The mixed reviews of Apple’s new features show the thin line that technology companies must walk between aiding public safety and ensuring customer privacy. Law enforcement officials for years have complained that technologies like smartphone encryption have hamstrung criminal investigations, while tech executives and cybersecurity experts have argued that such encryption is crucial to protect people’s data and privacy.
Michael H. Keller and Gabriel J.X. Dance contributed reporting.