November 30, 2021

Apple delays plan to scan iPhones for child abuse images – CNET


Apple on Friday delayed a set of features designed to protect children from sexual predators on its iPhones, iPads and Mac computers. The move follows criticism from privacy advocates and security researchers who worried the company's technology could be twisted into a tool for surveillance.

In a statement, Apple said it would delay its new tools to identify images of child abuse on its devices as well as features to warn children about sexualized messages sent by SMS or iMessage. Apple had announced the tools last month.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” a company spokesman said. The company didn’t respond to a request for further comment about when it plans to reintroduce these technologies.


It was a surprise reversal by Apple, which had argued for weeks that its new features were built in thoughtful and privacy-protecting ways. Apple for years has promised that its devices and software are designed with privacy in mind, positioning them as an alternative to devices built with Google's Android software. The company even dramatized that promise with an ad hung just outside the convention hall of the 2019 Consumer Electronics Show that read, "What happens on your iPhone stays on your iPhone."

“We at Apple believe privacy is a fundamental human right,” Apple CEO Tim Cook has often said.

Still, that didn't calm policy and advocacy groups: shortly after the features were announced, nearly 100 of them signed an open letter asking Apple to reconsider implementing the technology.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the group said in the letter, whose signatories include the Center for Democracy and Technology, the American Civil Liberties Union, the Electronic Frontier Foundation and Privacy International.

The technology encountered resistance in part because it would warn parents and children when they might be sending or receiving a sexually explicit photo through the Messages app. Privacy experts, who agree that fighting child exploitation is a worthy goal, worried that Apple's moves might open the door to wider uses that could, for example, put political dissidents and other innocent people in harm's way.

“Apple’s plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history,” wrote Evan Greer, director of the advocacy group Fight for the Future, in a statement. “It’s encouraging that the backlash has forced Apple to delay this reckless and dangerous surveillance plan, but the reality is that there is no safe way to do what they are proposing. Apple’s current proposal will make vulnerable children less safe, not more safe. They should shelve it permanently.”

Proponents of Apple’s plans were disappointed by the company’s pause on the technology’s rollout, arguing that the tech giant had taken a thoughtful approach to a tough problem.