Published by The Lawfare Institute
It is a truism that bad news drops on Fridays, timed to fly under the radar. The Friday before Labor Day is especially auspicious for hiding problems. By that standard, Apple yesterday seems to have acknowledged a rather large public relations error.
Late on Friday, Apple stated that it would postpone its plans to deploy a system that scanned images on iPhones for child sexual abuse material (CSAM). This client-side scanning (CSS) system was first publicly announced in early August.
To review the bidding, the idea was that a new program, called NeuralHash, would be included in the iOS 15 and macOS Monterey releases, which were due out in a few months. That program (which would not have been optional) would convert each photograph uploaded from a user’s iPhone or Mac to a hash value. Those hashes in turn would be matched against a database of known hashes of CSAM provided to Apple by organizations like the National Center for Missing & Exploited Children (NCMEC). Any positive match would generate an alert to Apple, which would review the alleged matches by hand only after 30 of them had accumulated and, if apt, report the material to NCMEC and thence to law enforcement authorities.
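The matching flow Apple described can be sketched in outline as a threshold check. This is a deliberately hypothetical simplification: NeuralHash is a proprietary perceptual hash, and Apple's actual protocol used cryptographic techniques (private set intersection and threshold secret sharing) so that neither Apple nor the device learned anything about individual matches until the threshold was crossed. A plain set lookup stands in for all of that machinery here.

```python
# Hypothetical sketch of threshold-based hash matching, loosely modeled on
# Apple's public description. Real NeuralHash values and the cryptographic
# matching protocol are replaced by plain string hashes and a set lookup.

REVIEW_THRESHOLD = 30  # matches required before human review, per Apple's announcement


def scan_photos(photo_hashes, known_csam_hashes, threshold=REVIEW_THRESHOLD):
    """Count how many of a device's photo hashes appear in the known-CSAM
    hash database, and report whether the human-review threshold is met."""
    match_count = sum(1 for h in photo_hashes if h in known_csam_hashes)
    return match_count, match_count >= threshold


# Illustrative (fabricated) data: 31 of the device's 100 hashes match.
device_hashes = [f"hash{i}" for i in range(100)]
database = {f"hash{i}" for i in range(31)}

count, flag_for_review = scan_photos(device_hashes, database)
# → count is 31, so flag_for_review is True: only now would a human reviewer see the matches.
```

The point the threshold was meant to illustrate is that no single false positive would ever reach a human reviewer; only an accumulation of 30 alleged matches would.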
While the goal was laudable, to be sure, many privacy advocates were concerned about the system’s intrusiveness and its mandatory nature. Many, including me, wrote reviews of the proposal that ranged from highly critical to cautiously doubtful.
It now appears that the degree of controversy was too great for Apple to withstand and that it wanted to go back to the drawing board. In its statement announcing the pause, Apple said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.” It remains to be seen exactly what comes next, but one thing is for sure: this story isn’t over by a long shot.