Apple is temporarily hitting the pause button on its controversial plans[1]
to screen users' devices for child sexual abuse material (CSAM)
after sustained blowback over worries that the tool could be
weaponized for mass surveillance and erode user privacy.
“Based on feedback from customers, advocacy groups, researchers,
and others, we have decided to take additional time over the coming
months to collect input and make improvements before releasing
these critically important child safety features,” the iPhone maker
said[2]
in a statement on its website.
The changes were originally slated to go live with iOS 15 and
macOS Monterey later this year.
In August, Apple detailed several new features intended to help
limit the spread of CSAM on its platform, including scanning users'
iCloud Photos libraries for illicit content, Communication Safety
in the Messages app to warn children and their parents when receiving
or sending sexually explicit photos, and expanded guidance in Siri
and Search when users search for CSAM-related topics.
The so-called NeuralHash technology would have worked by
matching photos on users' iPhones, iPads, and Macs just before they
were uploaded to iCloud Photos against a database of known child
sexual abuse imagery maintained by the National Center for Missing
and Exploited Children (NCMEC), without Apple having to possess the
images or glean their contents. iCloud accounts that crossed a set
threshold of 30 matching hashes would then be manually reviewed,
have their profiles disabled, and be reported to law enforcement.
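The matching flow described above can be sketched in a few lines of Python. This is only an illustration, not Apple's implementation: the `neural_hash` stand-in below uses SHA-256 purely as a placeholder (the real NeuralHash is a perceptual hash that maps visually similar images to the same value, which a cryptographic hash does not), and the database and account plumbing are invented for the example.

```python
import hashlib

# Minimal sketch of threshold-based hash matching, under the assumptions
# stated above. Everything here is illustrative, not Apple's code.

MATCH_THRESHOLD = 30  # the reporting threshold Apple described

# Hashes of known CSAM as supplied by NCMEC (hashes only, never images).
known_csam_hashes: set[str] = set()

def neural_hash(image_bytes: bytes) -> str:
    """Placeholder for the proprietary perceptual hash (NOT NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photo_library: list[bytes]) -> int:
    """Hash each photo just before iCloud upload and count database hits."""
    return sum(
        1 for photo in photo_library
        if neural_hash(photo) in known_csam_hashes
    )

def should_flag_for_review(photo_library: list[bytes]) -> bool:
    """An account is escalated to manual review only past the threshold."""
    return count_matches(photo_library) >= MATCH_THRESHOLD
```

The property the sketch preserves is that the matcher only ever compares hash values, never image contents, and nothing is surfaced to a human reviewer until an account crosses the threshold.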
The measures aimed to strike a compromise between protecting
customers’ privacy and meeting growing demands from government
agencies in investigations pertaining to terrorism and child
pornography — and by extension, offer a solution to the so-called
“going dark[3]” problem of criminals
taking advantage of encryption protections to cloak their
contraband activities.
However, the proposals were met with near-instantaneous
backlash, with the Electronic Frontier Foundation (EFF) calling out
the tech giant for attempting to create an on-device surveillance
system, adding that "a thoroughly documented, carefully thought-out,
and narrowly-scoped backdoor is still a backdoor."
But in an email[4]
circulated internally at Apple, child safety campaigners dismissed
the complaints of privacy activists and security researchers as the
"screeching voice of the minority."
Apple has since stepped in to assuage potential concerns arising
out of unintended consequences, pushing back against the
possibility that the system could be used to detect other kinds of
images at the request of authoritarian governments. "Let us be
clear, this technology is limited to detecting CSAM stored in
iCloud and we will not accede to any government's request to expand
it," the company said.
Still, that did little to allay fears that client-side scanning
could amount to a troubling invasion of privacy, that it could be
expanded to further abuses, and that it could provide a blueprint
for breaking end-to-end encryption. It also didn't help that
researchers were able to create "hash collisions[5]" (i.e., false
positives) by reverse-engineering the algorithm, producing two
completely different images that generated the same hash value and
thus effectively tricking the system into treating the images as
identical when they are not.
“My suggestions to Apple: (1) talk to the technical and policy
communities before you do whatever you’re going to do. Talk to the
general public as well. This isn’t a fancy new Touch Bar: it’s a
privacy compromise that affects 1 billion users,” Johns Hopkins
professor and security researcher Matthew D. Green tweeted[6].
“Be clear about why you’re scanning and what you’re scanning.
Going from scanning nothing (but email attachments) to scanning
everyone’s private photo library was an enormous delta. You need to
justify escalations like this,” Green added.
References
1. controversial plans (thehackernews.com)
2. said (www.apple.com)
3. going dark (www.fbi.gov)
4. email (twitter.com)
5. hash collisions (www.theverge.com)
6. tweeted (twitter.com)