Why Is Apple Receiving Major Backlash for Its New Child Abuse Prevention Software?
In a now-infamous tweet, Will Cathcart, the head of WhatsApp, voiced his concern over Apple’s announcement that it would implement scanning software intended to detect child abuse and exploitation.
Cathcart made his opinion clear, stating:
“I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world.”
While he goes on to condemn child pornography and to voice his support for prevention measures, Cathcart believes that Apple is simply going about things the wrong way — so we decided to take a closer look at this software and what it means for the future of digital privacy…
What does the scanning software do?
In addition to a separate feature that detects and blurs sexually explicit images sent to children through Messages, Apple announced scanning software (dubbed NeuralHash) that can fingerprint photos on a user’s device before they are uploaded to iCloud Photos, flagging images that may contain child sexual abuse material (CSAM).
These fingerprints are called hashes, and the database of known-CSAM hashes that Apple’s software matches against was provided directly by the National Center for Missing & Exploited Children (NCMEC).
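To make that concrete, here’s a minimal sketch of hash matching using invented 64-bit fingerprints and a made-up distance threshold. None of this is Apple’s actual code (NeuralHash is a neural-network-based perceptual hash, and the real matching runs under cryptographic protections so the device never learns the database), but it shows the basic idea: similar images produce similar fingerprints, so a photo can be checked against a blocklist without anyone looking at it.

```swift
// Toy illustration of hash matching. Every value, name, and threshold
// here is invented for this sketch; Apple's real NeuralHash is a
// neural-network-based perceptual hash, and the real matching happens
// under cryptographic protections (private set intersection) so the
// device never learns the contents of the database.

// A perceptual hash condenses an image into a short fingerprint;
// visually similar images produce similar fingerprints.
typealias PerceptualHash = UInt64

// Hypothetical fingerprints of known CSAM images (in reality, the
// database is supplied in blinded form by NCMEC).
let knownBadHashes: Set<PerceptualHash> = [
    0xDEADBEEF12345678,
    0x0123456789ABCDEF,
]

// Hamming distance: how many bits differ between two fingerprints.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

// A photo "matches" if its fingerprint sits within a few bits of a
// known-bad one, which tolerates resizing or re-compression.
func matchesKnownCSAM(_ photoHash: PerceptualHash, maxDistance: Int = 3) -> Bool {
    knownBadHashes.contains { hammingDistance($0, photoHash) <= maxDistance }
}

// A fingerprint one bit away from a database entry still matches.
print(matchesKnownCSAM(0xDEADBEEF12345679))  // true
```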
If enough of a user’s photos match known CSAM hashes, the account is escalated to a live person who reviews and confirms the content. Confirmed CSAM is then reported to NCMEC, which works with law enforcement, and the account is disabled.
While Apple claims that the chances of false flags are remote (its technical summary puts the odds of incorrectly flagging a given account at less than one in one trillion per year), there is an appeal process in place for users who feel that they’ve been mistakenly identified by the system.
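One detail worth spelling out: in Apple’s published design, no single match triggers review. An account only reaches a human reviewer after a threshold number of matches (Apple has said on the order of 30). Here’s a toy sketch of that gating logic, with invented names and numbers:

```swift
// A minimal sketch of threshold-gated review. The type and numbers
// are invented for illustration; Apple's actual design uses
// "threshold secret sharing," so match results stay cryptographically
// unreadable until the threshold is crossed.
struct AccountScanState {
    // Apple has said the threshold is on the order of 30 matches.
    let reviewThreshold = 30
    private(set) var matchCount = 0

    // Record one photo's scan result; return whether the account has
    // accumulated enough matches to reach human review.
    mutating func record(matchedKnownCSAM: Bool) -> Bool {
        if matchedKnownCSAM { matchCount += 1 }
        return matchCount >= reviewThreshold
    }
}

// A single false positive never reaches a reviewer.
var state = AccountScanState()
let needsReview = state.record(matchedKnownCSAM: true)
print(needsReview)  // false: one match is far below the threshold
```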
So, why is this upsetting some digital privacy experts?
The condemnation of child predators is a universal sentiment.
However, some digital privacy experts are concerned that this type of software opens the door for government entities to gain direct access to data stored on personal devices.
It’s not necessarily NeuralHash’s purpose that has people on edge, but its process.
NeuralHash is a powerful algorithm, and the same technique could be replicated and twisted to invade personal privacy on a much larger scale. It’s also possible that the software could be turned against Apple users: because flagging hinges on hash matches rather than on the images themselves, an attacker who could craft images with colliding hashes might make innocent accounts appear to contain CSAM, locking users out.
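To see why that attack is even conceivable, consider that any hash squeezes an enormous space of possible images into a short fingerprint, so by the pigeonhole principle distinct inputs must sometimes collide. The deliberately tiny toy hash below (nothing like NeuralHash) finds a collision by brute force; real attacks on perceptual hashes are more sophisticated, crafting an innocuous-looking image whose fingerprint matches a targeted one.

```swift
// Toy demonstration (an invented hash, not NeuralHash) of why hash
// collisions are unavoidable: a hash compresses a large input space
// into a small fingerprint, so distinct inputs must sometimes collide.

// A deliberately tiny 8-bit "hash" so collisions are easy to find.
func toyHash(_ input: String) -> UInt8 {
    input.utf8.reduce(0) { ($0 &* 31) &+ $1 }
}

// Brute-force search: find two different inputs with the same hash.
var seen: [UInt8: String] = [:]
for i in 0... {
    let candidate = "image-\(i)"
    let h = toyHash(candidate)
    if let earlier = seen[h], earlier != candidate {
        print("Collision: \"\(earlier)\" and \"\(candidate)\" both hash to \(h)")
        break
    }
    seen[h] = candidate
}
```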
It also has the potential to set a dangerous precedent for what institutions and corporations deem “inappropriate” in the future.
Apple has denied that NeuralHash creates a backdoor for governments to gain additional access to our data, and it points to the use of real people for CSAM confirmation as a safeguard against false flags.
The scanning software will roll out to U.S. consumers within the next few months — and only time will tell what the long-term impact looks like.
Join our community!
Just follow us on Medium, send us some love on Facebook, and find us on Twitter under the username: @keepitcloaked
If you’d like to sign up to participate in a future beta testing cohort, we’d love to have you! Just click here to join our waitlist and mailing list and let the fun begin — no joke, we really are fun!