Your iPhone is private and secure, at least that's what Apple tells you, and for the most part it's true. However, according to a tweet by Matthew Green, who teaches cryptography at Johns Hopkins University, Apple could soon introduce a photo-hashing feature meant to identify photos of child abuse.
But how does this work? Is Apple somehow "spying" on our phones? In this instance, no. Apple will apparently use client-side photo hashing, meaning that instead of uploading your photos and matching them against a database on its servers, Apple will download a set of fingerprints (hashes of known abuse images) to your device and compare them against the photos in your camera roll.
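To make the client-side idea concrete, here is a minimal sketch of on-device hash matching. It uses exact SHA-256 digests purely for illustration; real systems of this kind rely on perceptual hashes (such as Microsoft's PhotoDNA) so that resized or re-encoded copies still match, and nothing below reflects Apple's actual, still-unannounced implementation. The fingerprint set and file paths are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprint set the device would download from the vendor.
# Real deployments would use perceptual hashes, not exact SHA-256 digests.
DOWNLOADED_FINGERPRINTS = {
    hashlib.sha256(b"placeholder bytes of a known image").hexdigest(),
}

def fingerprint(photo_path: Path) -> str:
    """Return a SHA-256 digest of the photo's raw bytes."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def scan_camera_roll(camera_roll: Path) -> list[Path]:
    """Compare every local photo against the downloaded fingerprint set.

    The comparison happens entirely on-device; only matches (if any)
    would ever be surfaced for further review.
    """
    return [
        photo
        for photo in camera_roll.glob("*.jpg")
        if fingerprint(photo) in DOWNLOADED_FINGERPRINTS
    ]

if __name__ == "__main__":
    matches = scan_camera_roll(Path("~/Pictures").expanduser())
    print(f"{len(matches)} photo(s) matched the fingerprint set")
```

The key property is that the photos themselves never leave the device during matching; only the fingerprint list travels the other way.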
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
If no match is found, nothing happens, but if one is, the photo will presumably be flagged and sent to a human moderator for further review. Apple hasn't officially announced anything yet, so it is unclear exactly how this will work, and more importantly, what the implications are and how users will feel about it.
Green points out some potential complications, such as what happens if a government gains control of these "fingerprints": beyond searching for child abuse images, the same mechanism could potentially be used to crack down on activists and political opponents.
As for those concerned about their privacy, 9to5Mac notes that photos uploaded and stored in iCloud Photos are not end-to-end encrypted to begin with. While they are stored in encrypted form on Apple's servers, Apple holds the decryption keys, which it would have to turn over to law enforcement if served with a subpoena. Basically, while the intentions are good, it is a complicated system whose details need to be worked out properly.
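The distinction between the two storage models matters here, so a toy illustration of key custody may help. This sketch uses the `cryptography` package's Fernet recipe and is only an illustration of who holds the keys, not Apple's actual protocol.

```python
# Toy contrast between the two storage models discussed above,
# using the `cryptography` package (pip install cryptography).
# This illustrates key custody only; it is not Apple's protocol.
from cryptography.fernet import Fernet

photo = b"raw photo bytes"

# iCloud-style model: data is encrypted at rest, but the *provider*
# generates and keeps the key, so it can decrypt under legal compulsion.
provider_key = Fernet.generate_key()
stored_ciphertext = Fernet(provider_key).encrypt(photo)
assert Fernet(provider_key).decrypt(stored_ciphertext) == photo  # provider can read

# End-to-end model: the key never leaves the user's device, so the
# provider stores ciphertext it has no way to decrypt.
device_key = Fernet.generate_key()  # stays on the device
e2e_ciphertext = Fernet(device_key).encrypt(photo)
# Without device_key, the provider (or anyone serving the provider
# with a subpoena) holds only opaque bytes.
```

In the first model a subpoena to Apple yields readable photos; in the second it yields ciphertext Apple itself cannot open, which is why the end-to-end question keeps coming up in this debate.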