One of the main concerns people have about the feature is that governments might soon ask Apple to start scanning for other types of imagery beyond CSAM. Some have pointed out that the tool could potentially be used to scan for other content, such as images of political opponents or of ethnic and minority groups that face persecution in their countries.
According to Apple, “Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”
The company adds, “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.”
Of course, it remains to be seen how long Apple will hold to these commitments, but for now, hopefully this will assuage some of your concerns.