According to a statement in the legal section of Apple’s website, “Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain.”
The company adds, “As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review.” The company notes that accounts that contain such material will be disabled.
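Apple has not published the technical details of its image matching system, but the “electronic signatures” it describes are commonly implemented as hash lookups against a database of known material. The sketch below is a simplified, hypothetical illustration of that general approach; the database, its placeholder entry, and the function names are all assumptions, and real deployments use perceptual hashes such as Microsoft’s PhotoDNA (which survive resizing and re-encoding) rather than the exact cryptographic hash used here for brevity.

```python
import hashlib

# Hypothetical database of signatures of known illegal images, of the kind
# maintained by clearinghouses such as NCMEC. The single entry below is a
# placeholder value, not a real signature.
KNOWN_SIGNATURES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}


def signature(image_bytes: bytes) -> str:
    """Compute a signature for an uploaded image.

    A plain SHA-256 is used here only to keep the sketch self-contained;
    production systems use perceptual hashes that tolerate re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known signature.

    In the process Apple describes, a positive match would be queued for
    individual human review rather than acted on automatically.
    """
    return signature(image_bytes) in KNOWN_SIGNATURES


if __name__ == "__main__":
    # A benign upload that matches nothing in the database.
    print(scan_upload(b"holiday photo bytes"))  # False
```

Note that a match in such a system only flags the upload; consistent with Apple’s statement, each flagged item would still pass individual review before an account is disabled.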
It is unclear whether other companies run similar scans on photos uploaded to their cloud services, but Apple itself has now confirmed the practice. While this is no doubt a good thing, it does highlight that items stored in the cloud might not be as private as you think.
Source: Telegraph