This week’s memo read:
You may be getting questions about privacy from customers who saw our recent announcement about Expanded Protections for Children. There is a full FAQ here to address these questions and provide more clarity and transparency in the process. We’ve also added the main questions about CSAM detection below. Please take time to review the below and the full FAQ so you can help our customers.
The iCloud feature has been the most controversial among privacy advocates, some consumers, and even Apple employees. It assigns what is known as a hash key to each of the user’s images and compares those keys with hashes assigned to images in a database of known child sexual abuse material. If a user is found to have such images in their library, Apple will be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.
Apple initially declined to share how many potentially illicit images need to be in a user’s library before the company is alerted, sparking concern among some observers. On Friday, the company said that the initial threshold is about 30 pictures, but that the number could change over time.
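For readers who want a concrete picture of the threshold idea, the sketch below shows hash matching with a cutoff in deliberately simplified form. It is not Apple’s implementation: the actual system uses a perceptual hash (NeuralHash) rather than an ordinary file hash, and the comparison happens through on-device cryptographic protocols rather than plain lookups in the clear. The function names and the 30-image constant here are illustrative only.

```python
import hashlib
from pathlib import Path

# Simplified illustration only. Apple's system uses a perceptual hash
# (NeuralHash) and cryptographic matching, not a plain file hash compared
# in the clear as shown here.
MATCH_THRESHOLD = 30  # the initial threshold Apple cited; it may change over time


def image_hash(path: Path) -> str:
    """Hash an image file's bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(library: list[Path], known_hashes: set[str]) -> int:
    """Count how many images in a user's library match the known-image hash set."""
    return sum(1 for path in library if image_hash(path) in known_hashes)


def should_flag_for_review(library: list[Path], known_hashes: set[str]) -> bool:
    """Alert only once the number of matches reaches the threshold."""
    return count_matches(library, known_hashes) >= MATCH_THRESHOLD
```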
The Cupertino, California-based company also addressed concerns that images unrelated to child sexual abuse could be inserted into the database as a way for governments to spy on users. In response, the company said its database would be made up of images sourced from multiple child-safety organizations, not just the National Center for Missing & Exploited Children (NCMEC), as was initially announced. It also said it will use data from groups in different regions and that an independent auditor will verify the contents of its database.
Apple previously said it would refuse any requests from governments to utilize its technology as a means to spy on users.
—Bloomberg News