August, 2021

Apple iPhone Surveillance

Recent news that Apple plans to scan Apple devices for images of Child Sexual Abuse Material (CSAM) has caused concern among some people, but the program will not be a widespread search through either iCloud or users’ phones. The AI system, called “neuralMatch,” creates a “hash,” or “computed value,” of each image on a user’s device. The hash is then compared against hashes in a database of known CSAM, and, if matches are found, the user’s iCloud account can be shut down and the material reported to the National Center for Missing and Exploited Children (NCMEC). The technology also scans images sent or received by children under 13 through the iMessage system; if a sexually explicit image is found, the child is alerted that parents will be notified if the child chooses to view the material. For children between 13 and 18, an alert is sent to the child, but apparently there is no parental notification as with the younger children.
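The hash-and-compare flow described above can be sketched in a few lines of Python. This is an illustrative sketch only: Apple’s actual “NeuralHash” is a perceptual hash designed to match visually similar images even after resizing or recompression, whereas the cryptographic SHA-256 used here (for simplicity) matches only byte-identical files. The function names and the sample data are hypothetical.

```python
import hashlib

def compute_hash(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's 'computed value'.

    Stand-in for a perceptual hash such as Apple's NeuralHash; SHA-256 is
    used here only to illustrate the compare-against-a-database idea.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def find_matches(images: dict[str, bytes], known_hashes: set[str]) -> list[str]:
    """Return names of images whose hashes appear in the known-CSAM database."""
    return [name for name, data in images.items()
            if compute_hash(data) in known_hashes]

# Hypothetical hash database (in the real system, supplied by NCMEC and
# other child-safety organizations) and a hypothetical photo library.
known = {compute_hash(b"flagged-example")}
photos = {"vacation.jpg": b"harmless-pixels", "other.jpg": b"flagged-example"}
print(find_matches(photos, known))  # ['other.jpg']
```

Because only hashes are compared, the database holds no actual imagery, and a device can check its own photos against the list without uploading them in the clear.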

Some critics have also voiced concern about a “slippery slope” of removing Apple’s well-known privacy protections; however, Apple stated that there are multiple levels of encryption in the process and that users’ private communications remain private. Also, the system only works with “CSAM image hashes provided by NCMEC and other child safety organizations.” Critics also fear that Apple’s system can generate false positives; however, Apple stated there is only a “one in one trillion chance per year” of a false match. If users think they have been wrongly flagged as having CSAM, they can file an appeal, Apple said.

Sources:  Lucas Ropek, “Apple Says It Won't Let the Government Turn Its Child Abuse Detection Tools Into a Surveillance Weapon,” August 9, 2021:
Thomas Reed, “Apple’s search for child abuse imagery raises serious privacy questions,” August 6, 2021:
Apple blog post:
Apple FAQ, August, 2021:

Facial Recognition in Retail Stores

A recent article highlighted that some well-known companies are using biometric software, such as facial recognition, to protect assets, boost sales, and track employees. Customers and employees frequently are not aware that their personal data is being tracked and recorded or that they may be listed on a watch list. A new website, created by the non-profit organization Fight for the Future, tracks companies that are using the technology, have pledged not to use it, or might use it. The site indicates that four companies – Apple stores, Macy’s, Albertsons, and H.E.B. Grocery – are currently using facial recognition software in stores. Nineteen companies, including Walmart, Kroger, Target, Lowe’s, Home Depot, Costco, and Starbucks, have said they will not use the technology. Biometric tracking is largely unregulated, but some states and municipalities are creating protections for consumers. For example, in Illinois, which has a biometric protection law, class action lawsuits have been filed against businesses, especially following a 2019 decision by that state’s Supreme Court that plaintiffs suing businesses need not show that the collection of biometric data caused actual harm. In New York, a new law requires businesses to inform customers when biometric data is collected.

Critics of biometric scanning claim that the algorithms used in the software can lead to inaccurate and racially and gender-biased results.

Sources:  Rebecca Heilweil, “From Macy’s to Albertsons, facial recognition is already everywhere,” July 21, 2021:
Fight for the Future, retail facial recognition database:

by Neil Leithauser
Associate Editor