Apple Unveils Software That Will Scan iPhones for Child Abuse Images and Messages


Last week, Apple announced that it has developed software to scan iPhones for child abuse images and report suspected users to authorities, raising concern among privacy advocates and civil libertarians, who have many questions about the technology’s reach.

In a blog post published on its website on Thursday, Apple unveiled a suite of tools that would allow the company to scan iPhones and other devices for child abuse images and text messages with explicit content, and to report those users to law enforcement.

The software, scheduled for release as part of iOS 15, uses a matching system to compare images on iPhones, and images being uploaded to Apple’s iCloud service, against a database of known child sexual abuse material. Though similar matching technology has been used by companies such as Facebook for years, those companies scan images only after they are uploaded to their servers. Apple’s new system would scan images and messages on users’ devices, an approach known as client-side scanning.
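At a very high level, this kind of matching works by fingerprinting each image and checking the fingerprint against a database of known prohibited material. The sketch below is a deliberately simplified illustration of that idea, not Apple's actual design: Apple's announced system uses a perceptual hash (NeuralHash) and cryptographic protocols so that near-duplicate images still match and the database stays private, whereas this example uses an exact SHA-256 digest purely for clarity. All names and data here are hypothetical.

```python
import hashlib


def image_hash(data: bytes) -> str:
    """Exact-match fingerprint of an image's raw bytes.

    Simplification: real systems use perceptual hashes so that
    resized or re-encoded copies of an image still match; an exact
    cryptographic digest like SHA-256 would miss those.
    """
    return hashlib.sha256(data).hexdigest()


def scan_library(images: list[bytes], known_hashes: set[str]) -> list[int]:
    """Return indices of images whose fingerprint appears in the known set."""
    return [i for i, img in enumerate(images) if image_hash(img) in known_hashes]


# Demo with made-up byte strings standing in for image files.
known = {image_hash(b"known-prohibited-example")}
photos = [b"vacation-photo", b"known-prohibited-example", b"family-photo"]

flagged = scan_library(photos, known)
print(flagged)  # [1]
```

The distinction the article draws is only about *where* this check runs: server-side systems run it after upload, while a client-side system runs the same kind of check on the user's own device before (or regardless of) any upload.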

Apple said there is a “one-in-a-trillion” chance of an account being incorrectly flagged and that each case would be manually reviewed before an account is closed and law enforcement notified.

Despite Apple’s reassurances, the aggressive move against child predators is drawing criticism from privacy advocates. Though they support efforts to combat child abuse, they worry about bulk surveillance of personal devices and the potential for misuse by governments.

At KBN, our attorneys know that new and evolving technologies can add layers of complexity to criminal cases, especially when used by law enforcement in cases involving sex crimes and computer sex crimes. Our firm is keeping track of the latest developments involving both the law and Apple’s new technology.