The new feature lands later this year and has already provoked controversy, with security and privacy experts warning of potential threats
Apple plans to run a new system on iPhones that scans photos to spot child abuse images. The new feature will roll out later this year and, in the first stage, will be tested only in the US.
The company introduced the new system, called “Neural Match”, to US academics, and later confirmed the feature at a press conference. As Apple explained there, the new service will constantly scan photos stored on users’ iPhones and in iCloud and convert them into a “digit sequence”. Those numbers will then be compared against a database of known child sexual abuse photos. When a child abuse photo is detected, it will be blurred automatically and the child’s parents will be notified.
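The matching step described above can be sketched in a few lines. This is an illustrative toy only: Apple’s real system reportedly uses a perceptual hash (“NeuralHash”) that tolerates resizing and re-encoding, whereas the stand-in below uses an ordinary cryptographic hash, and the database contents here are invented placeholders.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images
# (placeholder values for illustration only).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length digit sequence.

    A cryptographic hash is used here as a stand-in; the real system
    uses a perceptual hash so near-duplicates map to the same value.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # The comparison reveals only whether the fingerprint appears in
    # the database, not the content of the photo itself.
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

The design point the sketch captures is that only fingerprints, never the photos themselves, are compared against the database.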
In a post on its website outlining the updates, the company said that its goal is “to create technology that empowers people and enriches their lives — while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material”. Apple also emphasized that the new feature was developed in collaboration with child safety experts and that safety measures have been taken to protect users’ data and photos.
The company also announced a new communication safety tool that will automatically warn users when they are about to receive or send an explicit image via Messages. This adds to Apple’s effort to create a safe online space for children. “The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos,” the company explained. “Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages”.
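The on-device flow Apple describes can be sketched roughly as follows. Everything here is an assumption for illustration: the `explicit_score` stub stands in for a local machine-learning classifier, and the `0.9` threshold is invented; only the overall shape (analyze locally, blur and warn above a threshold, send nothing to Apple) comes from the announcement.

```python
from dataclasses import dataclass

@dataclass
class Attachment:
    """A message attachment as raw image bytes."""
    image_bytes: bytes

def explicit_score(attachment: Attachment) -> float:
    # Placeholder for an on-device ML model that returns a
    # probability in [0, 1]; a real classifier would run locally.
    return 0.0

def handle_incoming(attachment: Attachment, threshold: float = 0.9) -> str:
    # All analysis happens on the device; the content never leaves it.
    if explicit_score(attachment) >= threshold:
        return "blurred"  # blur the image and warn the child
    return "shown"
```

The key property the sketch illustrates is that the decision is made entirely on the device, which is how the feature can work without Apple gaining access to message content.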
Respected security experts and privacy advocates have expressed their concerns about Apple’s new system, warning that it could lead to government surveillance of phones and laptops.
Former NSA contractor Edward Snowden is among those accusing Apple, arguing that no matter how well-intentioned the company might be, it is effectively rolling out mass surveillance around the world. “No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs, without asking,” Snowden wrote on Twitter.
Ross Anderson, professor of security engineering at the University of Cambridge, described the idea behind the new feature as “absolutely appalling” to the Financial Times, warning that “it is going to lead to distributed bulk surveillance of our phones and laptops”.
In a series of tweets, Matthew Green, a cryptography professor at Johns Hopkins University, explained the implications of Apple’s new feature. “Initially I understand this will be used to perform client-side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems,” he pointed out. “The ability to add scanning systems like this to E2E messaging systems has been a major ‘ask’ by law enforcement the world over,” he added.
Echoing Professor Green, Alan Woodward, a computing expert at the University of Surrey, tweeted that the new feature could be a “double-edged sword: the road to hell is paved with good intentions” and called for more public discussion before it launches.
On the other side, John Clark, president and CEO of the National Center for Missing & Exploited Children (NCMEC), said that Apple’s new tools are a step in the right direction and can help in the fight against child abuse. “With so many people using Apple products, these new safety measures have the lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” he said. “At the National Center for Missing & Exploited Children, we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known”.
The former Home Secretary Sajid Javid tweeted that he is “delighted to see Apple taking meaningful action to tackle child sexual abuse, and adopting one of the ideas set out by my commission this year”. He also underlined that “it is time for others – especially Facebook – to follow their example”.
Apple said its new feature will launch in the US first, but the company didn’t clarify if or when it would become available internationally.