Apple on Thursday announced new features that will scan iPhone and iPad users’ photos to detect and report large collections of child sexual abuse images stored on its iCloud servers.
“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” Apple said in a statement announcing the updates.
“This program is ambitious, and protecting children is an important responsibility. Our efforts will evolve and expand over time.”
Apple said the detection work happens on the device itself, and it insisted that only users with large collections of iCloud-uploaded images matching a database of known child sexual abuse material would cross the threshold that allows some of their photos to be decrypted and analyzed by Apple.
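As described, the matching runs on the device and only a threshold number of database hits lets Apple decrypt anything for review. The Swift sketch below illustrates that flow in the simplest possible terms; every name in it (plain-string hashes, SafetyVoucher, matchThreshold) is hypothetical and not Apple’s API, and the real system additionally hides individual match results cryptographically until the threshold is crossed.

```swift
import Foundation

// Highly simplified, hypothetical sketch of threshold-based matching.
// Names (SafetyVoucher, matchThreshold, knownCSAMHashes) are illustrative only;
// Apple's actual design conceals per-photo match results until the account
// crosses the reporting threshold.

struct SafetyVoucher {
    let photoID: UUID
    let matchedKnownHash: Bool   // in the real design this bit is cryptographically hidden
}

let matchThreshold = 30          // illustrative; the article does not state the actual number

/// Runs on the device: hash each photo being uploaded and record whether
/// it matches the database of known CSAM hashes.
func makeVouchers(photoHashes: [UUID: String],
                  knownCSAMHashes: Set<String>) -> [SafetyVoucher] {
    photoHashes.map { (id, hash) in
        SafetyVoucher(photoID: id, matchedKnownHash: knownCSAMHashes.contains(hash))
    }
}

/// Server side: only if the number of matching vouchers crosses the
/// threshold can the flagged photos be decrypted for human review.
func accountCrossesThreshold(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter { $0.matchedKnownHash }.count >= matchThreshold
}
```

The key design point in Apple’s description is that nothing about an individual photo is revealed below the threshold; only an account that accumulates enough matches against the known database becomes eligible for decryption and analysis.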
Noting that possessing CSAM “is an illegal act,” Apple’s Chief Privacy Officer Erik Neuenschwander said on a press call Thursday that “the system was designed and trained” to detect images from the National Center for Missing and Exploited Children’s CSAM database.
Neuenschwander said people whose images are flagged to law enforcement would not be notified, but their accounts would be disabled.
“The fact that your account is disabled is a noticeable act to you,” Neuenschwander said.
Apple said its mobile operating systems will be updated with the new child safety features “later this year.”
“Apple’s expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material,” said John Clark, president and CEO of the National Center for Missing & Exploited Children.
Some privacy experts expressed concern about the new system.
“The theory is that you will trust Apple to only include really bad images,” tweeted Matthew D. Green, an associate professor of cryptography at Johns Hopkins University. “Say, images curated by the National Center for Missing and Exploited Children (NCMEC). You’d better trust them, because trust is all you have.”
The update also changes Apple’s Messages app, which will use “on-device machine learning to analyze image attachments and determine if a photo is sexually explicit” for minors whose devices use parental control features.
The Messages app will intervene if the on-device model determines the minor may be sending or receiving an image or video showing “the private body parts that you cover with bathing suits,” according to slides provided by Apple.
“When a child receives this type of content the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo,” Apple said in a press release.
Informational screenshots show the iPhone stopping kids from sending sensitive images, warning that “sensitive photos and videos can be used to hurt you” and that the “person in this might not want it seen.”
“As an additional precaution the child can also be told that, to make sure they are safe, their parents will get a message if they do view it,” the company added. “Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent and the parents can receive a message if the child chooses to send it.”
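The Messages flow Apple describes reduces to a simple on-device decision: a classifier labels the attachment, and a positive result triggers the blur, the warning, and the optional parent notification. The Swift sketch below is purely illustrative; none of the names correspond to real Apple APIs, and the classifier is a stub.

```swift
import Foundation

// Hypothetical sketch of the Messages intervention flow described above.
// `isSexuallyExplicit` stands in for the on-device machine-learning classifier;
// all types and names here are illustrative, not Apple API.

enum InterventionResult {
    case deliverNormally
    case blurredWithWarning(notifyParentsIfViewed: Bool)
}

struct ChildAccount {
    let parentalNotificationsEnabled: Bool
}

/// Placeholder for the on-device ML model that analyzes image attachments.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    // A real classifier would run entirely on the device;
    // returning false here keeps the sketch self-contained.
    return false
}

/// Decide, on the device, how to present an incoming image to a child account.
func handleIncomingImage(_ imageData: Data, for account: ChildAccount) -> InterventionResult {
    guard isSexuallyExplicit(imageData) else {
        return .deliverNormally
    }
    // The photo is blurred, the child is warned and pointed to resources,
    // and parents may be messaged if the child chooses to view it.
    return .blurredWithWarning(notifyParentsIfViewed: account.parentalNotificationsEnabled)
}
```

Because the analysis happens on the device, the decision to blur and warn is made locally; per Apple’s description, the only outward signal is the optional message to parents when the feature is enabled.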