Apple has announced a raft of new measures aimed at keeping children safe on its platforms and limiting the spread of child sexual abuse images.
As well as new safety tools in iMessage, Siri and Search, Apple is planning to scan users’ iCloud uploads for Child Sexual Abuse Material (CSAM). That’s sure to be controversial among privacy advocates, even if the ends may justify the means.
The company is planning on-device scanning of images that will take place before a photo is uploaded to the cloud. Each image will be checked against known ‘image hashes’ that can detect offending content. Apple says this will ensure the privacy of everyday users is protected.
Should the tech discover CSAM images, the iCloud account in question will be frozen and the images will be reported to the National Center for Missing and Exploited Children (NCMEC), which can then be referred to law enforcement agencies.
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple writes in an explainer.
“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
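To illustrate the general idea of hash matching, here is a heavily simplified Python sketch. It is not Apple’s actual system: the real implementation uses a perceptual “NeuralHash” and private set intersection, so neither the device nor Apple learns the match result directly; this sketch substitutes a plain SHA-256 digest and an open set lookup, and all image bytes and hash values are made up for illustration.

```python
import hashlib

# Hypothetical database of known image hashes (illustrative values only).
# Apple's real system uses perceptual NeuralHash values supplied by NCMEC,
# not plain SHA-256 digests of the file bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-offending-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here simply SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_hash(image_bytes: bytes) -> bool:
    """On-device check performed before upload: compare the image's hash
    against the known-hash set. In Apple's scheme the comparison result is
    sealed inside an encrypted 'safety voucher' rather than returned openly."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_hash(b"holiday-photo-bytes"))          # ordinary photo: False
print(matches_known_hash(b"known-offending-image-bytes"))  # matching hash: True
```

The key difference from this sketch is that a perceptual hash is designed so that resized or lightly edited copies of an image still produce a matching value, and private set intersection keeps the match result cryptographically hidden until a reporting threshold is crossed.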
Elsewhere, the new iMessage tools are designed to keep children safe from online exploitation. If a child receives an image that on-device detection tech deems inappropriate, it will be blurred and the child will be warned and “presented with helpful resources, and reassured it is okay if they do not want to view this photo.”
Depending on the parental settings, parents will be informed if the kid goes ahead and views the image. “Similar protections are available if a child attempts to send sexually explicit photos,” Apple says. “The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.” Again, on-device image detection tech is used.
Finally, new guidance in Siri and Search will provide iPhone and iPad owners with help on staying safe online and filing reports with the relevant authorities.
“Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple adds.
These updates are coming in iOS/iPadOS 15.
The post Apple Child Safety update will scan photos for abusive material, warn parents appeared first on Trusted Reviews.