Apple Inc. said it will launch new software later this year that will analyze photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.
As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to see if they are explicit.
Apple is also adding features to its Siri digital voice assistant that intervene when users search for child abuse material.
The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
How will it work?
The system works by comparing pictures to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.
Those images are translated into “hashes”, numerical codes that can be “matched” to an image on an Apple device.
If the number of matching photos in a user’s account crosses a threshold, the instances will be manually reviewed by the company and reported to NCMEC.
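To make that mechanism concrete, here is a minimal sketch of hash-based matching with a review threshold, written in Python. Apple has not published its hashing algorithm or its threshold, so the hash function (SHA-256 standing in for a perceptual hash), the placeholder hash set, and the threshold value below are all illustrative assumptions.

```python
import hashlib

# Hypothetical stand-in for Apple's perceptual image hash, which is not
# public. A real perceptual hash tolerates resizing and re-encoding;
# SHA-256 does not, so this illustrates only the matching flow.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known images, as would be supplied by NCMEC and partner
# organisations (illustrative placeholder, not real data).
KNOWN_CSAM_HASHES = {image_hash(b"example-known-image")}

MATCH_THRESHOLD = 30  # illustrative; Apple did not disclose the real value

def count_matches(photos: list[bytes]) -> int:
    """Count how many photos in an account match the known-hash database."""
    return sum(1 for photo in photos if image_hash(photo) in KNOWN_CSAM_HASHES)

def should_flag_for_review(photos: list[bytes]) -> bool:
    """Escalate to human review only once matches cross the threshold."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

Note that the system compares codes, not pictures: a photo that is not already in the known-hash database produces no match, however it is classified.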
Privacy Concerns
Experts worry that the technology could be used by authoritarian governments to spy on their citizens, and that it could be extended to scan phones for prohibited content or even political speech.
Highly accurate
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.
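As a rough illustration of that ordering (the match check runs on the device, before upload), the sketch below builds on the hashing example above. Apple’s actual protocol wraps the match result in an encrypted “safety voucher” using private set intersection and threshold secret sharing, none of which is reproduced here; `Upload`, `make_voucher`, and `prepare_upload` are hypothetical names.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    image: bytes
    safety_voucher: bytes  # in Apple's design, an encrypted match result

def make_voucher(matched: bool) -> bytes:
    # Placeholder only: Apple's real vouchers are cryptographically
    # protected so the server learns nothing below the threshold.
    return b"match" if matched else b"no-match"

def prepare_upload(image: bytes) -> Upload:
    # The comparison against known hashes happens on the device,
    # before the photo is stored in iCloud Photos.
    matched = image_hash(image) in KNOWN_CSAM_HASHES  # from the sketch above
    return Upload(image=image, safety_voucher=make_voucher(matched))
```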
The company claimed the system had an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”.
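That figure cannot be verified from the announcement alone, but a short calculation shows why a match threshold pushes the account-level false-flag rate far below the per-image rate. The per-image error rate, library size, and threshold below are assumptions chosen purely for illustration; Apple has published none of these numbers.

```python
from math import exp, lgamma, log, log1p

p = 1e-6        # assumed per-image false-match probability
n = 10_000      # assumed number of photos in the account
threshold = 30  # assumed matches required before review

def log_binom_pmf(k: int, n: int, p: float) -> float:
    """Log of the binomial probability C(n, k) * p^k * (1-p)^(n-k),
    computed in log space to avoid overflow and underflow."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log1p(-p))

# Tail probability P(at least `threshold` false matches among n photos).
# Terms shrink so fast that a short partial sum captures the total.
p_flag = sum(exp(log_binom_pmf(k, n, p)) for k in range(threshold, threshold + 60))
print(f"P(account falsely flagged) ~ {p_flag:.2e}")  # ~4e-93 for these inputs
```

Even with these made-up inputs, requiring dozens of independent false matches before review drives the account-level probability many orders of magnitude below one in a trillion, which is the general effect the threshold is designed to achieve.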
Apple says it will manually review each report to confirm there is a match. It can then disable the user’s account and file a report with law enforcement.
The company says that the new technology offers “significant” privacy benefits over existing techniques – as Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.