Apple to add features to iPhones, iPads to detect child sexual abuse content

Just Earth News | @justearthnews | 08 Aug 2021, 01:26 pm



Apple has said it will launch software that will search devices used in the US for matches of known child sexual abuse material (CSAM) images and then report such findings to the relevant authorities.

The company also announced a feature that will analyse photos sent to or received by children in the Messages app to determine whether a photo is sexually explicit.

The company is also adding features to its Siri digital voice assistant that will intervene when people search for child abuse material on Apple devices. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partner organisations to get help with the issue.

The company said Thursday that it had previewed the three features, adding that the updates to Siri and Search are coming later this year in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

If Apple's system detects a match, it will be reviewed by a human and then reported to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement agencies.

The new versions of iOS and iPadOS to be released later this year will include "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy", Apple said.

Apple is using NeuralHash, image-analysis software that converts each image into a hash key, or unique set of numbers, which is then matched against the database using cryptography.
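As a rough illustration of hash-based matching of this kind (not Apple's actual implementation, which relies on its proprietary NeuralHash algorithm and cryptographic matching protocols described in its white paper), the Python sketch below hashes an image and checks it against a set of known hashes. Here compute_neural_hash and KNOWN_CSAM_HASHES are hypothetical placeholders; a real perceptual hash would match visually similar images, whereas this stand-in only matches identical files.

```python
# Minimal sketch of hash-based matching, assuming a hypothetical hash
# function and database. This is NOT Apple's NeuralHash implementation.

import hashlib

# Hypothetical database of hashes of known CSAM images, as provided to
# Apple by child-safety organisations (placeholder values only).
KNOWN_CSAM_HASHES: set[str] = {
    "placeholder_hash_value_1",
    "placeholder_hash_value_2",
}

def compute_neural_hash(image_bytes: bytes) -> str:
    """Placeholder for a perceptual hash.

    The real NeuralHash derives the hash from image content, so resized or
    re-encoded copies still match; this stand-in simply hashes raw bytes.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    return compute_neural_hash(image_bytes) in KNOWN_CSAM_HASHES
```

In Apple's described system, the comparison is done with cryptographic techniques so that neither the device nor Apple learns anything about images whose hashes do not match the database.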

Apple said the process is not capable of learning about images that do not match the database, and that user privacy is protected because the system has an error rate of "less than one in 1 trillion" per year.

"Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account," the company said in a statement. "Even in these cases, Apple only learns about images that match known CSAM."

If a user feels that their account has been flagged by mistake, they can file an appeal, the company said.

To address privacy concerns about the new features, Apple published a white paper explaining the technology, along with third-party analyses of the protocol from several researchers.