Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers’ devices.

Before an image is saved to iCloud Photos, the technology looks for matches against known CSAM.

Apple said that if a match is found, a human reviewer will assess the report and, if the match is confirmed, pass it on to law enforcement.

Critics warn, however, that the same approach could be extended to scan phones for other banned content, or even political speech.

Experts worry that authoritarian governments may use technology to monitor their citizens.

Apple said that the new versions of iOS and iPadOS, due to be released later this year, "will include new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy."

The system works by comparing images with a database of known child sexual abuse images compiled by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.

These images are translated into "hashes", numerical codes that can be matched against images on an Apple device. According to Apple, the technology will also catch edited but visually similar versions of the original images.
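The matching step can be sketched in a few lines of Python. This is an illustration only: Apple's actual system uses a perceptual hash (so edited but visually similar images still match) combined with on-device cryptographic protocols, not the plain cryptographic hash shown here, and the database below is hypothetical.

```python
import hashlib

# Hypothetical database of known-image hashes (a stand-in for the
# hash list compiled by NCMEC and other child safety organizations).
known_hashes = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_hash(image_bytes: bytes) -> str:
    # Stand-in hash function; the real system uses a perceptual hash
    # rather than SHA-256, so that minor edits do not change the code.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    # On-device check: compare the image's hash against the known list.
    return image_hash(image_bytes) in known_hashes
```

The key point the sketch captures is that only compact hash codes are compared, not the images themselves; a match can be detected without the checking party ever seeing the photo's content.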


‘High level of accuracy and precision’

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.

The company stated that the system has “an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”

Apple said it will manually review each report to confirm a match, then disable the user’s account and report it to law enforcement.

The company said the new technology offers “significant” privacy benefits over existing techniques, because Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.

However some privacy experts have voiced concerns.

“Regardless of what Apple’s long-term plans are, they have sent a very clear signal: in their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” said Johns Hopkins University researcher Matthew Green.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”

Jayesh Shewale
