Apple to Scan iPhones for Child Sex Abuse Images


Before an image is stored in iCloud Photos, Apple will check it against known child sexual abuse material, or CSAM.

Apple says in a recent update that if the system finds a match, a human reviewer will assess it and then report the user to the relevant authorities.

However, there are privacy concerns that the technology could be expanded to scan phones for other restricted content, or even for political speech.

Many experts and ordinary users worry that the technology could be used by government authorities to spy on people.

Furthermore, Apple said that new versions of iOS and iPadOS, due later this year, will include “new applications of cryptography” intended to limit the spread of child sex abuse images while maintaining user privacy.

The system works by comparing photos against a database of known CSAM images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.

The images are translated into “hashes”, numerical codes that can be “matched” against photos stored on an iOS device.

Experts say the technology can also catch edited but visually similar versions of the original photos.
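To make the “edited but similar” point concrete, here is a minimal Swift sketch of how perceptual-hash matching can tolerate small edits. The 64-bit hash values and the threshold are hypothetical, and the style of hash assumed here (aHash/pHash-like) is only a stand-in; it is not Apple’s actual NeuralHash.

```swift
// Minimal sketch: matching by Hamming distance between perceptual hashes.
// The hash values and the threshold below are made up for illustration.

func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    // Number of bit positions where the two hashes differ.
    (a ^ b).nonzeroBitCount
}

let originalHash: UInt64 = 0xD4C3_B2A1_0F1E_2D3C   // hash of the original photo (hypothetical)
let editedHash:   UInt64 = 0xD4C3_B2A1_0F1E_2D38   // hash of a cropped/recompressed copy (hypothetical)

let matchThreshold = 5   // assumed cutoff for "visually the same image"
let isMatch = hammingDistance(originalHash, editedHash) <= matchThreshold
print("Edited copy matches original:", isMatch)   // true in this example
```

Because small edits flip only a few bits of a perceptual hash, images within the threshold are treated as the same picture, which is how edited copies can still be flagged.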

The on-device matching process and its privacy benefits:

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said.
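As a rough illustration of that flow, the sketch below hashes an image’s bytes on the device and checks the result against a local set of known hashes before upload. This is only a schematic: the SHA-256 digest and the empty `knownHashes` set are stand-ins, whereas the real system uses a perceptual NeuralHash and cryptographic safety vouchers rather than a plain set lookup.

```swift
import Foundation
import CryptoKit

// Schematic of the on-device matching step. SHA-256 stands in for Apple's
// perceptual NeuralHash, and a plain Set stands in for the blinded hash
// database that would actually ship with the operating system.

/// Hypothetical hex-encoded hashes of known CSAM images (empty placeholder here).
let knownHashes: Set<String> = []

/// Hex-encode a SHA-256 digest of the raw image bytes.
func hashHex(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the image's hash appears in the known-hash database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(hashHex(of: imageData))
}

// Example: check a photo on disk before it would be uploaded to iCloud Photos.
if let data = FileManager.default.contents(atPath: "/tmp/photo.jpg") {
    print("Matches known database:", matchesKnownImage(data))
}
```

In practice a perceptual hash would replace SHA-256, since a cryptographic hash changes completely under even tiny edits and would miss the edited copies described above.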

Apple also says the system offers an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”.

Apple says it will manually review each report to confirm the match, and may then disable the user’s account and report it to law enforcement agencies.

Apple also says the new technology provides “essential” privacy benefits over existing techniques, because Apple only learns about a user’s photos if they already hold a collection of known CSAM in their iCloud account.

After the announcement, some privacy experts raised concerns.

“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Matthew Green, a security researcher at Johns Hopkins University, said.

“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”

Numbers are already big:

Facebook reported 20 million child sexual abuse images to law enforcement agencies in 2020, according to a report by the National Center for Missing and Exploited Children.

The figure covers both Facebook and Instagram, and is up from 16 million in 2019.
