Apple to scan iPhones for child sex abuse pics, experts see privacy concerns

Though Apple says user privacy will remain the company's chief concern, experts remain worried.


Amid a rising number of child abuse cases, Apple will use new technology to scan images uploaded to iCloud Photos for matches against known child sexual abuse material (CSAM).

According to the company, if a match is found, a human reviewer will assess the photograph and, if confirmed, report the user to law enforcement.


This, however, raises privacy concerns: the technology could later be expanded to scan for other prohibited content or even political speech. Experts warn that governments could also use it to spy on citizens.

The upcoming versions of iOS and iPadOS will ship with this feature built in, which Apple says will help limit the spread of CSAM online while keeping user privacy in mind.

The US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations will provide known child sexual abuse images, which will be compiled into a database and used for comparison against photographs uploaded to iCloud.

Numerical codes (hashes) derived from the compiled pictures will be matched against images in iCloud; Apple further stated that the system will also be able to identify edited versions of an original photograph.

The company said, “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes."

Further adding, "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account".

Apple said that any flagged image will be reviewed manually by a person to confirm the match; once confirmed, the company will disable the user's account and report it to law enforcement.


Despite Apple's assurances that user privacy will remain its chief concern, security experts remain skeptical.

Matthew Green, a security researcher at Johns Hopkins University, said, "Regardless of what Apple's long-term plans are, they've sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content."

Further adding, "Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone."
