High Tech

Will scan photos on mobile: Apple's plan applauded, but also seen as disturbing

Written by Anabel

7 August 2021

Apple has unveiled plans to scan iPhones, initially in the US, for images of child sexual abuse. The move was applauded by child protection groups, but security researchers found it troubling, warning that the tool could be misused, including by governments seeking to surveil their citizens.

The tool created to detect child sexual abuse images, called "neuralMatch", will scan images before they are uploaded to iCloud.

If the system finds a match, the image will be reviewed by a human. If child sexual abuse material is confirmed, the user's account will be deactivated and the National Center for Missing and Exploited Children will be notified.

The detection system, which will also work on the company's other devices, will only flag images that are already in the center's database of known child sexual abuse material. Parents who take innocent pictures of a child in the bath should not worry.
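The matching described above can be sketched roughly in Python. Apple's NeuralHash is a perceptual hash, designed to tolerate small edits to an image; the cryptographic hash below is only a stand-in for illustration, and the database contents and function names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 is used here only to
    # illustrate matching; it would not survive resizing or re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images.
known_db = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

def should_flag(image_bytes: bytes) -> bool:
    # Only images whose fingerprint matches the database are flagged;
    # a novel photo, e.g. an innocent family picture, matches nothing.
    return fingerprint(image_bytes) in known_db

print(should_flag(b"known-image-1"))  # matches the database: True
print(should_flag(b"family-photo"))   # not in the database: False
```

This is why only previously catalogued images can trigger a review: the system compares fingerprints against a fixed list rather than judging the content of new photos.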


"Apple's enhanced protection for children has changed the game," said John Clark, president and CEO of the National Center for Missing and Exploited Children, in a statement. "With so many people using Apple products, these new safety measures have the potential to save children."

But some security researchers have raised concerns that there will be abuses, including government surveillance of citizens.

"What happens when the Chinese government says, 'Here is a list we want you to scan'?" Matthew Green, a cryptography researcher at Johns Hopkins University, told Euronews. "Does Apple say no? I hope they say no, but their technology does not guarantee a no."