According to The Guardian newspaper, the American tech giant is also moving, for the first time, to monitor the content of messages sent through end-to-end encryption.
Rights groups welcome this move, which seeks to protect children from exploitation, but they also express concern that it could violate users’ privacy.
When the technology, called “Neural Match”, is launched, photos will be scanned before they are saved to the iCloud storage platform.
If this automated check flags a strong match between an image awaiting storage and the images of child abuse in the database, Apple is then entitled to have its staff review the matter.
If the material is confirmed to be child pornography, Apple suspends the user’s account and notifies the US National Center for Missing and Exploited Children.
Apple will rely solely on comparisons with the images in the database of the US National Center for Missing and Exploited Children.
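To make the reported flow concrete, the sketch below outlines the sequence the article describes: scanning before iCloud upload, flagging matches against the NCMEC-supplied list for human review, and then suspension and reporting. The function names, and the use of a cryptographic hash as a stand-in fingerprint, are assumptions for illustration only, not Apple’s actual implementation.

```python
import hashlib

# Fingerprints supplied by NCMEC would populate this set; empty placeholder here.
KNOWN_ABUSE_FINGERPRINTS: set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for the 'mathematical fingerprint' step.

    A cryptographic hash is used here only to keep the sketch runnable;
    the real system would use a perceptual hash so that re-encoded or
    resized copies of an image still map to the same fingerprint.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def check_before_icloud_upload(image_bytes: bytes) -> str:
    """Scan a photo before it is saved to iCloud, as the article describes."""
    if fingerprint(image_bytes) in KNOWN_ABUSE_FINGERPRINTS:
        return "flag_for_human_review"  # a match is escalated to Apple staff
    return "allow_upload"


def after_human_review(confirmed_abuse: bool) -> list[str]:
    """Confirmed material leads to account suspension and a report to NCMEC."""
    return ["suspend_account", "report_to_ncmec"] if confirmed_abuse else []
```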
Experts say the technology is accurate because it relies on a database held by an official institution; parents who take pictures of their young children in the shower therefore need not worry that those pictures will be treated as pornographic content.
The automated system does not look at the images directly; instead, it relies on mathematical fingerprints representing each image’s content in order to detect any match.
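As an illustration of what a fingerprint-based comparison can look like, here is a minimal sketch using a simple “average hash” and the Pillow library for image loading. The algorithm, the threshold, and the function names are illustrative assumptions; this is not the fingerprinting scheme Apple actually uses.

```python
from PIL import Image  # Pillow, used here only to load and downscale images


def average_hash(path: str, size: int = 8) -> int:
    """Toy 'mathematical fingerprint': a 64-bit average hash of a grayscale thumbnail."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def is_match(a: int, b: int, threshold: int = 5) -> bool:
    """Treat two images as the same content if their fingerprints differ
    in only a few bits (the threshold is an illustrative value)."""
    return hamming_distance(a, b) <= threshold
```

Because the comparison happens between such fingerprints rather than between the raw images, the system can recognise known material without anyone viewing the photos directly.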
But the problem, according to activists, is that a system built to protect children could be harnessed to harm users.
Matthew Green, a cryptography researcher at Johns Hopkins University, explains that the method could be used to harm innocent people: an attacker could send seemingly innocuous images crafted so that, when examined by Apple’s system, they register as matches with pornographic images.
In addition to the “Neural Match” technology, Apple is moving to check users’ encrypted messages on the iMessage platform in order to automatically detect pornographic images, allowing parents to enable a feature that automatically deletes this harmful content from their children’s phones.