Apple to scan iCloud photos for child abuse images, starting in the United States.
The Financial Times reported that Apple is planning to scan photos stored on iPhones and in iCloud to detect child abuse images. The Verge noted that the new system could help in criminal investigations and law enforcement, but that it could also increase government pressure for access to user data.
Apple calls this system neuralMatch. neuralMatch will alert a human review team if an illegal child abuse image is detected, and the team would then contact law enforcement agencies if the material is verified as child abuse.
Apple trained the neuralMatch algorithm using 200,000 images from the National Center for Missing & Exploited Children. The system will hash photos and compare the hashes against a database of known images of missing and exploited children.
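The hash-and-compare step can be sketched roughly as follows. This is a simplified illustration only: the real system reportedly uses a perceptual neural hash resilient to cropping and resizing, while this sketch uses an ordinary SHA-256 digest, and the hash database here is a made-up placeholder.

```python
import hashlib

# Placeholder database of known-image hashes (illustrative only; the
# real database would hold perceptual hashes, not SHA-256 digests).
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_image(data: bytes) -> str:
    """Compute a hex digest for the raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Return True if the image's hash appears in the known-image database."""
    return hash_image(data) in KNOWN_HASHES
```

The key design point is that only hashes are compared, so the matching service never needs to look at the photo content itself unless a match is flagged.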
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
– The Financial Times
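The voucher-and-threshold scheme described above can be sketched as a simple per-account counter. This is a hypothetical illustration: the `THRESHOLD` value is a placeholder (the report does not state the actual number), and the real design reportedly uses cryptographic techniques so that Apple learns nothing until the threshold is crossed.

```python
from dataclasses import dataclass

THRESHOLD = 30  # placeholder value; the actual threshold is not public

@dataclass
class Account:
    """Tracks safety vouchers for one iCloud account (illustrative sketch)."""
    suspect_vouchers: int = 0

    def record_voucher(self, is_suspect: bool) -> bool:
        """Record one upload's voucher; return True once the account
        has accumulated enough suspect vouchers to trigger review."""
        if is_suspect:
            self.suspect_vouchers += 1
        return self.suspect_vouchers >= THRESHOLD
```

A threshold like this means a single false-positive match cannot expose an account; review is only triggered after repeated suspect uploads.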
The system is seen as one approach to tackling child abuse, but it has also raised security concerns. Cryptographer Matthew Green of Johns Hopkins University voiced concerns about neuralMatch: “This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?”
The Financial Times reports that Apple is likely to share more details about neuralMatch this week, having already briefed two security researchers. Apple has long emphasized the privacy protections built into its devices; some time ago it stood against the FBI when the agency wanted Apple to create a backdoor into an iOS device linked to a criminal case. Let’s see what happens when Apple shares more details and addresses the concerns that have been raised.