APPLE's new iPhone technology that will scan your phone for pictures of child sex abuse is facing a backlash over privacy concerns.
Thousands of people have signed a petition demanding that the tech giant halt the plans amid worries they could pose a security risk.
Technology experts worry that, under the proposals, the system could be misused – particularly by governments looking to spy on their citizens.
HOW WILL IT AFFECT USERS?
In an open letter to Apple, a group of technology, security and legal experts warned that the scheme threatens to undo decades of work to protect users' privacy from the technology they use.
"Apple's current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases," the letter reads.
"We ask that Apple reconsider its technology rollout, lest it undo that important work."
The writers of the open letter said that, while child exploitation is a serious problem, Apple's proposal "introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products."
They said that the technology "sets a precedent where our personal devices become a radical new tool for invasive surveillance".
They also criticised the project for lacking the oversight needed to prevent eventual abuse and an "unreasonable expansion" of the scope of surveillance.
The group requests that Apple halt the deployment of its monitoring technology immediately and issue a statement reaffirming its commitment to user privacy.
Signatories include major organisations such as The New York Public Library and privacy groups including the Privacy Foundation.
HOW WILL THE TECHNOLOGY WORK?
The tool, called neuralMatch, is designed to detect known images of child sexual abuse and will scan photos before they are uploaded to iCloud.
If the system finds a match, the image will be reviewed by a human.
Once child pornography has been confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will, however, only flag images that are already in the center's database of known child pornography.
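To illustrate the general idea only, the sketch below shows in Python how a simple fingerprint check against a list of known hashes could work: a file is flagged for human review only if its fingerprint already appears on the list. The file paths, hash values and function names here are hypothetical, and Apple's actual neuralMatch system is understood to use perceptual image hashes and on-device matching rather than the exact cryptographic hash used in this simplified example.

```python
# Illustrative sketch only: matching file fingerprints against a database of
# known hashes. This is NOT Apple's implementation; neuralMatch reportedly
# uses perceptual hashing, whereas this toy example uses an exact SHA-256.
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known abuse images (in reality supplied
# by NCMEC; the database never contains the images themselves).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Return a SHA-256 hex digest of the file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_before_upload(path: Path) -> bool:
    """Flag a file for human review only if its fingerprint is already known."""
    if fingerprint(path) in KNOWN_HASHES:
        print(f"{path.name}: match found, queued for human review")
        return True
    print(f"{path.name}: no match, uploaded normally")
    return False

if __name__ == "__main__":
    # Check every JPEG in a hypothetical photo folder before upload.
    for image in Path("photos").glob("*.jpg"):
        check_before_upload(image)
```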
Apple insists: "Any time an account is flagged by the system, Apple conducts human review before making a report to the National Center for Missing and Exploited Children.
"As a result, system errors or attacks will not result in innocent people being reported to NCMEC.
"The feature is designed so that Apple does not get access to [text] messages,"
SPYING CONCERNS
But Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography.
That could fool Apple's algorithm and alert law enforcement, Green said.
He added that researchers have been able to trick such systems pretty easily.
Other abuses could include government surveillance of dissidents or protesters.
"What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology wont say no."
WhatsApp chief Will Cathcart has also chimed in, saying that his messaging app would not be adopting the safety measures and calling Apple's approach "very concerning".
GOVERNMENT PRESSURE
The move has drawn applause from child protection groups.
"Apple's expanded protection for children is a game changer," John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement.
"With so many people using Apple products, these new safety measures have lifesaving potential for children."
Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images.
Apple has used those to scan user files stored in its iCloud service – which is not as securely encrypted as its on-device data – for child pornography.
The company has been under government pressure for years to allow for increased surveillance of encrypted data.
Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.
Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.
"Apple's expanded protection for children is a game changer," John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children.
Meanwhile, the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple's compromise on privacy protections a shocking about-face for users who have relied on the company's leadership in privacy and security.