Apple is delaying its plan to scan iPhones and iPads for known child sexual abuse images following backlash from customers and privacy advocates, the tech giant said on Friday.
Last month, the company announced a feature aimed at flagging child sexual abuse images that users store in iCloud. Apple did not say how long the delay will last.
“Earlier, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” the company said in a statement.
“Based on feedback from customers, stakeholders, researchers and others, we have decided to take additional time in the coming months to gather input and make improvements before introducing this important child safety feature.”
The system is designed to search for images that match those in libraries that have been collected by law enforcement agencies to detect and track the spread of child abuse material on the internet.
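At its core, a system like this checks whether a digest of a user's image appears in a set of digests derived from a known-image library. The sketch below is a deliberately simplified illustration of that matching step: Apple's actual design uses a perceptual hash (NeuralHash) and on-device cryptographic protocols, whereas this example uses an ordinary SHA-256 over raw bytes, and the `KNOWN_DIGESTS` set is a hypothetical placeholder.

```python
import hashlib

# Hypothetical set of digests for known images. Real systems use
# perceptual hashes that survive resizing and re-encoding; a plain
# cryptographic hash is used here only to illustrate set membership.
KNOWN_DIGESTS = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_library(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-image set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_DIGESTS

print(matches_known_library(b"example-known-image-bytes"))  # True
print(matches_known_library(b"some-other-image"))           # False
```

Because an exact hash changes completely if even one byte of the file changes, production systems replace the cryptographic hash with a perceptual one so that near-duplicates of a known image still match.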
Child Rescue Coalition Statement
“We highly value privacy and want to avoid any form of government mass surveillance, but failing children and child sexual abuse survivors by refusing to search for known abuse videos and images, for fear of a future that may never come, seems completely wrong to us,” said Glenn Pounder, chief operating officer of the Child Rescue Coalition, a nonprofit that develops software to help law enforcement protect children from sexual abuse.
Nelson O. Bunn Jr., executive director of the National District Attorneys Association, criticized privacy advocates who, in his opinion, have failed to explain how child protection and criminal prosecution are irreconcilable with Apple customers’ privacy concerns.
Hany Farid, who developed Microsoft’s PhotoDNA tool in 2009 to help police and technology companies find and remove known child sexual exploitation images, is critical of Apple’s decision to focus on images rather than video, which accounts for the bulk of this abusive material.
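Tools like PhotoDNA rely on perceptual hashing: slightly altered copies of an image produce hashes that are close, so matching compares Hamming distance rather than exact equality. PhotoDNA's actual algorithm is proprietary; the sketch below instead uses a textbook "average hash" over a hardcoded 8x8 grayscale grid purely to show the idea.

```python
# Minimal average-hash ("aHash") sketch: 1 bit per pixel, set when the
# pixel is brighter than the image mean. Similar images yield hashes
# that differ in only a few bits. This is NOT PhotoDNA's algorithm.

def average_hash(pixels):
    """Hash an 8x8 grayscale grid into a 64-bit integer."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(h1, h2):
    """Count the bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")

# A synthetic 8x8 "image" and a copy with one pixel blown out.
original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
modified = [row[:] for row in original]
modified[0][0] = 255

d = hamming_distance(average_hash(original), average_hash(modified))
print(d)  # → 6 of 64 bits differ: close enough to flag as the same image
```

An exact cryptographic hash of the modified file would share nothing with the original's, which is why near-duplicate detection needs this distance-based comparison.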
“They are 10 years late to the game, at best they have solved less than half the problem, and you can sidestep the technology very quickly simply by not storing the 30 or so illegal images in iCloud,” Farid said.