Last week, Apple announced a new CSAM scanning feature for iOS. The feature will scan photos on iPhones to identify potential child abuse images and report them to the authorities. However, the announcement drew a strongly negative reaction that Apple may not have expected.
Unfortunately, little has changed since then. According to a report, journalists contacted Apple to ask whether the negative feedback might affect its plans. Apparently not: Apple still intends to launch the CSAM scanning feature later this year alongside iOS 15.
In principle, scanning for CSAM images could do real good, especially given the millions of devices Apple has sold to date. That scale gives the feature enormous reach, and it could go a long way toward helping the authorities find people involved in such activities.
However, many privacy and legal experts point to the potential for abuse of such systems. Governments could pressure Apple into eventually expanding the scanning to other types of images.
Apple has tried to allay these concerns, saying the feature will never go beyond CSAM and that it will refuse all other requests. Whether the company sticks to that promise, only time will tell. In the meantime, consumers can expect the feature to arrive by the end of the year.
Related Post: To stop Apple’s CSAM scanning, more than 5,000 people signed a petition