Sunday, January 16, 2022

To stop Apple's CSAM scanning, more than 5,000 people signed a petition

Apple recently announced that it will soon roll out a new feature that scans photos stored on the iPhone for media that could depict child abuse. In theory, that sounds like a good thing, and Apple using its reach and influence to help put an end to child abuse sounds great.
Since then, however, many have called it a double-edged sword: the technology could potentially be used by governments to crack down on political dissidents and persecuted groups. To many, the potential for abuse appears to outweigh the potential benefits, with more than 5,000 people (at the time of publication) signing a petition urging Apple to abandon its CSAM scanning plans.

In an open letter signed by security and privacy professionals, researchers, educators, legal experts, and others, the signatories ask Apple to reconsider the introduction of this technology.

In the past, we’ve seen backlash force companies to withdraw or change their policies. But it remains to be seen whether Apple will stop its CSAM scanning or rethink its plans.

