Friday, April 19, 2024

Apple has been scanning for CSAM since at least 2019

Recently, Apple announced that it would introduce a CSAM photo-scanning feature for iCloud Photos. While it's good that Apple is using its influence and capabilities to try to stop such activity, many believe the system could be abused by governments to target material unrelated to CSAM.
There are also some who feel that, even if they have nothing to hide, the feature is still an invasion of privacy. It's worth noting, however, that this isn't Apple's first attempt at CSAM scanning: Apple confirmed to 9to5Mac in an exclusive report that it had been scanning iCloud Mail for CSAM material since at least 2019.

An archived version of Apple's child safety page reads: "We've built robust safeguards at every level of our software platform and across our supply chain. As part of this commitment, Apple employs image-matching technology to detect and report child exploitation. Our technology uses electronic signatures to identify suspected child exploitation, similar to spam filters in email."

As 9to5Mac notes, iCloud email isn't end-to-end encrypted, so Apple has little trouble scanning messages as they pass through its servers. So if you're worried about the upcoming feature, you should know that Apple has already been doing something similar to some extent.
