Apple’s CSAM Woes Continue After Being Accused Of Under-Reporting Incidents

It looks like iPhone maker Apple’s Child Sexual Abuse Material (CSAM) controversy is not going away anytime soon.

The tech giant was just accused by a leading UK-based charity, which says the company is not reporting the full extent of the problem, including the true number of incidents taking place on its platforms.

Much of the dispute comes down to how the world of E2E encryption works. The NSPCC in the UK pointed out two serious discrepancies in Apple's reporting of CSAM incidents.

The first concerns the major gap between the number of cases Apple reports and the figures reported by other tech giants, as detailed in The Guardian.

Apple reported only 267 such cases worldwide on its platforms last year. That figure stands in stark contrast to what the British charity found for the company's industry rivals.

For instance, Google reported a whopping 1.4 million cases over the same period, while Meta reported 30.6 million, according to the annual figures.

The second major discrepancy is that there were more CSAM convictions involving Apple's products and services in England and Wales alone than the number of cases Apple reported worldwide.

To be more specific, between April 2022 and March 2023, 337 recorded offenses involving child abuse images were linked to Apple in England and Wales.

The stark gap between the child abuse crimes arising on Apple's services in Britain and the reports the company files with authorities globally is a serious concern, because it gives the impression that the Cupertino firm may be downplaying the scale of the problem.

The general consensus is that Apple continues to fall behind its rivals in safety and in rolling out best practices to protect minors from such incidents in the first place. Critics also argue it runs counter to the country's Online Safety Act.

While some of the court cases cited by the NSPCC involve CSAM content shared via FaceTime and iMessage, critics say it is about time Apple did something to combat the growing issue.

E2E encryption means the tech giant cannot see the contents of messages, by design, for privacy reasons, and therefore ends up failing to report such cases. Moreover, these cases typically come to light only after offenders are arrested for other reasons and compelled to give access to their devices.

Another point of contention is iCloud. Rival cloud services routinely scan customers' uploads for the digital fingerprints of known CSAM content, but Apple itself does not, and the question is how much longer that can be ignored.
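To illustrate the general idea behind that kind of scanning, here is a minimal sketch of fingerprint matching against a known-hash list. Note this is purely illustrative: real systems use perceptual hashes (such as PhotoDNA-style fingerprints that survive resizing and re-encoding) rather than a plain cryptographic hash, and the function names and sample data here are hypothetical.

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    # Cryptographic hash as a stand-in for the perceptual hashes
    # real scanning systems use; illustrative only.
    return hashlib.sha256(data).hexdigest()

def matches_known_list(data: bytes, known_hashes: set) -> bool:
    # An upload is flagged if its fingerprint appears in a
    # (hypothetical) database of known-content hashes.
    return file_fingerprint(data) in known_hashes

# Hypothetical example: a one-entry "known" database.
known = {file_fingerprint(b"example-known-content")}
print(matches_known_list(b"example-known-content", known))  # True
print(matches_known_list(b"benign-photo-bytes", known))     # False
```

The key design point is that the provider never needs to store the flagged material itself, only the fingerprints, which is why this approach works for cloud uploads but not for content protected by E2E encryption.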

Image: DIW-Aigen
