
Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System


Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material (CSAM), according to a report from The New York Times.


Filed in Northern California on Saturday, the lawsuit was brought on behalf of a potential group of 2,680 victims and alleges that Apple's failure to implement its previously announced child safety tools has allowed harmful content to continue circulating, causing ongoing harm to victims.

In 2021, Apple announced plans to implement CSAM detection in iCloud Photos, alongside other child safety features. However, the company faced significant backlash from privacy advocates, security researchers, and policy groups who argued the technology could create potential backdoors for government surveillance. Apple subsequently postponed and later abandoned the initiative.

Explaining its decision at the time, Apple said that implementing universal scanning of users’ private iCloud storage would introduce major security vulnerabilities that malicious actors could potentially exploit. Apple also expressed concerns that such a system could establish a problematic precedent, in that once content scanning infrastructure exists for one purpose, it could face pressure to expand into broader surveillance applications across different types of content and messaging platforms, including those that use encryption.

The lead plaintiff in the lawsuit, filing under a pseudonym, said she continues to receive law enforcement notices about individuals being charged with possessing abuse images of her from when she was an infant. The lawsuit argues that Apple’s decision not to proceed with its announced safety measures has forced victims to repeatedly relive their trauma.

In response to the lawsuit, Apple spokesperson Fred Sainz underlined the company’s commitment to fighting child exploitation, stating that Apple is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.” Apple pointed to existing features like Communication Safety, which warns children about potentially inappropriate content, as examples of its ongoing child protection efforts.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.
