Apple facing lawsuit over dropped child sexual abuse detection features

9 Dec 2024


The company initially planned to crack down on crimes against children in 2021, but backtracked, citing privacy concerns.

Tech multinational Apple is at the centre of a lawsuit brought on behalf of an unnamed plaintiff, accusing the firm of dropping child sexual abuse material (CSAM) detection features and privacy-washing its obligations.

The 27-year-old plaintiff, who has filed under a pseudonym, stated that she was sexually assaulted as an infant by a relative who shared images of the abuse online. She now receives law enforcement notices, almost daily, informing her that someone has been charged with possessing the content.

This is not the first suit of its kind to be brought against Apple. Earlier this year, a minor plaintiff referred to as Jane Doe filed a case with the US District Court for the Northern District of California, stating that she was coerced into making and uploading CSAM on an iPad, with the images and videos stored on iCloud.

The class-action suit claimed that Apple was hiding behind privacy concerns in order to abdicate responsibility towards preventing the storage of CSAM on iCloud. 

In 2021, Apple attempted to introduce measures on iOS that would detect CSAM images, alert guardians about explicit content on a child’s device and notify law enforcement when CSAM was being stored in iCloud, but the plan drew backlash from privacy advocates.

A month after announcing the plans, Apple said it would postpone the introduction of the new features amid concerns that the technology could be used to spy on users and enable government surveillance.

In 2022, the company launched the ‘Communication Safety in Messages’ feature, which alerts children to the dangers of sending or receiving content that includes nudity. The feature blurs the image, warns the child and offers guidance on how to reach out to a trusted adult for help.

The current lawsuit says that 2,680 other people are eligible to join the case, having been affected by Apple’s failure to deploy anti-CSAM protections. It calls on the court to compel Apple to compensate those harmed and to change its practices.

A spokesperson from Apple stated, “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.

“Features like communication safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.”


Updated, 1:57pm, 9 December 2024: This article was amended to include a comment from Apple. 

Laura Varley is the Careers reporter for Silicon Republic

editorial@siliconrepublic.com