A new open letter says the proposed tech to detect child sexual abuse material could create a backdoor for surveillance.
More than 90 civil society, digital rights and human rights organisations around the world have today (19 August) published an open letter calling for Apple to abandon its controversial plans to scan devices for child sexual abuse material (CSAM).
The coalition said that the proposed measures, despite being designed to protect children and limit the spread of CSAM, will “create new risks for children and could be used to censor speech and threaten the privacy and security of people around the world”.
The organisations added that “the breadth of the international coalition joining the letter demonstrates the extent to which Apple’s plans open the door to threats to human rights across the globe”. Signatories include the American Civil Liberties Union and the Electronic Frontier Foundation.
The features, announced earlier this month, include the detection of CSAM-related search queries in Siri and Search, with users directed to information about why such searches are harmful and how they can get help.
More controversially, though, the plans also include using machine learning to detect whether explicit images are being sent or received by users registered on a family account as being under the age of 13. The child will receive a notification warning them that this kind of material can be harmful and that their parents will be alerted if they choose to view or send the image.
The open letter published today noted that “algorithms designed to detect sexually explicit material are notoriously unreliable” and can mistakenly flag art, health information, educational resources and other imagery. It added that the corresponding parental notification feature is open to abuse. “LGBTQ+ youths on family accounts with unsympathetic parents”, it said, are “particularly at risk”.
The part of Apple’s plan that has raised the most concern is more complex. Using cryptographic techniques, images that users upload to iCloud will be checked against a database of known CSAM maintained by the US National Center for Missing and Exploited Children (NCMEC). If the system determines that a collection of CSAM is being stored in a user’s iCloud account, Apple plans to alert the authorities.
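In broad strokes, that matching step amounts to comparing fingerprints of uploaded images against a list of fingerprints of known CSAM, with a report triggered only once a threshold of matches is crossed. The sketch below is a heavily simplified illustration of that idea, not Apple’s actual system: the real design uses a perceptual hash (NeuralHash) and a cryptographic private set intersection protocol, and the hash function, threshold value and database shown here are assumptions made purely for illustration.

```python
import hashlib
from pathlib import Path

# Illustrative only: Apple's published design uses a perceptual hash and
# cryptographic blinding, not a plain SHA-256 comparison like this one.
KNOWN_CSAM_HASHES: set[str] = set()   # hypothetical: fingerprints supplied by NCMEC
MATCH_THRESHOLD = 30                  # hypothetical: report only after N matches


def image_fingerprint(path: Path) -> str:
    """Stand-in for a perceptual hash: here, just a SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def crosses_reporting_threshold(image_paths: list[Path]) -> bool:
    """Return True if the number of database matches reaches the threshold."""
    matches = sum(
        1 for p in image_paths
        if image_fingerprint(p) in KNOWN_CSAM_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

In Apple’s published description, the comparison happens under cryptographic blinding, so neither the device nor Apple learns about individual matches until the threshold is exceeded; the concern raised in the letter is about what else such a matching pipeline could be pointed at, not the specific cryptography.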
The open letter said this could set a precedent for adding image-scanning backdoors to Apple’s systems, and that “governments could compel Apple to extend notification to other accounts” for purposes other than child protection.
Last week, Apple said it would “refuse any such demands” from governments to add non-CSAM images to its new tech process.
But the letter raised concerns that Apple and other tech companies “will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable”.
“Those images may be of human rights abuses, political protests … or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud,” it added.
“Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis.”
Despite the company’s claims that it will always refuse government pressure to aid in repression, The Citizen Lab reported this week that the company bans many political phrases from iPhone engravings in China, Taiwan and Hong Kong.
The letter concluded that while the proliferation of CSAM is undoubtedly a serious issue and that efforts to protect children are laudable, Apple’s plans “put children and its other users at risk, both now and in the future”. The coalition asked the company to abandon the plans and work with civil society groups in future on issues of privacy.
This is the latest in a slate of criticism the tech giant has received over the plans, including from its own employees. Rules passed by the European Parliament in July allowing the detection of child abuse content online drew similar responses.