‘Technology to automatically scan iPhones threatens democracy’, researchers warn

Client-side scanning has been called “dangerous technology” by fourteen of the world’s most respected information security experts. The cryptographers and engineers, whose careers have laid the groundwork for the internet’s fundamental security protocols, have authored a paper titled “Bugs in Our Pockets” to voice their concerns.

According to these experts, introducing the technology recently proposed by Apple as a method of pre-emptively scanning all iPhones for child sexual abuse material (CSAM) “would be an extremely dangerous societal experiment”, one that could hand governments around the world sweeping surveillance powers over the general public.

What is the technology?

Client-side scanning (CSS) makes it possible to search a personal device for particular files without those files having to be shared with anyone else, as happens with server-side scanning. It is intended to protect the user’s privacy by preventing other people from seeing innocent images, but the experts argue these protections are not guaranteed.
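At its simplest, the idea can be sketched in a few lines of code: the device hashes its own files and compares them against a list of hashes of known targeted content, so that only the fact of a match, rather than the files themselves, ever leaves the phone. The sketch below is purely illustrative and not Apple’s actual protocol; it uses exact SHA-256 digests and a made-up hash list and folder path, whereas real deployments rely on perceptual image hashing (so resized or re-encoded copies still match) and cryptographic techniques to hide match results from the device itself.

```python
import hashlib
from pathlib import Path

# Hypothetical list of hashes of "targeted content" distributed to the device.
# Real systems use perceptual hashes of known CSAM, not exact SHA-256 digests.
TARGETED_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def scan_device(photo_dir: str) -> list[str]:
    """Hash each local file and report those matching the targeted list,
    without uploading any file contents anywhere."""
    matches = []
    for path in Path(photo_dir).glob("*"):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in TARGETED_HASHES:
            # Only the fact of a match would be reported off the device.
            matches.append(str(path))
    return matches

if __name__ == "__main__":
    print(scan_device("/tmp/photos"))  # illustrative path
```

The researchers’ point is that once this scanning machinery is installed on every phone, the only thing restricting what it looks for is the hash list it is given.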

CSS was the model proposed by Apple for its “critically important child safety features” which were to be launched in the US later this year before being delayed following concerns and controversy. Apple said its feature was designed with privacy protections that would ensure it was “limited to detecting CSAM stored in iCloud”. The company added that it would refuse government demands to use the system to search for images in other criminal or national security investigations.

But the researchers have warned that with CSS as a design in general, and not just Apple’s plans, “even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope for other purposes”.

Is this about ‘back door’ access for police?

As a system for searching for files on a device, CSS offers tech companies and law enforcement a possible way through the ongoing debate about encryption and public safety. It ostensibly allows users to keep their data private while empowering police to investigate child abuse cases without creating a so-called “back door” that could be abused by criminals – though the researchers warn that CSS itself could still be exploited.

Their main argument is that the introduction of CSS systems “would be much more privacy invasive than previous proposals to weaken encryption” because of the powers it hands to state authorities.

“Rather than reading the content of encrypted communications, CSS gives law enforcement the ability to remotely search not just communications, but information stored on user devices.” Referencing previously suggested solutions to the encryption debate, they wrote: “The proposal to pre-emptively scan all user devices for targeted content is far more insidious than earlier proposals for key escrow and exceptional access.

“Instead of having targeted capabilities such as to wiretap communications with a warrant and to perform forensics on seized devices, the agencies’ direction of travel is the bulk scanning of everyone’s private data, all the time, without warrant or suspicion. That crosses a red line. Is it prudent to deploy extremely powerful surveillance technology that could easily be extended to undermine basic freedoms?” they ask.
