Opinion
Will Apple’s image-scan plan protect children or just threaten privacy?
John Naughton
The tech giant says its iCloud security update is designed to help weed out images of child abuse, but activists have voiced concerns
Sat 14 Aug 2021 16.00 BST
Once upon a time, updates of computer operating systems were of interest only to geeks. No longer – at least in relation to Apple’s operating systems, iOS and Mac OS. You may recall how Version 14.5 of iOS, which required users to opt in to tracking, had the online advertising racketeers in a tizzy while their stout ally, Facebook, stood up for them. Now, the forthcoming version of iOS has libertarians, privacy campaigners and “thin-end-of-the-wedge” worriers in a spin.
It also has busy mainstream journalists struggling to find headline-friendly summaries of what Apple has in store for us. “Apple is prying into iPhones to find sexual predators, but privacy activists worry governments could weaponise the feature” was how the venerable Washington Post initially reported it. This was, to put it politely, a trifle misleading and the first three paragraphs below the headline were, as John Gruber brusquely pointed out, plain wrong.
To be fair to the Post though, we should acknowledge that there is no single-sentence formulation that accurately captures the scope of what Apple has in mind. The truth is that it’s complicated; worse still, it involves cryptography, a topic guaranteed to lead anyone to check for the nearest exit. And it concerns child sexual abuse images, which are (rightly) one of the most controversial topics in the online world.
A good place to start, therefore, is with Apple’s explanation of what it’s trying to do. Basically: three things. The first is to provide tools to help parents manage their children’s messaging activity. (Yes, there are families rich enough to give everyone an iPhone!) The iMessage app on children’s phones will use its built-in machine-learning capability to warn of inappropriate content and alert their parents. Second, the updated operating systems will use cryptographic tools to limit the spread of CSAM (child sexual abuse material) on Apple’s iCloud storage service while still preserving user privacy. (If this sounds like squaring a circle, then stay tuned.) And third, Apple is providing updates to Siri and search to help parents and children if they encounter unsafe material. This third change seems relatively straightforward. It’s the other two that have generated the most heat.
The first change is controversial because it involves stuff happening on people’s iPhones. Well, actually, on phones used by children in a shared family account. If the machine-learning algorithm detects a dodgy message, the photo will be blurred and accompanied by a message warning the user that if s/he does view it then their parents will be notified. The same applies if the child attempts to send a sexually explicit photograph.
But how does the system know if an image is sexually explicit? It seems to do it by seeing if it matches images in a database maintained by the US National Center for Missing and Exploited Children (NCMEC). Every image on that grim database has a unique cryptographic signature – an incomprehensibly long number – in other words, the kind of thing that computers are uniquely good at reading. This is the way photographs on iCloud are going to be scanned: not by attempting to analyse the image per se, but just by checking its crypto-signature. So Apple’s innovation is to do it “client-side” (as tech jargon puts it), checking on the device as well as in the cloud.
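To illustrate the principle only – Apple’s actual system uses a perceptual hash it calls NeuralHash, plus further cryptographic machinery, not a plain digest – a minimal Python sketch of signature matching against a known database might look like this. The byte strings and the database contents here are entirely made up:

```python
import hashlib

# Hypothetical database of known signatures (hex digests of flagged images).
# In the real system this list would come from NCMEC; these values are invented.
known_signatures = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def signature(image_bytes: bytes) -> str:
    """Derive a fixed-length signature from the raw bytes of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Client-side check: compare only the signature, never the image content."""
    return signature(image_bytes) in known_signatures

print(matches_database(b"example-flagged-image-bytes"))  # True
print(matches_database(b"an-ordinary-holiday-photo"))    # False
```

The point of the sketch is that the device never needs to “look at” the photograph: it compares one long number against a list of long numbers. (A real perceptual hash, unlike SHA-256 here, is designed so that slightly altered copies of the same image still produce matching signatures.)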
It’s this innovation that has rung most alarm bells among those concerned about privacy and civil rights, who see it as undermining what has hitherto been an impressive feature of iMessage – its end-to-end encryption. The Electronic Frontier Foundation, for example, views it as a potential “back door”. “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” it warns. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses. That’s not a slippery slope – that’s a fully built system just waiting for external pressure to make the slightest change.”
Before getting too steamed up about it, here are a few things worth bearing in mind. You don’t have to use iCloud for photographs. And while Apple will doubtless try to claim the moral high ground – as usual – it’s worth noting that it has to date seemed relatively relaxed about what was on iCloud. The NCMEC reports, for example, that in 2020 Facebook reported 20.3m images to it, while Apple reported only 265. So could its brave new update be just about playing catch-up? Or a pre-emptive strike against forthcoming requirements for reporting by the UK and the EU? As the Bible might put it, corporations move in mysterious ways, their wonders to perform.
What I’ve been reading
Vaclav Smil: We Must Leave Growth Behind is the transcript of an interview by David Wallace-Wells recorded after the publication of Smil’s magisterial book on growth.
Surely We Can Do Better Than Elon Musk is a fabulous long read by Nathan J Robinson on the Current Affairs site.
Teenage Loneliness and the Smartphone is a sombre New York Times essay by Jonathan Haidt and Jean Twenge, who have spent years studying the effect of smartphones and social media on our daily lives and mental health.