Privacy is: not actually very private

Privacy is iPhone. Or so Apple would have you believe in their recent advertising campaigns. Let’s put aside the faintly ridiculous notion that an electronic device can embody privacy, and instead look at some recent news from Apple that calls that claim further into question.

Apple has just announced a feature that scans photos on your iPhone as they are uploaded to iCloud, matching them against a database of known child sexual abuse material, alongside a machine-learning feature that flags sexually explicit images in children’s Messages accounts. Yes, your private data will be sifted through, on your own device.
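
To make the mechanism concrete, here is a minimal sketch of what on-device hash matching against a blocklist can look like. It is purely illustrative: the toy average-hash function, the threshold value, and the placeholder blocklist are my own assumptions, not Apple’s actual NeuralHash or its private matching protocol.

```python
# A minimal, illustrative sketch of threshold-based perceptual-hash
# matching. Apple's real system uses a neural network ("NeuralHash")
# and cryptographic private set intersection; none of that is
# reproduced here.

def average_hash(pixels: list[list[int]]) -> int:
    """Toy 'average hash': each bit records whether a pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(photo_hash: int, blocklist: set[int], threshold: int = 4) -> bool:
    """Flag the photo if its hash is 'close enough' to any known hash.
    The threshold trades recall against false positives: raise it and
    innocent photos start colliding with the blocklist."""
    return any(hamming(photo_hash, known) <= threshold for known in blocklist)

# Illustrative use: an 8x8 grayscale thumbnail (values 0-255).
photo = [[(x * y) % 256 for x in range(8)] for y in range(8)]
known_bad_hashes = {0x0F0F0F0F0F0F0F0F}  # placeholder values only

if matches_blocklist(average_hash(photo), known_bad_hashes):
    print("flagged for human review")
else:
    print("no match")
```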

That’s not to say the motivation isn’t a worthy one. Child abuse is a despicable crime, and one we can never do enough to combat. The issue here, however, is the ‘how’ of the method, and the precedent and path it leads us down.

As Nadim Kobeissi, a cryptographer and founder of the Paris-based cryptography software firm Symbolic Software, said:

“I’m not defending child abuse. But this whole idea that your personal device is constantly locally scanning and monitoring you based on some criteria for objectionable content and conditionally reporting it to the authorities is a very, very slippery slope.”

It raises a fundamental question: should we give up all right to privacy in order to prevent all crime? That sounds rather a lot like East Germany after the Second World War. Except here it’s not even the government doing the surveilling.

An oft-raised point about machine learning and data surfaces again here: what if the machine is wrong? Apple’s iCloud scan matches perceptual hashes of known abuse imagery rather than judging new photos, but perceptual hashes can collide, and the Messages feature does rely on a classifier. How many of you have shared photos of your own children with your family, in a totally innocent context, where the child might be naked? You are now potentially at the mercy of an Apple algorithm deciding whether to flag you as a paedophile.
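
The worry is easy to demonstrate. Perceptual hashes are deliberately tolerant of small changes, which means two unrelated images can land within the matching threshold. A hedged sketch, reusing the toy Hamming-distance logic from above with entirely made-up bit patterns:

```python
# Illustrative only: two made-up 64-bit hashes that differ in just
# 3 bits. With a matching threshold of 4 (as in the sketch above),
# the second image is flagged even though it is a different photo.

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

known_bad = 0xDEADBEEFCAFEF00D    # placeholder "known CSAM" hash
family_photo = known_bad ^ 0b1011 # a different image whose hash
                                  # happens to differ by only 3 bits

THRESHOLD = 4
if hamming(family_photo, known_bad) <= THRESHOLD:
    print("false positive: innocent photo flagged")
```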

Benny Pinkas, a cryptographer at Israel’s Bar-Ilan University who reviewed Apple’s system, believes it is a good design that keeps “false positives to a minimum”. I’m not sure I’m comfortable with that level of certainty when the stakes are so high.

Currently the system is US-only, tied to iCloud Photos being enabled, and limited to the use-cases above. But what precedent does this set for what privacy really means? Should you give up the right to privacy upon committing a nefarious act, or, as in this case, before that act has even been committed? And should the algorithms of a supposedly privacy-focussed big tech firm be responsible for judging you?

Let us know your thoughts below. Will you be keeping your iPhone?

Further reading

  1. https://www.wired.com/story/apple-csam-detection-icloud-photos-encryption-privacy/