“Privacy. That’s iPhone.” Or so Apple would have you believe in its recent advertising campaigns. Let’s put aside the faintly ridiculous notion that an electronic device can embody privacy, and instead look at some recent news from Apple that appears to further undermine that claim.
Apple has just announced features that will scan photos on your phone: images queued for upload to iCloud Photos are matched on-device against a database of known child sexual abuse material, and a separate machine-learning feature flags sexually explicit images sent to or from children’s accounts in Messages. Yes, your own device will sift through your private data.
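It’s worth pausing on how the photo-matching half actually works, because it isn’t classic image recognition: Apple’s published design computes a perceptual hash (“NeuralHash”) of each photo and compares it against hashes of known abuse imagery, with an account flagged for human review only after roughly 30 matches reportedly accumulate. The sketch below is a toy illustration of that general technique, not Apple’s code; the average-hash function, distance cutoff and example data are all invented for the demo.

```python
# Toy illustration of threshold-based perceptual-hash matching.
# This is NOT Apple's NeuralHash: it uses a simple "average hash" stand-in,
# and every number here (threshold, distance) is an assumption for the demo.

from typing import List, Set

REPORTING_THRESHOLD = 30  # Apple reportedly required ~30 matches before review

def average_hash(pixels: List[List[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def count_matches(photo_hashes: List[int], blocklist: Set[int],
                  max_distance: int = 2) -> int:
    """Count photos whose hash lands 'near' any known-bad hash."""
    return sum(
        1 for h in photo_hashes
        if any(hamming(h, bad) <= max_distance for bad in blocklist)
    )

# A 4x4 grayscale "photo" (values 0-255) standing in for a real image.
photo = [[10, 200, 30, 220],
         [15, 210, 25, 215],
         [12, 205, 28, 225],
         [11, 198, 31, 230]]

blocklist = {average_hash(photo)}  # pretend this image is in the database
matches = count_matches([average_hash(photo)], blocklist)
print(f"matches: {matches}, account flagged: {matches >= REPORTING_THRESHOLD}")
```

The point is that matching happens against a fixed list of known images rather than against whatever the camera sees; the critics’ slippery-slope worry is that the very same pipeline works just as well with any other blocklist.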
That’s not to say the intended reason for doing so isn’t laudable. Child abuse is a despicable crime, and one we can never do enough to combat. The issue here is the ‘how’ of the method: what precedent does it set, and what path does it lead us down?
As Nadim Kobeissi, a cryptographer and founder of the Paris-based cryptography software firm Symbolic Software, said:
“I’m not defending child abuse. But this whole idea that your personal device is constantly locally scanning and monitoring you based on some criteria for objectionable content and conditionally reporting it to the authorities is a very, very slippery slope.”
It raises a fundamental question: should we give up all right to privacy in order to prevent all crime? That sounds rather a lot like East Germany after World War 2. Except here it’s not even the government doing the surveilling.
An oft-raised point about machine learning and data surfaces again here: what if the machine is wrong? How many of you have shared photos of your own children with your family, in a totally innocent context, where the child might be naked? Now you are potentially at the mercy of an Apple algorithm deciding whether to flag you as a paedophile.
Benny Pinkas, a cryptographer at Israel’s Bar-Ilan University who reviewed Apple’s system, believes it’s a good design that keeps “false positives to a minimum”. I’m not sure I’m comfortable with that level of certainty when the stakes are this high.
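To be fair to Pinkas, a match threshold really does crush the odds of an innocent account being flagged, at least on paper; Apple’s own claim was roughly a one-in-a-trillion chance of falsely flagging any given account per year. A back-of-the-envelope binomial model shows why the threshold matters, with the caveat that the photo count and per-image false-match rate below are assumptions, not Apple’s parameters:

```python
# Back-of-the-envelope: why a match threshold suppresses false positives.
# All parameters below are invented for illustration; Apple's real figures
# (beyond its "one in a trillion per account per year" claim) are not public.

from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """log P(exactly k of n photos falsely match), binomial model."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def prob_falsely_flagged(n: int, p: float, threshold: int) -> float:
    """P(at least `threshold` false matches), summed in log space."""
    total = 0.0
    for k in range(threshold, n + 1):
        lp = log_binom_pmf(n, k, p)
        if lp < -745:        # exp() underflows; remaining tail is negligible
            break
        total += exp(lp)
    return total

n = 10_000   # photos in the library (assumed)
p = 1e-6     # per-image false-match rate (assumed)
for t in (1, 10, 30):
    print(f"threshold {t:>2}: P(false flag) = {prob_falsely_flagged(n, p, t):.2e}")
```

Under those made-up numbers the probability collapses from about one in a hundred at a threshold of one to roughly 10⁻⁹³ at thirty. The catch is that the whole calculation hinges on the per-image false-match rate, which is precisely the number nobody outside Apple can verify.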
For now the system is US-only, effectively opt-in (it applies only to photos being uploaded to iCloud Photos), and limited to the use case above. But what precedent does it set for what privacy really means? Should you give up the right to privacy upon committing a nefarious act, or, as in this case, before any act has been committed? And should the algorithms of a supposedly privacy-focussed big tech firm be the ones judging you?
Let us know your thoughts below. Will you be keeping your iPhone?
Going off the advanced digital grid never looked so appealing.
*heads to storage to dig out old analog phone and Polaroid camera*
Sorry, but “keeping false positives to a minimum” might be OK for Covid-19 tests (even if you end up locked in an overpriced quarantine hotel for no reason), but being falsely labelled a serious criminal – no! Minority Report, here we come!
This all comes down to George Orwell’s 1984. Orwell depicted a government controlling what we think, and he wasn’t far off with the concept. I link the state of privacy to this book because, while it’s a bit different, what we think and how we think is fired back at us through the data stored by the big tech giants!
The sooner I get my farm and go off-grid the better!