Apple scans iCloud photos to check for child abuse January 8, 2020 https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/
Oh, I thought that stuff was securely end-to-end encrypted before being uploaded to iCloud. But if Apple can detect even known images of child abuse in iCloud, that's obviously not the case; it's just a complicated way of admitting that their end-to-end encryption isn't secure. And it's not that I'm saying child abuse is OK. It's just that with this vulnerability you're stuck trusting Apple, instead of trusting the encryption methodology.
Search warrant shows how Apple tackles child abuse images on iCloud and email February 11, 2020 https://9to5mac.com/2020/02/11/child-abuse-images/
The core defect is that Apple can examine anything that's supposedly encrypted. It's a backdoor, and a dishonestly concealed one.
I could try to believe them when they claim they use homomorphic encryption, but that doesn't explain how they're able to inspect the images. So, yeah, that sounds like improperly implemented encryption.
Well, I think they're just encrypting everything to both the user's key and their own. But even if they were using homomorphic encryption, that would be just as poor a choice from the user's perspective.
It may be much worse than it initially sounds. You would, presumably, be able to recognise repeated hashes even if you weren't able to decrypt them. Then, using techniques like those used to crack Enigma, where analysts guessed repetitive plaintext ("cribs"), you might be able to recover at least a session key. The point being: if you can obtain a known hash of the "text", you may well be able to guess your way in and widen the break.
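A minimal sketch of that point, under a big assumption: that the stored items carry some deterministic, content-derived value (simulated here with SHA-256; this is not Apple's actual scheme). An observer who cannot read the items can still see which ones repeat, which is exactly the foothold the crib-style guessing above needs.

```python
import hashlib

def deterministic_token(item: bytes) -> str:
    # Stand-in for any deterministic, content-derived value
    # (a hash, or an ECB-style ciphertext block).
    return hashlib.sha256(item).hexdigest()

# The observer never sees the plaintext photos...
uploads = [b"photo_A", b"photo_B", b"photo_A"]
tokens = [deterministic_token(p) for p in uploads]

# ...yet repeats are plainly visible to anyone watching the tokens.
print(tokens[0] == tokens[2])  # True: same photo, same token
print(tokens[0] == tokens[1])  # False: different photos
```

Randomized encryption (a fresh nonce per item) is precisely what prevents this kind of repeat-spotting, which is why deterministic schemes leak more than they appear to.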
Report: Apple to announce client-side photo hashing system to detect child abuse images in users’ photos libraries August 5, 2021 https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning/
Apple says it will begin scanning iCloud Photos for child abuse images August 5, 2021 https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/
My gosh, are there really people stupid enough to put "child porn" pics in their phone's cloud storage? I guess we should be happy to catch these folks, but seriously, such a lack of brain horsepower!!
Not sure if this is something else or if the articles cited above got it wrong -- the scanning involves client-side scanning on the user's phone: "Apple is prying into iPhones to find sexual predators, but privacy activists worry governments could weaponize the feature... Apple unveiled a sweeping new set of software tools Thursday that will scan iPhones and other devices for child pornography and text messages with explicit content and report users suspected of storing illegal pictures on their phones to authorities... The move also raises new questions about the nature of smartphones and who really owns the computers in their pockets. The new software will perform scans on its users’ devices without their knowledge or explicit consent, and potentially put innocent users in legal jeopardy... The software uses a matching technique, where photos stored on iPhones will be scanned and then compared with known child pornography. Before a photo can be uploaded to iCloud, Apple’s online storage service, it will be given a “voucher” ensuring that it is not child pornography..." https://www.washingtonpost.com/technology/2021/08/05/apple-child-pornography-iphone/
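The matching-plus-voucher step quoted above can be pictured like this. A toy sketch only: it assumes an exact SHA-256 match against a blocklist, whereas Apple's announced design uses a perceptual NeuralHash and a private-set-intersection protocol, so only the shape of the idea is shown here. All names are made up for illustration.

```python
import hashlib

# Hypothetical blocklist of known-image digests (illustration only)
BLOCKLIST = {hashlib.sha256(b"known_flagged_image").hexdigest()}

def make_voucher(photo: bytes) -> dict:
    # Before upload, attach a "voucher" recording whether the
    # photo's digest matches an entry on the blocklist.
    digest = hashlib.sha256(photo).hexdigest()
    return {"hash": digest, "flagged": digest in BLOCKLIST}

print(make_voucher(b"vacation_photo")["flagged"])       # False
print(make_voucher(b"known_flagged_image")["flagged"])  # True
```

Note that in this naive form the client learns the match result immediately; part of the complexity of Apple's real protocol exists so that neither the device nor Apple learns about matches until a threshold is crossed.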
I guess the biggest problem is flagging something by mistake, and also that the human review may deem something illegal that is not. I've read reports of this: family pictures etc. being falsely flagged and causing great pain to family members before it was resolved. The other concern, of course, is that this technique can be repurposed for other things.
Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images August 6, 2021 https://www.macrumors.com/2021/08/06/snowden-eff-slam-plan-to-scan-messages-images/
Open letter demanding Apple stop deployment of photo scanning tech gets 5,000 signatures August 9, 2021 https://www.neowin.net/news/open-le...-of-photo-scanning-tech-gets-5000-signatures/
Apple says its new child safety feature will look for images flagged in multiple countries August 14, 2021 https://www.neowin.net/news/apple-s...ook-for-images-flagged-in-multiple-countries/
Corellium Wants to Help Independent Researchers Investigate Apple’s Controversial Photo Scanning Tool August 17, 2021 https://www.iphonehacks.com/2021/08...investigate-apple-photo-scanning-feature.html
Researchers produce collision in Apple’s child-abuse hashing system August 18, 2021 https://www.theverge.com/2021/8/18/...ash-collision-vulnerability-flaw-cryptography
This is why they are pushing everyone to cloud services: so they can inspect your stuff at will in the name of catching criminals. False positives will result in some three-letter agency kicking in your front door and dragging you out, along with all of your devices, until you can prove it was a mistake.
Exactly. Family photos, or photos taken for a medical doctor, e.g. a pediatrician. Somebody could also send you a photo just to make your device snitch on you, much the way swatting is done. This system can be extended, too. And last but not least: who will make sure a random Apple employee doesn't abuse the power?
Policy groups request Apple abandon plans to scan devices for child abuse imagery August 19, 2021 https://www.theverge.com/2021/8/19/22632722/apple-child-abuse-scan-ios-abandon-request-privacy
Apple already scans iCloud Mail for CSAM, but not iCloud Photos August 23, 2021 https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/
Edward Snowden calls Apple CSAM plans 'disaster-in-the-making' August 26, 2021 https://appleinsider.com/articles/2...calls-apple-csam-plans-disaster-in-the-making
AdGuard Blog: Is there a price too high for children's safety? Our comment on Apple's new controversial feature https://adguard.com/en/blog/apple-csam-detection.html