Apple scans iCloud photos to check for child abuse

Discussion in 'privacy general' started by guest, Jan 9, 2020.

  1. guest

    guest Guest

    Apple scans iCloud photos to check for child abuse
    January 8, 2020
    https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/
     
  2. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    9,252
    Oh, I thought that stuff was securely end-to-end encrypted before being uploaded to iCloud.

    But if Apple can detect even known images of child abuse in iCloud, that's obviously not the case.
    That's just a complicated way of admitting that their end-to-end encryption isn't secure.

    And I'm not saying that child abuse is OK.

    It's just that with this vulnerability, you're stuck trusting Apple instead of trusting the encryption itself.
     
  3. guest

    guest Guest

    Search warrant shows how Apple tackles child abuse images on iCloud and email
    February 11, 2020
    https://9to5mac.com/2020/02/11/child-abuse-images/
     
  4. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    9,252
    The only defect is that Apple can examine anything that's supposedly encrypted.

    It's a backdoor, and a dishonestly concealed one.
     
  5. Nebulus

    Nebulus Registered Member

    Joined:
    Jan 20, 2007
    Posts:
    1,635
    Location:
    European Union
    I could try to believe them when they claim to use homomorphic encryption, but that doesn't explain how they are able to inspect the images. So, yeah, that sounds like improperly implemented encryption.
     
  6. mirimir

    mirimir Registered Member

    Joined:
    Oct 1, 2011
    Posts:
    9,252
    Well, I think that they're just encrypting stuff to both the user's key and theirs.

    But even if they were using homomorphic encryption, that would be just as poor a choice, from the user's perspective.
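    If that's what's happening, it amounts to ordinary hybrid encryption with the content key wrapped to two public keys instead of one. A minimal sketch of that idea in Python, assuming the "cryptography" package (the key names are hypothetical stand-ins, not Apple's actual design):

    Code:
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.fernet import Fernet

    # Hypothetical keys standing in for "the user" and "the provider".
    user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    provider_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Hybrid encryption: a symmetric content key encrypts the photo...
    content_key = Fernet.generate_key()
    ciphertext = Fernet(content_key).encrypt(b"raw photo bytes")

    # ...and that content key is wrapped to *both* public keys.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_for_user = user_key.public_key().encrypt(content_key, oaep)
    wrapped_for_provider = provider_key.public_key().encrypt(content_key, oaep)

    # Either private key can recover the content key and decrypt the photo,
    # which is why this is not end-to-end encryption in any meaningful sense.
    recovered = provider_key.decrypt(wrapped_for_provider, oaep)
    assert Fernet(recovered).decrypt(ciphertext) == b"raw photo bytes"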
     
  7. deBoetie

    deBoetie Registered Member

    Joined:
    Aug 7, 2013
    Posts:
    1,832
    Location:
    UK
    It may be much worse than it initially sounds. You would, presumably, be able to recognise repeated hashes even if you weren't able to decrypt them. Then, using techniques like those used to crack Enigma, where you might guess some repetitive plaintext, you might be able to recover at least a session key.

    Point being, if you can get a known hash of the "text", you may well be able to guess and widen from there.
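    The "recognising repeats" part, at least, is easy to illustrate. A minimal sketch in Python, using plain SHA-256 as a stand-in for whatever identifier the server actually sees:

    Code:
    import hashlib

    # A deterministic hash maps identical inputs to identical values, so an
    # observer who never decrypts anything can still link repeated uploads
    # and match them against a list of hashes of known images.
    def tag(photo_bytes: bytes) -> str:
        return hashlib.sha256(photo_bytes).hexdigest()

    known_tags = {tag(b"some widely circulated image")}  # hypothetical list

    upload_a = tag(b"some widely circulated image")
    upload_b = tag(b"some widely circulated image")

    print(upload_a == upload_b)    # True: repeats are recognisable
    print(upload_a in known_tags)  # True: matches a "known" item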
     
  8. Rasheed187

    Rasheed187 Registered Member

    Joined:
    Jul 10, 2004
    Posts:
    17,546
    Location:
    The Netherlands
    I haven't got any problems with this, if it can be used to stop crime.
     
  9. guest

    guest Guest

    Report: Apple to announce client-side photo hashing system to detect child abuse images in users’ photos libraries
    August 5, 2021
    https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning/
     
  10. guest

    guest Guest

    Apple says it will begin scanning iCloud Photos for child abuse images
    August 5, 2021
    https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning/
     
  11. Palancar

    Palancar Registered Member

    Joined:
    Oct 26, 2011
    Posts:
    2,402
    My gosh, are there really people stupid enough to place "child porn" pics in their phone's cloud storage? I guess we should be happy to catch these folks, but seriously, such a lack of brain horsepower!!
     
  12. ProTruckDriver

    ProTruckDriver Registered Member

    Joined:
    Sep 18, 2008
    Posts:
    1,444
    Location:
    "An Apple a Day, Keeps Microsoft Away"
    Yes.
     
  13. hawki

    hawki Registered Member

    Joined:
    Dec 17, 2008
    Posts:
    6,065
    Location:
    DC Metro Area
    Not sure if this is something else or if the articles cited above got it wrong -- the scanning is actually client-side, performed on the user's phone:

    "Apple is prying into iPhones to find sexual predators, but privacy activists worry governments could weaponize the feature...

    Apple unveiled a sweeping new set of software tools Thursday that will scan iPhones and other devices for child pornography and text messages with explicit content and report users suspected of storing illegal pictures on their phones to authorities...

    The move also raises new questions about the nature of smartphones and who really owns the computers in their pockets. The new software will perform scans on its users’ devices without their knowledge or explicit consent, and potentially put innocent users in legal jeopardy...

    The software uses a matching technique, where photos stored on iPhones will be scanned and then compared with known child pornography. Before a photo can be uploaded to iCloud, Apple’s online storage service, it will be given a “voucher” ensuring that it is not child pornography..."

    https://www.washingtonpost.com/technology/2021/08/05/apple-child-pornography-iphone/
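    As described, the matching happens on the device, before anything is uploaded. A heavily simplified sketch of that flow in Python (the function and field names are hypothetical, and a plain SHA-256 lookup stands in for the reported design, which uses a perceptual hash called NeuralHash plus cryptographic blinding so the device itself is not supposed to learn the match result):

    Code:
    import hashlib

    # Stand-in for the database of hashes of known images shipped to the device.
    KNOWN_HASHES = {"0" * 64}  # hypothetical placeholder entries

    def make_safety_voucher(photo_bytes: bytes) -> dict:
        # Hash the photo on-device and record whether it matches the list.
        digest = hashlib.sha256(photo_bytes).hexdigest()
        return {
            "photo_hash": digest,
            "matches_known_image": digest in KNOWN_HASHES,
        }

    # The voucher is attached before the photo ever leaves the phone,
    # i.e. before it is uploaded to iCloud.
    voucher = make_safety_voucher(b"raw photo bytes")
    print(voucher)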
     
  14. caspian

    caspian Registered Member

    Joined:
    Jun 17, 2007
    Posts:
    2,363
    Location:
    Oz
    I guess the biggest problem is flagging something by mistake, and also that human review may determine something is illegal when it is not. I've read reports of this: family pictures and the like being falsely flagged and causing much pain to family members before it is resolved. The other thing, of course, is that this technique can be used for other purposes.
     
  15. guest

    guest Guest

    Privacy Whistleblower Edward Snowden and EFF Slam Apple's Plans to Scan Messages and iCloud Images
    August 6, 2021
    https://www.macrumors.com/2021/08/06/snowden-eff-slam-plan-to-scan-messages-images/
     
  16. guest

    guest Guest

    Open letter demanding Apple stop deployment of photo scanning tech gets 5,000 signatures
    August 9, 2021
    https://www.neowin.net/news/open-le...-of-photo-scanning-tech-gets-5000-signatures/
     
  17. guest

    guest Guest

    Apple says its new child safety feature will look for images flagged in multiple countries
    August 14, 2021
    https://www.neowin.net/news/apple-s...ook-for-images-flagged-in-multiple-countries/
     
  18. guest

    guest Guest

    Corellium Wants to Help Independent Researchers Investigate Apple’s Controversial Photo Scanning Tool
    August 17, 2021
    https://www.iphonehacks.com/2021/08...investigate-apple-photo-scanning-feature.html
     
  19. guest

    guest Guest

    Researchers produce collision in Apple’s child-abuse hashing system
    August 18, 2021
    https://www.theverge.com/2021/8/18/...ash-collision-vulnerability-flaw-cryptography
     
  20. xxJackxx

    xxJackxx Registered Member

    Joined:
    Oct 23, 2008
    Posts:
    8,625
    Location:
    USA
    This is why they are pushing everyone to cloud services: so they can inspect your stuff at will in the name of catching criminals. False positives will result in some three-letter agency kicking in your front door and dragging you out with all of your devices until you can prove it was a mistake.
     
  21. reasonablePrivacy

    reasonablePrivacy Registered Member

    Joined:
    Oct 7, 2017
    Posts:
    2,002
    Location:
    Member state of European Union
    Exactly. Family photos, photos taken by a medical doctor, e.g. a pediatrician. Somebody may also send you a photo just to make your device snitch on you, in much the same way swatting is performed.
    This system can also be extended.
    And last but not least: who will make sure a random Apple employee doesn't abuse the power?
     
  22. guest

    guest Guest

    Policy groups request Apple abandon plans to scan devices for child abuse imagery
    August 19, 2021
    https://www.theverge.com/2021/8/19/22632722/apple-child-abuse-scan-ios-abandon-request-privacy
     
  23. guest

    guest Guest

    Apple already scans iCloud Mail for CSAM, but not iCloud Photos
    August 23, 2021
    https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/
     
  24. guest

    guest Guest

    Edward Snowden calls Apple CSAM plans 'disaster-in-the-making'
    August 26, 2021
    https://appleinsider.com/articles/2...calls-apple-csam-plans-disaster-in-the-making
     
  25. Gandalf_The_Grey

    Gandalf_The_Grey Registered Member

    Joined:
    Jan 31, 2012
    Posts:
    1,188
    Location:
    The Netherlands
    AdGuard Blog: Is there a price too high for children's safety? Our comment on Apple's new controversial feature
    https://adguard.com/en/blog/apple-csam-detection.html
     