Apple scans iCloud photos to check for child abuse January 8, 2020 https://www.telegraph.co.uk/technology/2020/01/08/apple-scans-icloud-photos-check-child-abuse/
Oh, I thought that stuff was securely end-to-end encrypted before being uploaded to iCloud. But if Apple can detect even known images of child abuse in iCloud, that's obviously not the case. That's just a complicated way of admitting that their end-to-end encryption isn't secure. To be clear, I'm not saying that child abuse is OK. It's just that with this vulnerability, you're stuck trusting Apple instead of trusting the encryption methodology.
Search warrant shows how Apple tackles child abuse images on iCloud and email February 11, 2020 https://9to5mac.com/2020/02/11/child-abuse-images/
The core defect is that Apple can examine anything that's supposedly encrypted. It's a backdoor, and a dishonestly concealed one.
I could try to believe them when they claim they use homomorphic encryption, but that doesn't explain their ability to inspect the images. So, yeah, that sounds like improperly implemented encryption.
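For what it's worth, here's what homomorphic encryption actually buys you: you can compute on ciphertexts without decrypting them. A toy illustration using textbook RSA's multiplicative homomorphism (the tiny parameters are purely for demonstration and wildly insecure; nothing here is Apple's actual scheme):

```python
# Textbook RSA is multiplicatively homomorphic: Enc(a) * Enc(b) = Enc(a * b).
# Toy parameters only (insecure) -- the point is that a product is computed
# "under encryption", without the computing party ever seeing a or b.
p, q = 61, 53
n = p * q                          # modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 9
combined = (enc(a) * enc(b)) % n   # multiply ciphertexts only
assert dec(combined) == a * b      # decrypts to 63, the product of the plaintexts
```

So genuine homomorphic matching could, in principle, flag images without handing Apple the plaintext. The commenter's point stands, though: if Apple can *inspect* the images themselves, something other than that is going on.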
Well, I think that they're just encrypting stuff to both the user's key and theirs. But even if they were using homomorphic encryption, that would be just as poor a choice, from the user's perspective.
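The "encrypting to both keys" idea is just standard multi-recipient hybrid encryption: one random session key encrypts the photo, and that session key is then wrapped separately for each party who should be able to read it. A minimal stdlib-only sketch (the XOR-with-hash "cipher" is a stand-in for real AES/RSA, and all key names are hypothetical):

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream derived from SHA-256 (illustration only, not a real cipher).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Two long-term keys: the user's and the provider's (stand-ins for public keys).
user_key = secrets.token_bytes(32)
provider_key = secrets.token_bytes(32)

# One session key encrypts the photo; the session key itself is wrapped twice,
# once per recipient.
session_key = secrets.token_bytes(32)
photo = b"JPEG bytes..."
ciphertext = xor(photo, keystream(session_key, len(photo)))

wrapped_for_user = xor(session_key, keystream(user_key, 32))
wrapped_for_provider = xor(session_key, keystream(provider_key, 32))

# The provider recovers the session key without ever touching the user's key:
recovered = xor(wrapped_for_provider, keystream(provider_key, 32))
assert recovered == session_key
assert xor(ciphertext, keystream(recovered, len(photo))) == photo
```

If the service holds one of the wrapping keys, the data is still "encrypted at rest" in a technical sense, but it isn't end-to-end: the provider can decrypt at will.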
It may be much worse than it initially sounds. You would, presumably, be able to recognise repeated hashes even if you weren't able to decrypt them. Then, using techniques like those used in the cracking of Enigma, where you guess some repetitive plaintext (a "crib"), you might be able to recover at least a session key. Point being, if you can get a known hash of the "text", you may well be able to guess and widen.
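The first half of that worry is easy to demonstrate: any deterministic fingerprint leaks equality, so an observer who only sees hashes can still spot repeats and match against a known-bad list. A sketch (the plain SHA-256 here is a crude stand-in for a perceptual hash like PhotoDNA, which tolerates small image changes; the data is made up):

```python
import hashlib

def content_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual image hash; real systems are fuzzier,
    # but the equality leak is the same.
    return hashlib.sha256(image_bytes).hexdigest()

# A blocklist of hashes of known-bad images (hypothetical data).
known_bad = {content_hash(b"known contraband image")}

uploads = [b"holiday photo", b"known contraband image", b"holiday photo"]

# The observer never sees plaintext, only hashes -- yet it learns a lot:
seen: dict[str, int] = {}
for img in uploads:
    h = content_hash(img)
    seen[h] = seen.get(h, 0) + 1

repeats = [h for h, n in seen.items() if n > 1]   # same photo uploaded twice
flagged = [h for h in seen if h in known_bad]     # membership in the blocklist
assert len(repeats) == 1 and len(flagged) == 1
```

That equality leak is exactly the foothold a guess-and-widen attack wants: every confirmed match gives you known plaintext to lever against the rest of the traffic.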