
This is false. Up until now, iCloud photos have been encrypted in transit and at rest, but they can be decrypted with Apple's own key. There is no E2EE.

Apple has been talking a lot about putting all ML and data handling on your own device, so that without E2EE the data does not need to leave your device.



I guess you are correct; they are not E2E encrypted. So part of my reasoning does not hold up: they could do this on the server if they wanted.

That I’d prefer to be honest.


Apple does not scan iCloud for CSAM because they feel that blanket scanning of iCloud violates end-user privacy.

That’s the whole reason for their research into differential privacy.


> Apple has confirmed that it’s automatically scanning images backed up to iCloud to ferret out child abuse images.

https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-...


Read the original Telegraph article cited:

https://www.telegraph.co.uk/technology/2020/01/08/apple-scan...

>Update January 9, 2020

>This story originally said Apple screens photos when they are uploaded to iCloud, Apple's cloud storage service. Ms Horvath and Apple's disclaimer did not mention iCloud, and the company has not specified how it screens material, saying this information could help criminals.

This confirms the reporting from the NYT:

https://www.nytimes.com/2021/08/05/technology/apple-iphones-...

>U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users



