
Once again, the notion that everything on the device is scanned under Apple's system was never true.

Only photos you attempt to upload to Apple's iCloud are scanned. If you turn off iCloud Photos, NOTHING is scanned.

>Q: So if iCloud Photos is disabled, the system does not work, which is the public language in the FAQ. I just wanted to ask specifically, when you disable iCloud Photos, does this system continue to create hashes of your photos on device, or is it completely inactive at that point?

A: If users are not using iCloud Photos, NeuralHash will not run

https://techcrunch.com/2021/08/10/interview-apples-head-of-p...
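
To make that gating concrete, here's a minimal sketch of the behavior described in the interview (hypothetical names, illustration only; the sha256 stand-in for NeuralHash and the database shape are my assumptions, not Apple's code):

    import hashlib

    # Hypothetical stand-in for NeuralHash: the real system uses a perceptual
    # hash; sha256 is used here only to keep the sketch self-contained.
    def neural_hash(photo_bytes: bytes) -> str:
        return hashlib.sha256(photo_bytes).hexdigest()

    # Opaque, vendor-supplied match database (assumed shape, illustration only).
    KNOWN_HASHES: set[str] = set()

    def process_photo(photo_bytes: bytes, icloud_photos_enabled: bool) -> bool | None:
        # Per the interview: if iCloud Photos is disabled, NeuralHash never runs.
        if not icloud_photos_enabled:
            return None
        # Only upload candidates ever reach this point.
        return neural_hash(photo_bytes) in KNOWN_HASHES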



> Once again, the notion that everything on the device is scanned under Apple's system was never true.

That isn't what I said.

Also, that's not why most people are so upset. The main reason is that Apple has now proven the capability exists, so governments can now more easily compel them to scan for "extra things".

Prior to this, if a government asked Apple to scan someone's phone, Apple could respond with "we don't have that capability", and it would presumably be a tough legal battle to force a company to add a capability that doesn't exist.

This hurdle is now much lower. The effort has gone from "force Apple to design a new system for scanning phones" to "add a couple of hashes to the pre-existing database".
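
To put that marginal effort in code (an illustrative sketch with hypothetical names, not anything Apple actually ships): widening the system's scope is a data change, not an engineering project.

    def load_hash_database(source: str) -> set[str]:
        # Placeholder: in practice this is an opaque, third-party-supplied
        # blob whose contents Apple cannot audit.
        return set()

    blocklist = load_hash_database("child-safety")  # the stated purpose
    # The feared expansion is one line of data plumbing, not a new system:
    blocklist |= load_hash_database("whatever-a-government-demands")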

Also, expanding this from just iCloud upload candidates to the entire device is a very small leap now. I mean, the bad guys could just turn off iCloud, and we must think of the children...

Then you have Apple's "reassurance" that they won't comply with government requests to scan for additional things, which is completely moot considering Apple relies on a third-party database and has no control over, or insight into, what the hashes really represent.


The notion that scanning cloud data on device is somehow worse than doing the same thing on server is deeply flawed.

If you have a false positive on device, nothing is sent to Apple's servers. It takes several (possibly false) positives, accumulated past a threshold, to trigger a human review.

If you have a single false positive on server, that data is sitting there where it can be subpoenaed and abused.
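
For context on the threshold claim: Apple's published design reportedly reveals nothing, even to Apple, until on the order of 30 matches have accumulated, and enforces that cryptographically with threshold secret sharing. A plain counter, as in this hedged sketch, only approximates the behavior:

    MATCH_THRESHOLD = 30  # Apple's stated figure was on the order of 30

    def ready_for_human_review(accumulated_matches: int) -> bool:
        # Below the threshold, individual matches stay invisible; the real
        # design uses threshold secret sharing rather than a counter, so this
        # check models the behavior, not the mechanism.
        return accumulated_matches >= MATCH_THRESHOLD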

Also, recent history shows that Apple is willing to fight government demands to invade user privacy in court.


> Also, recent history shows that Apple is willing to fight government demands to invade user privacy in court.

I can only think of one instance where they did that (the San Bernardino shooter case), and that request was hugely overreaching: the FBI wanted Apple to compromise its software update signing process. Even then, Apple DID give the FBI access to the shooter's iCloud data; they only refused to subvert the update mechanism.

In fact this report suggests that Apple cooperating with the FBI when it comes to subpoenaing iCloud data is nothing new: https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...


> I can only think of one instance

You might want to Google it then. It’s well known that Apple has been asked and refused multiple times. It’s really easy to find. https://en.wikipedia.org/wiki/FBI–Apple_encryption_dispute

This is a big part of the reason people are surprised and concerned about the scanning program, because it seems like a departure from what Apple has said and done about privacy of iPhone data for the last decade.


This fact is well known and changes nothing. The problem is that the system exists at all. The fine print WILL change; it always does, and that's also a well-known fact.


By that logic, Google will definitely begin selling their treasure trove of user data to anyone with a checkbook, because the fine print WILL change.


Yes, that's a very reasonable assumption. I assume all information I give out will be shared beyond my control, unless the recipient promises in writing to protect it and would suffer proportionally for breaking that promise. In practice, this only happens when federal regulations apply (e.g., health care or banking).

If you want to rely on other people behaving a certain way in the future, either form a personal relationship or write up a contract.


It's completely within the realm of possibility that this could happen. A few bad quarters down the road and a leadership change might be all it takes.

Many of us are taking the long view, thinking in terms of decades of change given our current trajectory.

If not in our time, it could be in our children's time. This is an extremely dangerous system.


Google will maximize every method it has to monetize user data. They have done that in the past, and they will continue to do so in the future.

Collect data and monetize it. That is what Google is. They don't provide free email or analytics software out of the goodness of their hearts.


It's a difference in policy vs. technical capability. Currently the policy is to scan only when iCloud Photos is enabled, but the capability to scan at any and all times is just a policy change away.


No, it's a difference between scanning the files that users store in their respective clouds on-server or on-device.

Scanning on-device (where a single false positive cannot be subpoenaed and misused to incriminate their customers) is simply more private.

>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime

https://www.dailymail.co.uk/news/article-7897319/Police-arre...



