
> Apple's new system is scanning personal property that doesn't belong to them and isn't yet in their cloud.

Apple's new system only scans photos you attempt to upload to their cloud.

Nothing else is scanned.

Scanning the files on the server, the way Google and Microsoft do it, means that false-positive data is lying around where it can be subpoenaed and used to incriminate innocent people.

>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime

https://www.dailymail.co.uk/news/article-7897319/Police-arre...



>Apple's new system only scans photos you attempt to upload to their cloud.

And what if in the future they decide they need to scan more than images going to the cloud? What if there is some huge epidemic of child abuse or some other terrible thing and Apple decides they need to do more?

Once you open Pandora's Box you can't close it.


What if in the future Google starts selling your location data to anyone willing to write them a check?

Once you open the Pandora's Box of collecting location data, you can't close it.


Both will happen. Basically, Apple is able to scan data on your phone, and Google is able to scan data on their servers.


It doesn't matter if the scan is conditional.

It matters that the capability is there.


The capability has always been there. It is only now described in a way that most people understand. Speculation about them "doing something hidden" is just as valid as it was before.

In reality, we can only be mad when they are publicly making things worse in black-box systems, not about something that is a "policy change" away. Let's be mad when they actually change that policy.


Your whataboutism is profoundly unhelpful here.

Yes, other companies are doing bad things, and they should be stopped.

That doesn't by any stretch of the imagination mean that Apple should be allowed to do something even worse.


Other companies aren't "doing bad things"; they are scanning the contents of their cloud services in a much more user-hostile way.

Keeping that data on their server means it can be subpoenaed and misused.


I mean, we can split hairs over the words to use, but ultimately "immoral and unethical things are being done by big companies that hold all your stuff". The sentiment is the same.

What I'm getting at is that the things Google and Microsoft are doing are entirely irrelevant to the conversation at hand.

Apple is going to compromise your device's privacy in the name of child safety, and will - invariably - eventually cave to pressure to extend that capability well beyond its originally well-meaning use case.

Stop bringing up what other companies are doing - it is, as I said, entirely irrelevant.


> What I'm getting at is that the things Google and Microsoft are doing are entirely irrelevant to the conversation at hand.

It is not. Industry practices are entirely relevant.


So what is your point, then? Is it that Apple's punch in the gut here, while bad/wrong, is beyond criticism or outrage because if you use Google, you'll get a slap to the face? Or is it that Apple actually isn't doing anything wrong simply because, in your view, other companies engage in roughly analogous behavior?


My point is that Google and Microsoft have been scanning everything in your account (including data like emails and the files you mirror to their cloud drive) and have been doing so for the past decade.

Apple has announced a plan to scan only those photos you upload to iCloud Photos, and nothing else.

Further, Apple's scans will occur on-device, where a single false positive cannot be misused to incriminate you by anyone who can get a subpoena, because Apple's servers won't hold any record that a match occurred.

Google and Microsoft's systems are much more invasive and much less private.
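
To make the on-device threshold point concrete, here is a rough sketch in Python of how client-side, threshold-gated matching could work. This is only an illustration of the idea described above, not Apple's actual design or API: the names (perceptual_hash, scan_before_upload, KNOWN_BAD_HASHES, MATCH_THRESHOLD) and the threshold value are all assumptions.

    # Hypothetical sketch: on-device hash matching with a reporting threshold.
    # Nothing here is Apple's real implementation; names and values are made up.

    from typing import Iterable

    MATCH_THRESHOLD = 30        # assumed: nothing is reportable below this count
    KNOWN_BAD_HASHES = set()    # assumed: hash database shipped to the device

    def perceptual_hash(image_bytes: bytes) -> int:
        """Placeholder for a perceptual hash; a stand-in, not a real algorithm."""
        return hash(image_bytes)

    def scan_before_upload(photos: Iterable[bytes]) -> bool:
        """Return True only if the on-device match count crosses the threshold.

        Individual matches stay on the device and are never stored server-side,
        so a single false positive leaves no record for a subpoena to reach.
        """
        matches = sum(1 for p in photos if perceptual_hash(p) in KNOWN_BAD_HASHES)
        return matches >= MATCH_THRESHOLD

The only point of the sketch is that matches below the threshold never leave the device, which is the claimed difference from server-side scanning, where every scan result exists on infrastructure that can be subpoenaed.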



