
How would they know it's a false positive and not a CSAM image concealed in a document image that NeuralHash was able to "see"? It would be impossible to tell from a low-res picture with certainty.


My understanding is that NeuralHash is only supposed to be looking at visible pixels. Granted, I'm taking their word on that one, but I've not seen any evidence indicating that it looks for steganographically concealed images.


Not necessarily steganography. For example, if an image's contrast has been drastically lowered, the low-res derivative will look like a false positive, but it won't be.
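The contrast point can be illustrated with a toy "average hash" (a stand-in for illustration only; Apple's actual NeuralHash model is different and not being reproduced here): a linear contrast squeeze toward the midpoint preserves which pixels sit above the image mean, so a hash of this kind still matches even though the image looks nearly blank to a human reviewer of the low-res derivative.

```python
import numpy as np

def average_hash(img: np.ndarray) -> np.ndarray:
    """Toy perceptual hash: downsample to 8x8 block means,
    then threshold each block against the overall mean."""
    small = img.reshape(8, img.shape[0] // 8, 8, img.shape[1] // 8).mean(axis=(1, 3))
    return small > small.mean()

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(64, 64))

# Drastically lower contrast: squeeze pixel values toward the midpoint 128.
low_contrast = 128 + 0.05 * (img - 128)

# The hash survives, because a linear rescale preserves which blocks sit
# above the mean -- even though the image is now almost uniformly gray.
print(np.array_equal(average_hash(img), average_hash(low_contrast)))  # True
```

So, under this (hypothetical) hashing scheme, a flagged image could genuinely match the database while its reviewable derivative looks like harmless gray noise.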


Okay, but now you're positing a scenario that is entirely pointless. Seriously, what would be the point? If you know you need to conceal the images before putting them into your photo library, you surely know it's a stupid idea to put them there in the first place. It's all but impossible to find a supply of CSAM without being made acutely aware of its illegal status.

Here's what I don't get: who are these people importing CSAM into their camera roll? I for one have never felt the urge to import regular, legal porn into my camera roll. So why would anyone do that with stuff they know could land them in prison? Who the hell co-mingles their deepest darkest dirtiest secret amongst pictures of their family and last night’s dinner?

If someone wants to conceal their CSAM library, there are probably dozens of apps in the App Store that can store photos behind an additional layer of encryption.



