Hacker News

> Don't upload these harmless images to iCloud as Apple will assume it's Child Porn (CSAM)

This is not true. They may match the hash, but they will not match the visual derivative.

The system is not as easily fooled as you think.
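To illustrate the general idea of perceptual-hash matching (this is a generic average-hash sketch, not Apple's NeuralHash, and all images and values here are made up): two visually near-identical images produce nearly identical hashes, while a different image lands far away, which is why a second check against a low-resolution "visual derivative" can still reject a collision on the hash alone.

```python
# Minimal average-hash sketch (illustrative only; NOT Apple's NeuralHash).
# An image hashes to a bit string; similarity is Hamming distance.

def average_hash(pixels):
    """pixels: 2D list of grayscale values. Returns a bit string:
    1 where the pixel is >= the mean brightness, 0 otherwise."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return ''.join('1' if p >= avg else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

img_a = [[10, 200], [12, 198]]   # hypothetical 2x2 grayscale image
img_b = [[11, 201], [13, 199]]   # visually near-identical to img_a
img_c = [[200, 10], [198, 12]]   # visually different image

print(hamming(average_hash(img_a), average_hash(img_b)))  # → 0
print(hamming(average_hash(img_a), average_hash(img_c)))  # → 4
```

A real system compares hashes of full-size images but then re-checks a flagged match against the stored visual derivative, so a crafted hash collision still has to survive that second, independent comparison.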



We've learned from YouTube how well content matching works. Apple will be better, right? Right?


Yes. We know that from the papers explaining how it works.


> The system is not as easily fooled as you think.

I would like to believe that is true, but the negative consequences of generating even one false positive are enough of a reason not to upload any image at all.


I tend to disagree here...

Based on Apple's documentation, they wait for *several* matches, *not just one* (we don't know exactly what *several* means, but I don't expect it to be <= 3 pictures). Once that threshold is reached, a human review team examines the positive matches and decides whether the images are actually CSAM.

If yes, after that manual review, the authorities are called.
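The protocol described above can be sketched as a simple decision function (the threshold value and function names are hypothetical; Apple's real design uses threshold secret sharing and private set intersection, not a plaintext counter):

```python
# Hypothetical sketch of the match-threshold + human-review flow.
# THRESHOLD and all names are illustrative assumptions, not Apple's values.

THRESHOLD = 30  # assumed number of matches required before any review

def review_account(match_count: int, human_confirms_csam) -> str:
    """Decide the outcome for an account given its number of hash matches.
    human_confirms_csam: callable simulating the manual review verdict."""
    if match_count < THRESHOLD:
        # Below the threshold nothing is flagged and no review happens.
        return "no action"
    if human_confirms_csam():
        # Manual review confirmed the matches are real CSAM.
        return "report to authorities"
    # Manual review found a false positive; the case is closed.
    return "case closed"

print(review_account(2, lambda: True))     # → no action
print(review_account(31, lambda: False))   # → case closed
print(review_account(31, lambda: True))    # → report to authorities
```

The point of the sketch is that a single false positive never reaches a human, and even past the threshold a human decision sits between the matches and any report.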


Hypothetically, what happens if a viral event persuaded people to mass-upload these images? Would Apple modify their review protocol ad hoc?


Nothing because these files won’t trigger a match.


you mean like how people got so fed up with ToS-mandated arbitration that they all decided to file motions simultaneously

it worked that time...


The consequence is that someone at Apple reviews the case, notices that it's a false positive, and closes the case.



