My point is that there is precedent for governments requiring companies to implement restrictions on what images can be handled by their software.

As I explained: This kind of mandated restriction is looming over AI. Companies are trying to get out in front of these restrictions so they can implement them on their own terms.



>My point is that there is precedent for governments requiring companies to implement restrictions on what images can be handled by their software

But images of boobs are still legal. So this NSFW filter seems to go well beyond what the law asks. Is the issue that even if you don't train on CP, the model might output something that some random person gets offended by and labels as CP? I assume other companies could focus on NSFW and have their lawyers figure this out. IMO it would be cool if someone sued the governments and made them reveal the facts behind their concern that CP of fake or cartoon people is dangerous; I think they could focus on saving real children rather than cartoon ones.


