
Are we being pranked? I just followed the steps but the image output from my prompt is just a single frame of Rick Astley...

EDIT: It was a false positive (honest!) on the NSFW filter. To disable it, edit txt2img.py around line 325.

Comment this line out:

    x_checked_image, has_nsfw_concept = check_safety(x_samples_ddim)

And replace it with:

    x_checked_image = x_samples_ddim
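
If you'd rather not edit the call site, an alternative (just a sketch; it assumes downstream code only uses the returned tuple) is to replace the body of the check_safety helper with a pass-through stub:

    def check_safety(x_image):
        # Pass-through stub: return the batch untouched and flag nothing
        # as NSFW, keeping the (images, flags) tuple shape the caller expects.
        return x_image, [False] * len(x_image)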


IIRC from reading the code, that means the NSFW filter kicked in.

Change your prompt, or remove the filter from the code.


Haha, busted!


To be fair, the reason the filter is there is that if you ask for a picture of a woman, Stable Diffusion is pretty likely to generate a naked one!

If you tweak the prompt to explicitly mention clothing, you should be OK though.
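
For example (the prompt here is just illustrative; the flags mirror the repo's README example):

    python scripts/txt2img.py --prompt "a portrait of a woman in a red winter coat" --plms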


Wow, is that true? I’ve never heard of a more textbook ethical problem with a model.


Safari blocking searches for "asian" probably had more impact: https://9to5mac.com/2021/03/30/ios-14-5-no-longer-blocks-web...


It's an ethical problem with our society, not the model.


If you consider that the training set includes all Western art from the last few centuries, it’s not too surprising. There’s an awful lot of nudes in that set, and most of them are female.


If you open up the txt2img and img2img scripts, there is a content filter. If your prompt generates anything that gets detected as "inappropriate", the image is replaced with Rick Astley.

Removing the censor should be pretty straightforward: just comment out those lines.
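
For the curious, the helper in scripts/txt2img.py looks roughly like this (paraphrased from memory, not verbatim; safety_feature_extractor, safety_checker, numpy_to_pil, and load_replacement are all defined in the same script, and load_replacement is the bit that pastes in the Rick Astley frame):

    def check_safety(x_image):
        # Run the CLIP-based safety checker over the generated batch.
        safety_checker_input = safety_feature_extractor(
            numpy_to_pil(x_image), return_tensors="pt")
        x_checked_image, has_nsfw_concept = safety_checker(
            images=x_image, clip_input=safety_checker_input.pixel_values)
        # Swap any image flagged as NSFW for the replacement frame.
        for i in range(len(has_nsfw_concept)):
            if has_nsfw_concept[i]:
                x_checked_image[i] = load_replacement(x_checked_image[i])
        return x_checked_image, has_nsfw_concept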


It bothers me that this isn't just configurable. Why would they not want to expose this as a feature?


Plausible deniability


When the safety checker detects NSFW content, the script replaces the output with a frame of Rick Astley.


It's kind of amazing that ML can now intelligently rick roll people.

I think it would be awesome to update the rickroll feature to the following:

Automatically re-run img2img with a text prompt like "all of the people are now Rick Astley" at low strength, so it can adjust the faces but not change the nudity!!!1
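
Roughly like this (a sketch; the flags follow the repo's img2img example, but the init image path and strength value are made up):

    python scripts/img2img.py --prompt "all of the people are now Rick Astley" --init-img flagged.png --strength 0.3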


Hah, it would be hilarious if it generated all the nudity you wanted - but with Rick Astley's face on every naked person!


To be fair, the developers added this "feature", and it can easily be disabled in the code. The ML just says "this might be NSFW".


Same thing happened to me, which is especially odd as I literally just pasted the example command.


It has a lot of false positives. A lot of my portraits of faces were marked as NSFW. Possibly it's detecting the proportion of the image that's skin-colored?


Unrelated to Stable Diffusion, but I was showing DALL-E to my sister last night, and a prompt with "huge rubber tree" set off the TOS violation filter.

AI alignment concerns are definitely overblown...



