
I can pretty trivially generate something using a LLM that I doubt was in the training set or has ever been written down by humans. Do you have reason to believe that's not the case?


Yeah, the fact that you can't prove it.



