Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

What if instead of a license most won’t respect, you include a poison pill in the repo or other code storage to poison the model?

https://news.ycombinator.com/item?id=45529587

https://news.ycombinator.com/item?id=45533842



This should be about get permission from open-source developers before feeding their years of work into AI. I think we should not believe what Anthropic, OpenAI, Meta, Google tells.

We should move to local LLMs.


How do local LLMs help?




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: