
> I treat LLMs like a fallible being, the same way I treat humans.

The issue is that LLMs aren't marketed that way. They're marketed as all-knowing oracles to people who have been conditioned to just accept the first result Google gives them.


