
That’s an odd claim, given that we have so-called thinking models. Is there a specific way you have in mind in which LLMs are not thinking processes?


I can call my cat an elephant; it doesn't make him one.



