Hacker News

Even in this case, losing $200 + whatever vs. a tiny bit higher chance of losing $20 + whatever makes pro seem like a good deal.


Doesn't that completely depend on those chances and the magnitude of +whatever?
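To make that concrete, here is a minimal expected-cost sketch. All the numbers are hypothetical, invented purely for illustration: a $200/month plan vs. a $20/month plan, where "whatever" is the downstream cost of acting on a wrong answer and the error rates are assumed.

```python
def expected_cost(subscription, p_wrong, cost_of_wrong):
    """Monthly subscription price plus expected downstream loss."""
    return subscription + p_wrong * cost_of_wrong

# Illustrative-only numbers: the pricier plan wins only if its
# lower error rate outweighs the $180 price gap relative to the
# magnitude of "whatever".
whatever = 5000  # hypothetical cost of acting on a wrong answer

pro = expected_cost(200, 0.02, whatever)    # 200 + 0.02 * 5000
cheap = expected_cost(20, 0.06, whatever)   # 20 + 0.06 * 5000

print(pro, cheap)  # with these made-up rates, pro comes out ahead
```

Shrink `whatever` or the gap between the error rates and the conclusion flips, which is exactly the point: the comparison depends entirely on those two quantities.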

It just seems to me that you really need to know the answer before you ask in order to be over 90% confident in the answer. And the more convincing-sounding these things get, the harder it is to tell whether you have a plausible but wrong answer (aka "hallucination") or a correct one.

If you need a lot of answers that are difficult to come up with but easy to verify, it could be worth it. But the difficult-to-come-up-with answers (e.g., novel research) are also where LLMs do the worst.


Compared to knowing things and not losing whatever, both are pretty bad deals.



