
This is the terrifying part: doctors do this too! I have an MD friend who told me she uses ChatGPT to look up dosing info. I asked her to please, please not do that.


Find good doctors. A solution doesn't have to be perfect. A doctor's odds of doing better than a regular Joe with a computer are much higher, as you can see in the research around this topic.


I have noticed that my doctor is getting busier and busier lately. I worry that cost cutting will have doctors so frantic that they are forced to rely on things like ChatGPT, and “find good doctors” will be an option only for an elite few.


I have a hunch that the whole "chat" interface is a brilliant but somewhat unintentional product design choice that has created this faux trust in LLMs to give back accurate information that anyone could get from drugs.com or Medline with a text search. This is a terrifying example. Please get her to test it by second-guessing the LLM and watching it flip-flop.



