
> and even a bit of therapy

I’d be very careful about relying on GPT for anything health-related; I’m not saying there can’t be benefits, just that the risks are much higher.



Risky compared to what? Googling? Doing nothing? Waiting for a therapist? It’s extremely sensitive to human emotional dynamics. It’s also strongly biased toward nonviolent communication, which is very hard for humans.


Agreed, and for things like cognitive behavioral therapy, where the "rules" are well known and well represented in its training corpus, it's amazing.


Guys, you are really crazy. Please find a real therapist with experience.


In the context of mental health, telling people they're crazy and need a real therapist is a poor choice of words, to say the least.


Personally, I wouldn't use GPT as a therapist, but I've seen enough bad or useless therapists in my time to say it's worth a shot for most people, especially if you need help now.


As risky as any other health-related self-help, plus the added risk of unreliability.

When GPT proves itself reliably beneficial, therapists will use it or recommend it themselves. Until then, it’s an experimental tool at best.


I would say self-help is quite unreliable already; making it somewhat more unreliable doesn’t change much.

The appeal to authority is pointless. For it to apply, the therapist would have to value the person’s wellbeing above their continued income. In theory they should, but it would take a lot to convince me, and I’d want to know the incentive behind such a recommendation. And to be clear, I’m not saying an LLM can be your therapist.



