I’d be very careful about relying on GPT for anything health related; I’m not saying there can’t be benefits, just that the risks are far greater than in other uses.
Risky compared to what? Googling? Doing nothing? Waiting for a therapist? It’s extremely sensitive to human emotional dynamics, and it leans heavily toward nonviolent communication, a style most humans find very hard to sustain.
Personally I wouldn't use GPT as a therapist, but I've seen enough bad or useless therapists in my time to say it's worth a shot for most people, especially if you need help now.
As risky as any other health-related self-help, plus the added risk of unreliability.
When GPT proves itself reliably beneficial, therapists will use it or recommend it themselves. Until then it’s an experimental tool at best.
I would say self-help is already quite unreliable; a bit more unreliability doesn’t make it much worse.
The authority argument is pointless. For it to apply, the therapist must value the person’s wellbeing above their continued income. In theory they should, but it would take a lot to convince me, and I’d want to know what the incentive behind such a recommendation is. And to be clear, I’m not saying an LLM can be your therapist.