
> In terms of accessibility, I don't think it should be a blocker on such tools existing

> I think that we should solve for the former (which is arguably much easier and cheaper to do) before the latter (which is barely even studied).



I'm not certain which two things you're referring to by former/latter, so here are the readings I can see:

"solve [data privacy] before [solving accessibility of LLM-based therapy tools]": I agree - the former seems a more pressing issue and should be addressed with strong data protection regulation. We shouldn't allow therapy chatbot logs to be accessed by police and used as evidence in a crime.

"solve [accessibility of LLM-based therapy tools] before [such tools existing]": It should be a goal to improve further, but I don't think it makes much sense to prohibit the tools based on this factor when the existing alternative is typically less accessible.

"solve [barriers to LLM-based therapy tools] before [barriers to human therapy]": I don't think blocking progress on the latter would make the former happen any faster. If anything I think these would complement each other, like with a hybrid therapy approach.

"solve [barriers to human therapy] before [barriers to LLM-based therapy tools]": As above I don't think blocking progress on the latter would make the former happen any faster. I also don't think barriers to human therapy are easily solvable, particularly since some of it is psychological (social anxiety, or "not wanting to be a burden").



