Hacker News

Anyone who "talks" to an AI (for personal reasons rather than as a tool) should have their head examined. The fiction that an AI is someone you can trust talking to is already too much to believe.


In other words, vulnerable people should have access to therapy


It's very common (even the norm, perhaps) for people to share personal details with total strangers in everyday settings (a bus or train seatmate, a taxi driver, whatever) that they wouldn't share with family or friends.

I don't think they need their brain examined.


Even a stranger is still a better option, since they are, by definition, human. Moreover, you can be almost sure that a stranger will not sell your deepest secrets to the highest bidder or exploit that information to get something from you later.


I “talk” with an LLM to help me learn a language (quite useful actually), and my partner quipped, “I wonder how long until it tells you to leave me.”


If you continue, there is a high probability it will do exactly that. It is a known phenomenon and easy to understand: if you give the AI a female name, it will model your interaction on fictional conversations between men and women in its training data, and many of those fictional conversations have a romantic context.


It'll be interesting if it does, since I treat it like a machine that's teaching me a language.



