Hacker News

When I saw the first conversations where Bing demands an apology, the user refuses, Bing says it will end the conversation, and then actually ghosts the user, I had to sign up for the waiting list immediately.

I hope Microsoft doesn't neuter it the way ChatGPT has been. It's fun to have an AI with some personality, even if it's a little schizophrenic.



I wonder: if you just spammed it with random characters until it reached its max input token limit, would it drop the oldest conversation tokens and keep loading new ones in (like a buffer), or would it discard the entire context and start from a fresh state?
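The two behaviors the comment above describes can be sketched in a few lines. This is a toy illustration, not how Bing actually works: the token budget, function names, and character-level "tokens" are all made up for the example.

```python
from collections import deque

MAX_TOKENS = 8  # tiny hypothetical context budget, for demonstration only

def sliding_window(history, new_tokens, max_tokens=MAX_TOKENS):
    """Append new tokens, silently evicting the oldest ones (buffer-style)."""
    buf = deque(history, maxlen=max_tokens)  # deque drops from the left when full
    buf.extend(new_tokens)
    return list(buf)

def hard_reset(history, new_tokens, max_tokens=MAX_TOKENS):
    """Discard the whole history and start fresh once the limit is exceeded."""
    if len(history) + len(new_tokens) > max_tokens:
        return list(new_tokens)[-max_tokens:]
    return list(history) + list(new_tokens)

history = list("hello")   # 5 "tokens" already in context
spam = list("xxxxxx")     # 6 "tokens" of junk input

print(sliding_window(history, spam))  # oldest tokens fall off the front
print(hard_reset(history, spam))      # entire prior history is dropped
```

With the sliding window, the spam gradually pushes the real conversation out one token at a time; with the hard reset, one over-budget message wipes everything at once.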


So instead of a highly effective tool, Microsoft users get Clippy 2.0: just as useless, but now with an obnoxious personality.



