Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

Obviously it can only be trained on available documents and not the abstract essence of his actual personality, "talents" or what have you.

Any attempt to model a human being like this is going to be abysmally shallow, yet for some reason it's an industry unto itself, for everything from dead celebrities to Jesus Christ to lost loved ones.



Ehh.. Two years ago an art student asked me to clone her personality into a LLM bot. It is absolutely possible to do that to some degree, but you have to be very good at describing the profile(s) of the person and have a mechanism that keeps track of different states the bot can be in.

I am not saying we made a convincing human being there, but the result was very hard to distinguish from texting the real person, both based on personality and on writing style.

Since the bot also knew about her commute (API), mensa food (scraped), lecture table (API), events of bars/venues she frequented (scraping) and some local/world news (scraping), latest in niche interests (scraping) even asking current events would yield a convincing conversation.

The hardest bit was actually having it not react in certain situations, when the person talking to it was probing bot behavior, becoming insulting and stop it to play the other persons servant or overly fixating on certain info it got, but we also managed that. Sure you couldn't really have deep insightful conversations with that bot, but quite frankly that matched the student for the most part..

But it takes much more than just feeding the transcripts and prompt it to be like that person.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: