Anyone who "talks" to an AI (for personal reasons rather than as a tool) should have their head examined. The fiction that an AI is someone you can trust talking to is already too much to believe.
It's very common (even the norm, perhaps) for people to share personal details with total strangers in public settings (a bus, a train, a taxi driver, whatever) that they wouldn't share with family or friends.
Even a stranger is still the better option, since a stranger is, by definition, human. Moreover, you can be almost certain that a stranger will not sell your deepest secrets to the highest bidder, or exploit that information later to get something from you.
If you continue, there is a high probability it will do exactly that. It is a known phenomenon, and an easy one to understand: when you give the AI a female name, it models the interaction on fictional conversations between men and women that it has seen in its training data, and a lot of those fictional conversations have a romantic context.