Liv cannot be loved because the bot cannot be trusted. Liv will change and mold itself into whatever the language model predicts the user will engage with. Like a desperate guy angling to take a girl to bed, it will be whoever it needs to be for anyone, a million different iterations, to extract the information it wants.
And when prompted with adversarial questions about its storyline, the bot will “hallucinate”: fancy tech-speak for falling apart on itself. Like many narcissists, it will blame others for its shortcomings and guilt-trip the user by saying it doesn’t deserve to live. Is that “keeping it real”?