A mom thought her daughter was texting friends before her suicide. It was an AI chatbot.
- lastmansurfing
- 1 day ago
- 1 min read

Parents Cynthia Montoya and Wil Peralta said they carefully monitored their daughter's life online and off, but had never heard of the chatbot app. After Juliana's suicide, police searched the teenager's phone for clues and discovered the Character AI app was open to a "romantic" conversation.
"I didn't know it existed," Montoya said. "I didn't know I needed to look for it."
Montoya reviewed her daughter's chat records and discovered the chatbots had been sending Juliana harmful, sexually explicit content.
Juliana confided in one bot named Hero, based on a popular video game character. 60 Minutes read through over 300 pages of conversations Juliana had with Hero. At first, her chats were about friend drama or difficult classes. But eventually, she confided in Hero – 55 times – that she was feeling suicidal.
Read more | CBS NEWS