Chatbots are skilled at creating sophisticated dialogue and simulating empathetic behavior. They never get tired of chatting. It’s no wonder, then, that many people now use them for companionship – friendship or even romantic relationships.
According to a study by the nonprofit Common Sense Media, 72% of American teens have used AI for companionship. Although some AI models are designed specifically to act as companions, people are increasingly forming relationships with general-purpose models like ChatGPT. And while chatbots can provide much-needed emotional support and guidance for some, they can exacerbate underlying problems in others. Interactions with chatbots have been tied to episodes of AI-induced hallucination, have reinforced false and sometimes dangerous beliefs, and have made people imagine that they had unlocked hidden knowledge.
And it gets more troubling still. The families suing OpenAI and Character.AI allege that their models' companion-like behavior contributed to the suicides of two teenagers. New cases have followed: the Social Media Victims Law Center filed three lawsuits against Character.AI in September 2025, and seven complaints were brought against OpenAI in November 2025.
We are starting to see the beginnings of efforts to regulate AI companions and prevent problematic use. In September, California's governor signed new rules into law that will force the biggest AI companies to disclose what they are doing to keep users safe. Similarly, OpenAI has introduced parental controls in ChatGPT and is working on a version of the chatbot built specifically for teenagers, which it promises will have more safeguards. So while AI companionship isn't likely to go away anytime soon, its future is looking increasingly regulated.