The Illusion of AI Companionship for Young Males | The Tyee

For those growing up in the early 2000s, MSN Messenger’s SmarterChild was a popular chatbot — you could ask how its day was, tell it a joke or pretend to be its friend. SmarterChild was simply scripted and often predictable, offering a playful way for youth to simulate conversation online.
Two decades later, chatbots have evolved significantly. Artificial-intelligence-driven companion chatbots, which are designed to simulate human conversation, have become the leading use case of AI technologies in 2025. Youth increasingly turn to platforms such as Character.AI and Replika to find companionship. While general-purpose chatbots such as ChatGPT are often used for everyday tasks or information searches, companion chatbots are specifically designed to mimic personal relationships by simulating affection and adapting to the user’s personality. The sophistication of companion chatbots raises questions about how they shape youths’ emotional and social development.
Research from the Harvard Graduate School of Education points to early pressures on boys to conform to gendered norms of emotional and physical toughness. These pressures can limit boys’ development of empathy and emotional literacy, contributing to isolation. Over time, isolation and loneliness may lead to depression, violence and even radicalization. For young boys navigating these pressures, companion chatbots offer a space for self-understanding and expression: they can rehearse difficult conversations, articulate their emotions or seek reassurance. In an already strained mental health system, AI companionship offers low-barrier support. But while these benefits matter, the risks are real.