New report shines a light on how teenagers are using AI companions
Bangor University researchers have shone a light on how teenagers are using AI companions.
Their report looked at young people who are integrating AI chatbots such as ChatGPT and Character.ai into their daily lives, providing an in-depth analysis of behaviours, motivations and perceptions.
They examined usage patterns, levels of trust, satisfaction and the social impact of these emerging technologies on teenagers aged 13 to 18.
The survey found that most teenagers (56%) believe that AI systems used for companionship can think or understand. Crucially, however, 77% believe it is false that AI companions can “feel”.
The report, published by the Emotional AI Lab and funded by Responsible AI UK, shows that AI companions are already a meaningful presence in the lives of many UK teenagers.
The researchers say that for most users, these systems do not appear to be replacing human friendships or causing widespread harm. Instead, they seem to be used pragmatically: for advice, curiosity and entertainment, and, for a significant minority, as a confidant for serious matters.
At the same time, a small but notable group of teenagers is engaging more intensely with AI companions, a concern given the deaths associated with OpenAI’s ChatGPT and Character.ai.
The data shows that 96% of respondents have used at least one of the survey’s 31 listed applications as an AI companion.
The representative survey data of 1,009 teenage users shows that 53% of respondents express moderate to complete trust in the information and advice they receive from AI companions. In contrast, only 13% express moderate or complete distrust, and 34% are neutral.
When comparing conversations with AI to those with real-life friends, 44% of respondents find conversations with AI companions to be a little or a lot less satisfying. Almost a third (32%) report them as more satisfying, while 24% find the experiences to be about the same.
Over two thirds (67%) of respondents feel that using AI companions is “not affecting your human friendships at all”, while a quarter (26%) believe AI is actually “helping you make more human friendships”, potentially by allowing practice of social skills. However, a small minority (7%) believe that AI is “replacing some of your human friendships”.
A significant finding from the survey is the extent to which teenage users are turning to AI for important discussions. While 44% state they would never choose an AI over a real person for a serious matter, 52% of respondents have confided in an AI companion about something important or serious at least once.
Users value the freedom to express themselves without fear of criticism or social repercussions, and they perceive AI as a safe outlet for thoughts they might not share with peers. The 24/7 accessibility of AI companions is also a significant attraction.
Andrew McStay, Professor in Technology & Society at Bangor University and Director of the Emotional AI Lab, said, “AI systems are now uncannily clever. Whereas only a few years ago chatbots and voice assistants never seemed to ‘get’ what people meant, today’s AI systems are fluent, persuasive, and at times humanlike – even seeming to empathise. Though most teenagers do not believe AI companions can feel, a majority do believe they can think or understand. This distinction matters. Our results show that they are attributing mind-like properties and treating AI companions as intentional agents. This perception helps explain why teens trust AI advice, disclose personal information, and sometimes experience AI interactions as more satisfying than conversations with friends. It also underlines the importance of emulated empathy.”
Vian Bakir, Professor of Political Communication and Journalism at Bangor University, said, “Overall, the findings support an evidence-based governance approach. We support efforts to prevent extreme cases reported from the USA, but we also caution against defaulting to moral panic. We say this not to weaken efforts to prevent harm, but to ensure that the broader finding that AI companions are relational technologies is not missed. There is a broader need to understand that teens are living in a world where dominant media-technologies are by default empathic; and the impact of this environment is not yet known. Continued longitudinal research and policy surveillance will be needed as systems and usage evolve, but the present evidence is clear. Teenagers are in relationships with AI systems and most of them believe that the answer is, yes, AI systems used for companionship understand them.”