New study reveals how many people are using AI for companionship — and the results are surprising

As AI has gotten smarter and more conversational, many would have you believe that people are turning en masse to chatbots for relationships, therapy and friendship. However, that doesn’t appear to be the case.

In a new report, Anthropic, the maker of Claude AI, has revealed some key information about how people are using chatbots. Analyzing 4.5 million conversations, the AI company painted a picture of how its tool is actually used. Anthropic says these conversations were run through a system that applies multiple layers of anonymization to protect users' privacy.

While the research produces a long list of findings, the key takeaway is that just 2.9% of Claude AI interactions are emotive conversations. Companionship and roleplay combined made up just 0.5%.

Anthropic found that the vast majority of people mainly use its AI tool for work tasks and content creation. Of the small share seeking affection-based conversations, 1.13% used the chatbot for coaching, and only 0.05% used it for romantic conversations.

This aligns with findings about ChatGPT. A study by OpenAI and MIT found that only a small number of people use AI chatbots for any kind of emotional engagement. Just as with Claude, the vast majority of ChatGPT users turn to it for work or content creation.

Should AI be performing these roles?

Even at these low numbers, there is fierce debate over whether AI should be used in these roles.

“The emotional impacts of AI can be positive: having a highly intelligent, understanding assistant in your pocket can improve your mood and life in all sorts of ways,” Anthropic states in its research blog post.

“But AIs have in some cases demonstrated troubling behaviors, like encouraging unhealthy attachment, violating personal boundaries, and enabling delusional thinking.”

The company is quick to point out that Claude isn't designed for emotional support and connection, but says it wanted to analyze how the chatbot performs in that role anyway.

In the analysis, Anthropic found that those who did use Claude this way were typically dealing with deeper issues such as mental health struggles and loneliness. Others used it for coaching, aiming to improve particular skills or aspects of their personality.

The report offers a balanced assessment of the situation, showing that there can be benefits in this area, but also detailing the risks. Chief among them is that Claude rarely pushes back in these conversations, instead offering endless encouragement, a pattern Anthropic itself acknowledges can be risky.

Alex Hughes
AI Editor

Alex is the AI editor at Tom's Guide. Dialed into all things artificial intelligence right now, he knows the best chatbots, the weirdest AI image generators, and the ins and outs of one of tech’s biggest topics.

Before joining the Tom’s Guide team, Alex worked for the brands TechRadar and BBC Science Focus.

He was highly commended in the Specialist Writer category at the BSMEs in 2023 and was part of a team that won best podcast at the BSMEs in 2025.

In his time as a journalist, he has covered the latest in AI and robotics, broadband deals, the potential for alien life, the science of being slapped, and just about everything in between.

When he’s not trying to wrap his head around the latest AI whitepaper, Alex pretends to be a capable runner, cook, and climber.
