Higher Use of Chatbots Like ChatGPT Linked to Increased Loneliness, OpenAI Study Finds

Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people, according to new research from OpenAI in partnership with MIT.

By: Bloomberg
Update: 2025-03-22 08:09 GMT
Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people, according to new research from OpenAI in partnership with the Massachusetts Institute of Technology.
Those who spent more time typing or speaking with ChatGPT each day tended to report higher levels of emotional dependence on, and problematic use of, the chatbot, as well as heightened levels of loneliness, according to research released Friday. The findings were part of a pair of studies conducted by researchers at the two organizations and have not been peer reviewed.
The launch of ChatGPT in late 2022 helped kick off a frenzy for generative artificial intelligence. Since then, people have used chatbots for everything from coding to ersatz therapy sessions. As developers like OpenAI push out more sophisticated models and voice features that make them better at mimicking the ways humans communicate, there is arguably more potential for forming parasocial relationships with these chatbots.
In recent months, there have been renewed concerns about the potential emotional harms of this technology, particularly among younger users and those with mental health issues. Character Technologies Inc. was sued last year after its chatbot allegedly encouraged suicidal ideation in conversations with minors, including one 14-year-old who took his own life.
San Francisco-based OpenAI sees the new studies as a way to get a better sense of how people interact with, and are affected by, its popular chatbot. “Some of our goals here have really been to empower people to understand what their usage can mean and do this work to inform responsible design,” said Sandhini Agarwal, who heads OpenAI’s trustworthy AI team and co-authored the research.
To conduct the studies, the researchers followed nearly 1,000 people for a month. Participants had a wide range of prior experience with ChatGPT and were randomly assigned a text-only version of it or one of two different voice-based options to use for at least five minutes per day. Some were told to carry out open-ended chats about anything they wanted; others were told to have personal or non-personal conversations with the service.
The researchers found that people who tend to get more emotionally attached in human relationships and who are more trusting of the chatbot were more likely to feel lonelier and more emotionally dependent on ChatGPT. They did not find, however, that a more engaging voice led to worse outcomes.
In the second study, researchers used software to analyze 3 million user conversations with ChatGPT and also surveyed people about how they interact with the chatbot. They found that very few people actually use ChatGPT for emotional conversations.
It’s still early days for this body of research, and it remains unclear how much chatbots may cause people to feel lonelier versus how much people who are already prone to loneliness and emotional dependence may have those feelings exacerbated by chatbots.
Cathy Mengying Fang, a study co-author and MIT graduate student, said the researchers are wary of people using the findings to conclude that more usage of the chatbot will necessarily have negative consequences for users. The study didn’t control for the amount of time people used the chatbot as a main factor, she said, and didn’t compare results against a control group that doesn’t use chatbots.
The researchers hope the work leads to more studies on how humans interact with AI. “Focusing on the AI itself is interesting,” said Pat Pataranutaporn, a study co-author and a postdoctoral researcher at MIT. “But what is really critical, especially when AI is being deployed at scale, is to understand its impact on people.”