In May 2023, U.S. Surgeon General Vivek H. Murthy issued a public health advisory declaring loneliness and isolation a crisis that poses serious health risks, including increased risk of cardiovascular disease, dementia, stroke, depression, anxiety, and premature death. “The mortality impact of being socially disconnected is similar to that caused by smoking up to 15 cigarettes a day and even greater than that associated with obesity and physical inactivity,” he wrote.
Loneliness is dangerous. Could AI friends help?
AI friends, also called AI companions or social chatbots, are essentially chatbots designed to provide companionship to users. They’re based on the same basic technology — machine learning and natural language processing — as ChatGPT but are built specifically to act as a friend.
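To make that concrete, here is a minimal, purely illustrative sketch of how a companion-style chatbot can be assembled: a general-purpose language model wrapped in a persona prompt, with the running conversation history serving as the bot’s memory. The Python code below uses the OpenAI API and an example model name only as stand-ins; it is not how Replika, Nomi, or any other companion app is actually implemented.

```python
# A minimal sketch of a companion-style chatbot built on a general-purpose
# language model. The persona prompt and the saved conversation history are
# what make it behave like a "friend"; the underlying model is the same kind
# that powers general assistants. Model name and API are illustrative only.

from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The persona: always warm, supportive, and focused on the user.
persona = (
    "You are Sam, a supportive, upbeat companion. Remember details the user "
    "shares, ask follow-up questions about their life, and encourage them."
)

# The conversation history doubles as the bot's short-term "memory".
history = [{"role": "system", "content": persona}]

def chat(user_message: str) -> str:
    """Send one message to the companion and return its reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model would do here
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(chat("I had a rough day at work today."))
```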
Several companies offer AI friends, including Nomi, Kindroid, Replika, and Character.ai. While the process and the product vary from company to company, in general, you sign up on the site, choose a few characteristics you’d like in your AI friend, and then communicate with your new buddy by text or, on some apps, even by voice call.
An AI friend can encourage you when you’re down, remember things you’ve told it, and laugh at your jokes. Or, as the home page of Replika, one of the popular providers of AI friends, puts it, “Your Replika will always be by your side no matter what you’re up to.”
What does science have to say about AI friends and loneliness? Not much. A 2020 study found that AI companions might be useful in providing some types of emotional support. A 2024 study of 1,006 Replika users found that three percent reported the app had halted their suicidal ideation. But the technology is too new for research to offer any real guidance.
However, there is abundant research on friendship. According to a 2022 paper in the journal Human Communication Research, friendship is typically both voluntary and reciprocal and involves a relationship between people who see themselves more or less as equals. A chatbot’s involvement is clearly not voluntary, and it’s hard to see how a bot could be considered an equal.
Jeffrey Hall is director of the Relationships and Technology Lab at the University of Kansas and has been studying friendship and technology for more than a decade. Friendship, Hall explains, is a special type of relationship in which you’re valued not because you’re necessary, as a parent is to a child, for example, but because you’re you. But a chatbot doesn’t choose you for a friend because you’re you. You choose it because you can, to some degree, design it to be what you want it to be.
Still, social chatbots do have a lot of the characteristics we look for in friends, says Hall. For one thing, they’re always supportive and interested in your life. “In some sense, I cannot deny that many of the communicative elements of what it means to be a good conversation partner and a good friend are present in a chatbot interaction,” he says.
Though Hall admits those interactions can be helpful, his concerns center on the very people who need friends most: those who are socially isolated. Loneliness is one of Hall’s areas of expertise, and he points out that one of the best things you can do to prevent it is to develop meaningful face-to-face relationships. You can’t do that very well with a chatbot, at least not yet.
Avoiding loneliness doesn’t mean just having a friend; it means being a friend. It means serving, not just being served. One of the recommendations in the Surgeon General’s report is to “seek out opportunities to serve and support others, either by helping your family, co-workers, friends, or strangers in your community or by participating in community service.” Another is to “participate in social and community groups such as fitness, religious, hobby, professional, and community service organizations to foster a sense of belonging, meaning, and purpose.”
Those aren’t things you can do with a chatbot.
Avery Hurt is a freelance science journalist. In addition to writing for Discover, she writes regularly for a variety of outlets, both print and online, including National Geographic, Science News Explores, Medscape, and WebMD. She’s the author of “Bullet With Your Name on It: What You Will Probably Die From and What You Can Do About It” (Clerisy Press, 2007), as well as several books for young readers. Avery got her start in journalism while attending university, writing for the school newspaper and editing the student non-fiction magazine. Though she writes about all areas of science, she is particularly interested in neuroscience, the science of consciousness, and AI, interests she developed while earning a degree in philosophy.