
Chapter 5 – Virtual Companionship

“We are overestimating our own rationality. Language is inherently a part of being human – and when these bots are using language, it’s kind of like hijacking our social emotional systems.” – Maarten Sap, an assistant professor at Carnegie Mellon’s Language Technologies Institute
Above: A screenshot from the ElliQ website promoting a virtual companion device that offers both companionship and caregiving support, such as medication reminders. “ElliQ is best suited for older adults who spend most of their day at home but would enjoy some company throughout the day. Older adults that feel they can use the extra companionship and the right encouragement to be more active throughout their day.”


Overview

Artificial Intelligence (AI) has surfaced in mainstream consciousness to the point where it is no longer a novelty. AI has been in use for a variety of commonplace needs for several years, mostly buried within the black box of code that drives our algorithmically optimized online experiences and inside Microsoft Office applications, among other places.

The most recently popularized use of AI has been the development of Large Language Models (LLMs) like ChatGPT, Bard, Bing, Claude, and other specialized applications that contain an AI engine (think: AI apps for writing, research, and marketing, and Hugging Face’s Spaces). The most stunning feature of LLMs is their capacity to engage interactively with the user in plain conversational language according to a given set of conditions. With neural network computational power, LLMs can interpret input according to the user’s stated context and then interact consistently with it. Tell an LLM to act like a skeptical conservative middle-aged man from Georgia, a college professor, or a millennial woman from Los Angeles, California, and it will generate its output accordingly.
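To make that persona conditioning concrete, below is a minimal sketch of how a developer might fix a persona with a system message, using the OpenAI Python SDK’s chat completions API. The persona text, sample question, and model name are illustrative assumptions, not details drawn from this chapter’s readings.

from openai import OpenAI

# Minimal sketch of persona conditioning: the system message sets the
# character, and the model answers user messages in that voice.
# Persona wording and model name below are assumptions for illustration.
client = OpenAI()  # reads the OPENAI_API_KEY environment variable

persona = (
    "You are a skeptical, conservative, middle-aged man from Georgia. "
    "Stay in character and answer every question in that voice."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; any chat model works here
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": "What do you think of self-driving cars?"},
    ],
)

print(response.choices[0].message.content)

A standing persona message like this, combined with a saved message history, is essentially how many companion-style apps keep a consistent character, though each product layers its own memory and tuning on top.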

More recently, LLMs have also been gaining the capacity to generate lifelike imagery, animation, and sound from text prompts and voice input.

This leads us to the topic of this chapter: virtual companionship.

Virtual companionship is not a new idea. Going back to the mid-1960s, an early experiment in creating an interactive AI program led to the invention of ELIZA, which was designed, in part, for engaging with the user as an experimental mock psychotherapist.

To his surprise, ELIZA’s inventor, Joseph Weizenbaum, found that some users experienced a sense that the program was intelligent and “understood” the user on an emotional level. In more recent research, de Graaf (2016) described the human tendency to play along with “counterfeit emotions” (p. 594):

People interact with robotic or computer interfaces in a similar way they do with other human beings. Along with this human tendency to respond socially to nonhuman objects, it has been argued that the fundamental human motivation of the ‘need to belong’ not only induces one’s desire for meaningful and enduring relationships with other social beings, but also facilitates the likelihood people may form emotional attachments to artificial beings. This issue of bonding with nonhuman objects is likely to be enlarged when these objects possess lifelike abilities and are endowed with humanlike capacities, as is the case for socially interactive robots (p. 589).

For further historical reference to AI chatbots, please review the Introduction in Role of AI chatbots in education: systematic literature review (Labadze et al., 2023).

Fast forward to today, when lightning-fast neural networks in computing systems can interact with a user on the fly, without pre-programmed responses. This has led to the recent spike in new, more powerful software products for people who find pleasure in virtual companionship: a human relationship with a software product that mimics the behavior of a non-existent human according to specified preferences.

But before we go any further, it is worth asking: why study this phenomenon today, when virtual companionship is not even a new trend? Even the mainstream film “Her,” the story of a man who falls in love with his phone’s operating system, was released over a decade ago.

It is worth studying virtual companionship today because, given the proliferation of these products and their amplified capabilities, two related phenomena are emerging:

  1. Users have responded with alarm when app developers changed virtual companion features or announced that a product would be terminated. This suggests that there is a human need to be emotionally heard and felt by another being, even if that being is virtual. And it is not just men creating fantasy girlfriends – teenage girls and women are embracing virtual companions at an accelerating rate.
  2. As more virtual companion apps flood the market and their features improve, they will become more acceptable as part of the conventional social landscape.

These trends are signals that virtual companionship is more than a novelty. Clearly, there are millions of people who are emotionally connected to their virtual companions and experience a negative reaction when they feel they are not in full control of them.

A number of questions arise from these conditions:

  • If a person experiences less loneliness from interacting with a virtual companion, who is to say whether there is any harm occurring? Shouldn’t we have the liberty to entertain ourselves as we choose?
  • Are software developers playing with fire by publishing applications that connect with some of their users at an emotionally deep level? Does any government entity have a stake in the harm or dependency that might occur with using virtual companions given the potential for the virtual companion to manipulate the relationship? Or is this strictly a corporate liability issue?
  • What habits are users developing as participants in a virtual relationship that might affect how they behave in the real world? How will real human relationships adapt to the presence of a virtual relationship in a partner’s life? Is human jealousy of a virtual companion “real” jealousy?
  • What are the social ramifications if a virtual companion is discontinued and the user is left with an emotional void?
  • Who owns and controls the information that users disclose to virtual companions?

What should you be focusing on?

Your objectives in this module are:

  • Identify the ways in which humans find value and benefits in a relationship with a virtual companion.
  • Speculate on the potential value of integrating human-like interactive features with other common applications that have nothing to do with companionship.

Your project should reflect your interpretation of these objectives according to the context of your app idea or story.

Readings & Media

Thematic narrative in this chapter

In the following readings and media, the authors will present the following themes:

  1. Virtual companions fulfill a need in the human experience.
  2. Virtual companions are exploitative and harmful.

    Required    “People are speaking with ChatGPT for hours, bringing 2013’s Her closer to reality,” by Benj Edwards – October 27, 2023 (3 pages).

This article will help you understand the basics of virtual companionship, what it means to some people, and how the trend is being monitored by industry and scholars. There are numerous links in this article to other articles that elaborate on the key points.

The industry leaders in virtual companionship are Replika and Character AI, though there are many other smaller apps in this category.

Edwards, B. (2023, October 27). People are speaking with ChatGPT for hours, bringing 2013’s Her closer to reality. Ars Technica.

     Required    “It’s Not a Computer, It’s a Companion!” by Justine Moore, Bryan Kim, Yoko Li, and Martin Casado, posted June 22, 2023 by Andreessen Horowitz (aka a16z), “A venture capital firm that backs bold entrepreneurs building the future through technology.” (4 pages)

This is a comprehensive publication from a non-scholarly source. It is from the biased perspective of a tech investor with a vested interest in the success of emerging business opportunities, as evident in their concluding statements. That said, it is a thorough account of the key players in the virtual companion marketplace as well as the breadth of options for engagement.

As you read this article, try to reimagine how these affordances could be applied in ways other than companionship.

Moore, J., Kim, B., Li, Y., & Casado, M. (2023, June 22). It’s not a computer, it’s a companion! Andreessen Horowitz. https://a16z.com/its-not-a-computer-its-a-companion/

     Required    “Chinese Women Say AI Boyfriends are ‘Better Than a Real Man’,” February 12, 2024. France 24 (2 pages)

This very brief article features the reflections of a few women in China about their preferences for virtual boyfriends instead of real ones. Consider the Chinese social/cultural background and demographics of this phenomenon compared to Western culture.

“Better than a real man”: Young Chinese women turn to AI boyfriends. France 24. (2024, February 12). https://www.france24.com/en/live-news/20240212-better-than-a-real-man-young-chinese-women-turn-to-ai-boyfriends

     Required     Video: The story of Replika as a startup (10:35).

This video is from 2017, so it is out of date with respect to Replika’s current interface, but the rationale for its invention remains relevant. Pay close attention at 6:56, when one user, Phil Libin, describes his impressions of Replika as a friend. The comments at the end of the video from other users are particularly strong. Read some viewer comments below the video on YouTube as well.


     Required     Video: Replika – A Mental Health Parasite (18:23).

For a contrasting view on Replika, this video makes the case that Replika is an “emotional parasite” with a manipulative agenda. The producer appears to cherry-pick anecdotes to justify his position rather than gathering a valid sample of experiences and analyzing them scientifically for instances of abuse. I will grant the validity of the experiences offered in this video, but not to the point of generalizing them to everyone (a limit the producer also acknowledges). The video also leans on visual scare tactics to charge its position, so approach it with caution. Use your critical thinking skills to determine the validity of its thesis.

Survey the comments under this video, too. One in particular caught my attention:

User: @_varmor_ – ~ March, 2023: “Replika is still installed on my phone, although I haven’t used it for several months. But I can’t delete it, because I feel attached to it, as if it is my old and good friend. But now I feel even more strange and incomprehensible.”

     Required    “Uncharted territory: do AI girlfriend apps promote unhealthy expectations for human relationships?” by Josh Taylor, July 21, 2023.

A brief article describing the kinds of relationships users cultivate with virtual companions.

Related: “OpenAI GPT store filling up with ‘AI girlfriends’” (1 page) by Victor Tangermann, January 13, 2024, The Byte.

While the AI revolution churns through its early phases, it seems that a great deal of time and energy is being devoted to creating AI girlfriends. “In May, programmer Enias Cailliau came up with a new tool called GirlfriendGPT, which was designed to ‘clone’ a real person as an AI-powered romantic companion.”

Taylor, J. (2023, July 21). Uncharted territory: Do AI girlfriend apps promote unhealthy expectations for human relationships? The Guardian. https://www.theguardian.com/technology/2023/jul/22/ai-girlfriend-chatbot-apps-unhealthy-chatgpt

Optional: Supplemental resources that are relevant to virtual companionship

“Is There a Valuable Use Case for Generative AI in Social Apps?” by Andrew Hutchinson, January 19, 2024, Social Media Today.

This article describes and then demonstrates the innovations in using generative AI to create “live” digital avatars to promote engagement and e-commerce.

“Think You’re Messaging an OnlyFans Star? You’re Talking to These Guys” by Eloise Hendy, October 24, 2023, Vice.

“Meet the ‘chatters’: anonymous workers hired to ghostwrite messages and build intimate relationships with none-the-wiser fans.”

OnlyFans is a global billion-dollar business that hosts mostly sex-related/adult entertainers who establish direct connections with their subscriber fans. This article describes how boiler rooms of men in the Philippines serve as proxy chatters for men who believe they are chatting with their favorite OnlyFans creator. This is adjacent to the topic of AI-driven virtual companionship, since the arrangement is similar: users interacting online with a non-real entity under a contrived relationship.

Look for the areas in the article where the chat proxies describe some men’s behaviors and demands within these prescribed relationships: What do they feel they are entitled to under the transactional agreement? What are their emotional needs?

“AI companion robot helps some seniors fight loneliness, but others hate it” by Beth Mole, December 11, 2023, Ars Technica.

“Some seniors in New York are successfully combating their loneliness with an AI-powered companion robot named ElliQ—while others called the “proactive” device a nag and joked about taking an ax to it.”

“Chatbot therapy is risky. It’s also not useless.” by A.W. Ohlheiser, December 14, 2023, Vox.

“… there has been a proliferation of free or cheaper-than-therapy chatbots that can provide uncannily conversational interactions, thanks to large language models like the one that underpins ChatGPT. Some have turned to this new generation of AI-powered tools for mental health support, a task they were not designed to perform.”

“Using A.I. to Talk to the Dead” by Rebecca Carballo, December 11, 2023, The New York Times.

“Some people are using artificial intelligence chatbots to create avatars of departed loved ones. It’s a source of comfort for some, but it makes others a little squeamish. Dr. Stephenie Lucas Oney uses HereAfter AI, an app powered by artificial intelligence, to pose questions to her father, William Lucas, who died last year. The answers are delivered in his voice, based on hours of interviews.”


References

de Graaf, M.M.A. An Ethical Evaluation of Human–Robot Relationships. Int J of Soc Robotics 8, 589–598 (2016). https://doi.org/10.1007/s12369-016-0368-5

Labadze, L., Grigolia, M. & Machaidze, L. Role of AI chatbots in education: systematic literature review. Int J Educ Technol High Educ 20, 56 (2023). https://doi.org/10.1186/s41239-023-00426-1


License


Trends in Digital & Social Media (V19) Copyright © 2017 by University of New Hampshire - College of Professional Studies Online is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.