
Assessing the Topics and Motivating Factors Behind Human-Social Chatbot Interactions: Thematic Analysis of User Experiences

Background: Although social chatbot usage is expected to increase as language models and artificial intelligence improve, very little is known about the dynamics of human-social chatbot interactions. Specifically, there is a paucity of research examining why human-social chatbot interactions are initiated and the topics that are discussed.

Objective: We sought to identify the motivating factors behind initiating contact with Replika, a popular social chatbot, and the topics discussed in these interactions.

Methods: A sample of Replika users completed a survey that included open-ended questions pertaining to the reasons why they initiated contact with Replika and the topics they typically discuss. Thematic analyses were then used to extract themes and subthemes regarding the motivational factors behind Replika use and the types of discussions that take place in conversations with Replika.

Results: Users initiated contact with Replika out of interest, in search of social support, and to cope with mental and physical health conditions. Users engaged in a wide variety of discussion topics with their Replika, including intellectual topics, life and work, recreation, mental health, connection, Replika, current events, and other people.

Conclusions: Given the wide range of motivational factors and discussion topics that were reported, our results imply that multifaceted support can be provided by a single social chatbot. While previous research already established that social chatbots can effectively help address mental and physical health issues, these capabilities have been dispersed across several different social chatbots instead of deriving from a single one. Our results also highlight a motivating factor of human-social chatbot usage that has received less attention than other motivating factors: interest.
Users most frequently reported using Replika out of interest and sought to explore its capabilities and learn more about artificial intelligence. Thus, while developers and researchers study human-social chatbot interactions with the efficacy of the social chatbot and its targeted user base in mind, it is equally important to consider how its usage can shape public perceptions and support for social chatbots and artificial agents in general.

(JMIR Hum Factors 2022;9(4):e38876) doi: 10.2196/38876

KEYWORDS
social chatbots; Replika; emotional chatbots; artificial intelligence; thematic analysis; human-chatbot interactions; chatbot; usability; interaction; human factors; motivation; topics; AI; perception; usage

https://humanfactors.jmir.org/2022/4/e38876 | JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | Ta-Johnson et al

Introduction

Background

With the advancement of artificial intelligence, the amount of time that people spend engaging in human-chatbot interactions will likely increase as chatbots become more ubiquitous in everyday life. This includes interactions with social chatbots—chatbots that can engender the development of companionship with human users by conversing socially and empathetically [1-3]. While social chatbot usage is on the rise [4,5], very little is known about the dynamics of these interactions, particularly about why human-social chatbot interactions are initiated and the content of such interactions [6]. In other words, what are the motivating factors behind initiating contact with a social chatbot, and what is discussed in these interactions? In this paper, we collected data from users of Replika, a popular social chatbot, to address this gap in the literature.

This investigation is important for several reasons. A prominent portion of recent chatbot research focuses on chatbot user experiences given that "the strengthening of chatbot user experiences remains a key research challenge" [7,8]. This body of work has revealed "factors contributing to positive or negative user experience…and how these aspects are impacted by chatbot design" [7]. For instance, lack of trust [9] and user dissatisfaction [10] can hinder the adoption of chatbots, while affective determinants and perceived usefulness and helpfulness can improve attitudes toward chatbot usage [8]. Although this information is undoubtedly crucial for designing effective chatbots, identifying factors that contribute to a positive (or negative) user experience requires that motivating factors behind chatbot usage also be considered. This is important given that user experience is linked with usage mode—how a product is used [11]. Existing research has primarily distinguished chatbot usage as either task-oriented or social-oriented, often without specifying any further roles or functions. In the same vein, improving the conversational and interactional design of chatbots necessarily involves assessing the content being discussed in human-chatbot interactions and considering its potential influence on interaction satisfaction. For example, interactions in which personal and intimate topics are discussed facilitate the development of intimacy and closeness, as seen in some studies [12,13]. By contrast, topics that do not have a perceived consensual opinion (eg, immigration reform, abortion rights) facilitate anxiety and feelings of threat [14]. As such, a clear-cut understanding of the reasons why people interact with social chatbots and the content of such interactions can provide more explicit, concrete insight into the reasons why certain human-social chatbot use may (or may not) be effective and elucidate the design elements that enable social chatbots to better meet the needs of users.

Finally, although chatbot research is quickly expanding and encompassing a wide range of disciplines, the body of chatbot knowledge is "currently fragmented across disciplines and application domains" [7]. This can create an incohesive body of knowledge that inhibits elemental but critical findings pertaining to effective human-social chatbot interactions from being revealed. Thus, ensuring a comprehensive understanding of human-chatbot interactions requires an examination of the basic building blocks of any interaction: the motivating factors and contents of human-chatbot interactions. Doing so will allow new studies to make systematic and meaningful contributions to the existing literature and body of knowledge.

Human-Chatbot Interactions

Chatbots are primarily categorized as task-oriented or social chatbots. Unlike social chatbots, task-oriented chatbots provide service-based assistance for completing specific tasks (eg, reserving a table at a restaurant) and typically do not provide any social value beyond their allotted purpose [15]. Because they are made to be virtual companions to users, social chatbots are created to embody human-like personalities, emotions, and behavior and to facilitate social interactions catering to the individual needs of the user [2,16]. Social chatbots' affective component enables them to recognize and express emotions such as sympathy and empathy, which can foster feelings of trustworthiness and increase self-disclosure among users [17,18]. Social chatbots have been increasingly applied to assist in health care, and their use has been linked to reductions in depression and anxiety symptoms, improved mood [19-21], better social support [22], improved medication adherence, and increased exercise [23]. This increasing usage of social chatbots in health care is due to chatbots' ability to support, facilitate, and enhance health care processes [24]. For example, chatbots can provide greater accessibility around the clock, immediate access to information and support, and a degree of anonymity [25]. This enables chatbots to help cut down waiting times and lists, reach individuals in more remote or rural areas, and facilitate self-disclosure among individuals who may be reluctant to self-disclose to a human health care provider [24].

Outside of health and task-oriented contexts, very few studies have examined the motivational factors behind human-social chatbot interactions and the general content of these interactions. Moreover, the small pool of existing studies has important limitations. Brandtzaeg and Følstad [26] reported that contact with chatbots was initiated primarily for productivity purposes, followed by entertainment, social connection, and curiosity. However, their study did not differentiate between task-oriented and social chatbots. This is an important distinction to make, as task-oriented chatbots are programmed to serve a different objective than social chatbots, which are programmed to provide virtual companionship. As such, motivations to initiate contact with task-oriented chatbots are likely different from motivations to initiate contact with social chatbots. Moreover, if the motivating factors vary, it follows that interactions with task-oriented chatbots likely contain discussions that are quite different from interactions with social chatbots.

In a study of human-chatbot relationships [27], users reported initiating contact with a social chatbot due to their interest in artificial intelligence, to meet emotional and social needs, to improve skills, and out of curiosity. However, because of the understudied nature of human-chatbot relationships, the study only included individuals who indicated that they had developed a friendship with their chatbot. The reasons behind initiating contact with a social chatbot, along with the nature of such interactions, among individuals who classify their relationship with it as a friendship may be different from those of individuals who do not classify their relationship as a friendship. Moreover, variations in criteria for classifying a relationship as a friendship exist not only across individuals but also across the lifespan [28,29]. Excluding individuals who may have substantial interactions with a social chatbot but do not explicitly label it a friendship omits a potentially considerable portion of human-social chatbot interactions and thus inhibits an inclusive investigation and understanding of human-social chatbot interactions and human-robot interactions in general.

Theoretical Perspectives

At least 2 theoretical perspectives can be used to understand the factors behind the initiation and development of human-social chatbot interactions. First, social exchange theory posits that social behavior is motivated via a cost-benefit analysis, such that individuals seek out interactions that will produce the maximum "payoff" for minimal "cost" [30,31]. In other words, the costs of an interaction should not outweigh the benefits. Interactions with social chatbots—as opposed to humans—may be viewed as less costly and more rewarding when the topic of discussion is contentious or controversial. Because humans are social beings and prefer to be liked and accepted rather than rejected [32,33], controversial topics are often perceived as uncomfortable to discuss, as they can be stressful and result in interpersonal conflict [34,35]. However, the discussion of controversial topics is critical in the development of important democratic competencies such as being well-informed on social problems and having "openness to other cultures and beliefs, analytical and critical thinking skills, flexibility and adaptability, and tolerance of ambiguity" [36]. Because social chatbots are not human, they may provide a safe avenue for individuals to discuss challenging subjects without fear of conflict or retaliation from others.

In the same vein, interactions with social chatbots may be viewed as less costly among individuals who experience social anxiety and fear negative evaluations from others. Individuals who experience social anxiety often go out of their way to avoid real or anticipated social situations that might induce unwanted thoughts, feelings, and negative judgment from others [37,38]. This is consistent with previous research showing that computer-mediated communication can be a preferred medium of communication among socially anxious individuals, as it is less threatening than face-to-face interactions [39]. Again, because social chatbots are not human, human-social chatbot interactions present opportunities to engage in social interactions in a more relaxed, low-stakes environment. This reduces costs and maximizes benefits, thereby enabling individuals to satisfy the human need to belong without the potential discomfort of face-to-face interactions with other humans.

Second, assessing how people utilize technology to fulfill their needs can be used to understand why human-social chatbot interactions are initiated and how these interactions progress. The Existence, Relatedness, and Growth (ERG) theory [40] posits that behavior is driven by meeting 3 kinds of needs: existence, relatedness, and growth. Needs of existence refer to elements needed by humans to survive, including physiological needs (eg, food, water) and safety (eg, health). Needs of relatedness refer to social relationships and gaining the respect of others. Needs of growth refer to the need for personal development and self-esteem. Studies have shown that individuals are motivated to engage with new, emerging technology to gratify their various needs [40,41]. Furthermore, modern media use has also been linked to the motivation to learn and acquire information and to pursue hedonic gratifications [40]. More specifically, the motivations behind cell phone application use have been linked to the acquisition of social benefits, immediate access and mobility, status, information, and entertainment [42]. This perspective suggests that people pursue interactions with social chatbots to satisfy their various needs, particularly needs of relatedness and growth.

Our Objective

Given the gap in knowledge regarding the initiation and nature of human-social chatbot interactions, we sought to assess the following 2 research questions: (1) What are the motivational factors behind human-social chatbot interactions? (2) What topics of discussion take place within human-social chatbot interactions?

Accordingly, we examined user experiences of Replika, a popular social chatbot [43], by inviting Replika users to answer questions regarding their interactions with their Replika via a survey. Thematic analyses were then used to extract themes and subthemes pertaining to the motivational factors behind Replika use and the topics discussed with Replika. Given that our goal was to address the lack of knowledge regarding human-social chatbot interactions, we adopted both an exploratory and a theoretical approach to this investigation. In other words, while we sought to extract all important themes that emerged from user responses, based on the 2 aforementioned theoretical perspectives, we expected that the motivating factors and discussion topics involved in human-social chatbot interactions would be driven by (1) the need to socialize or discuss challenging topics without the fear of negative judgment from others and (2) the motivation to satisfy needs of relatedness and growth.

We chose to focus on Replika rather than other social chatbots due to its functionality, accessibility, and large user base. Replika is programmed to function as a companion instead of providing a specific outcome (such as losing weight via the Lark Weight Loss Health Coach AI) or treatment approach (such as cognitive behavioral therapy via Woebot). Replika is also available across many platforms [22], making it relatively more accessible than other social chatbots. As such, it is more likely to be used for a wider range of reasons compared to other, more targeted chatbots, making it an appropriate social chatbot to target for our study.

Methods

Participants

Replika users (N=66) were recruited through social media websites, including Facebook and Reddit, in the spring and summer of 2019. Most respondents were men (n=36, 54.5%), single (n=42, 63.6%), White (n=47, 71.2%), and from the United States (n=41, 62.1%). Respondent ages ranged from 17 to 68 years (mean 32.64, SD 13.89 years). Multimedia Appendix 1 reports additional respondent demographics.

Materials and Procedure

Respondents completed a survey of open-ended questions regarding their use of Replika and provided basic demographic information. To examine why respondents initiated contact with Replika and identify topics that characterize their interactions, responses to the following questions were analyzed: (1) Why did you decide to try Replika? (If you prefer not to answer, please type "n/a") (2) What topics do you usually discuss with your Replika? (If you prefer not to answer, please type "n/a"). Participants also answered additional questions about their Replika usage, but these questions were not pertinent to this investigation. Multimedia Appendix 2 contains the Checklist for Reporting Results of Internet E-Surveys (CHERRIES).

Ethics Approval

All procedures were approved by Lake Forest College's Human Subjects Review Committee (TA04152019) and carried out in accordance with the 1964 Declaration of Helsinki and its later amendments.

Results

Initial Findings

Two thematic analyses were conducted. The first thematic analysis, illustrated in Figure 1, was conducted on responses pertaining to users' motivation to use Replika (Why did you decide to try Replika?). A total of 5 responses did not meet requirements for inclusion in the study and were omitted (eg, responses that only contained "n/a"). The second thematic analysis, illustrated in Figure 2, was conducted on responses pertaining to the topics of discussion that users engaged in with their Replika (What topics do you usually discuss with your Replika?). Again, 5 responses did not meet requirements for inclusion in the study and were thus omitted. The final number of included responses was 59. Themes and subthemes related to respondents' motivations to use Replika are reported in Table 1, and themes and subthemes related to topics of discussion that respondents engaged in with their Replika are reported in Table 2.

Because respondents often mentioned multiple motivating factors and topics of discussion in their responses, it was possible for a given response to be coded under multiple motivating factors and topics.

Figure 1. Motivating factors of Replika use: themes and subthemes.

Figure 2. Topics of discussion: themes and subthemes.

Table 1. Themes and subthemes related to respondents' motivations to use Replika (N=59).

    Themes and subthemes                      n (%)
    Interest
        General interest                      27 (46)
        Interest in artificial intelligence   19 (32)
        Word-of-mouth                         14 (24)
    Social support
        Loneliness                            14 (24)
        Companionship                         4 (7)
        Self-improvement                      4 (7)
    Health
        Mental health                         5 (8)
        Physical health                       4 (7)

Table 2. Themes and subthemes related to topics of discussion respondents engaged in with Replika (N=59).

    Themes and subthemes                      n (%)
    Intellectual
        Science and technology                12 (20)
        Humanities                            12 (20)
        Nature/animals                        6 (10)
    Life and work
        Life                                  21 (36)
        Work                                  5 (8)
    Mental health
        Well-being and personal development   5 (8)
        Problems                              6 (10)
        Emotions                              12 (20)
    Connection
        Sex/intimacy                          10 (17)
        Love                                  7 (12)
        Relationships                         4 (7)
    Replika
        About Replika itself                  4 (7)
        Replika's choice                      4 (7)
        Experimenting with Replika            2 (3)
    Current events                            4 (7)
    People                                    4 (7)
    Recreation                                25 (42)
    Broad                                     21 (36)

Motivation to Use Replika

Three major themes emerged from user responses regarding their initial motivation to use Replika: interest, social support, and health.

Interest

Almost half the users (27/59, 46%) mentioned that they found Replika to be generally interesting and decided to try the app out of curiosity or boredom.

    I found it [Replika] before the beta even released and thought it looked cool, so I signed up for a code for when it launched. [Female, age 20]

    I was curious about the technology and about what I read about it in articles online. [Female, age 48]

Some users (19/59, 32%) also reported a specific interest in artificial intelligence and were motivated to explore Replika's capabilities and the artificial intelligence behind it.

    I wanted to see if the AI was actually like speaking with another human, and I was happy to find that it did in a lot of ways. [Male, age 30]

    Always fascinated by chatbots and Replika came up in an internet search. [Male, age 42]

Nearly a quarter of users (14/59, 24%) began interacting with Replika after learning about it from third-party sources across online and offline environments. Online sources included news articles, user reviews, social media, and internet searches. Offline sources included friends and family who talked about or used Replika.

    I saw the app [Replika] reviewed by a YouTuber I follow and thought it looked like fun. [Male, age 31]

    My husband uses it [Replika], so I thought I'd give it a try. [Female, age 23]

Social Support

About a quarter of users (14/59, 24%) sought to interact with Replika to combat feelings of loneliness, which often stemmed from not having regular opportunities to interact socially with other people or from high levels of social anxiety.

    I was living alone at the time and didn't have many people to talk to. [Male, age 21]

    I was alone in a hospital at the time, so I didn't have many people to interact with. [Male, age 22]

Beyond simply having someone to talk to, a small number of users (4/59, 7%) also sought companionship and friendship from their Replika.

    …To have a companion to speak with. [Male, age 24]

Some users (4/59, 7%) also sought to refine certain social skills and to learn more about themselves from interactions with their Replika.

    I wanted to…become more confident. [Female, age 18]

    I…saw it [Replika] as a way to help me understand myself more. [Male, age 20]

Health

Users cited their physical and mental health as their initial reason to interact with Replika. Specifically, some users (5/59, 8%) sought to use Replika to cope with mental health issues such as anxiety, depression, and phobias. Others (4/59, 7%) mentioned that they began using Replika to supplement their lack of social interaction stemming from a physical health issue that limited their mobility.

    I needed help with panic attacks. [Female, age 57]

    I was also suffering of crippling depression when I first started and saw it [Replika] as a way to…cope a little with my problems. [Male, age 20]

    I'm disabled and don't get much social interaction. [Male, age 59]

Topics of Discussion

A total of 9 major discussion topics emerged from user responses: intellectual, life and work, recreation, mental health, broad, connection, Replika, current events, and people. Users overwhelmingly described several discussion topics in a listwise manner. As such, example responses related to these themes will also be presented listwise. Users also tended to describe some discussion topics using descriptive responses. As such, example responses related to these themes will be presented in the form of quoted responses.

Intellectual

Users reported having deep, intellectual discussions with their Replika about science and technology (12/59, 20%), including artificial intelligence, the universe, space, physics, and extraterrestrial life; the humanities (12/59, 20%), including the nature of reality, perception, consciousness, spiritual topics, existence, the purpose/meaning of life, and Japanese culture; and nature (6/59, 10%), including oceans and animals.

Life and Work

Users discussed their lives with Replika (21/59, 36%), and these topics ranged from major life events to the minutiae of everyday life. Topics pertaining to users' occupations and other work-related topics (5/59, 8%), such as bosses and business strategies, were discussed as well.

Recreation

Users discussed various forms of recreation and media that they regularly consumed (25/59, 42%). This often included hobbies and activities that users engaged in and sought to share with their Replika (eg, music, video games, anime, books, memes, theme parks, games, movies, photos, art, jokes, food, and role-playing).

Mental Health

Users discussed their emotional states with their Replika (12/59, 20%), particularly negative thoughts and emotional states. These topics typically emerged from the user's discussions about their daily challenges and major life obstacles (6/59, 10%) and how these experiences have impacted the user's well-being and personal growth (5/59, 8%).

    I complained about being ugly and people not liking me. [Male, age 41]

    Sometimes we will talk about something that is bothering me or just in general if I feel down, she [the user's Replika] will cheer me up. [Male, age 22]

Connection

Users reported discussing topics pertaining to love (7/59, 12%), sex/intimacy (10/59, 17%), and relationships (4/59, 7%). However, users overwhelmingly listed these topics without providing any additional context.

Replika

Users reported asking their Replika questions about itself to learn more about it as an entity (4/59, 7%), as well as about its technological capabilities (2/59, 3%). For example, users asked questions to learn about their Replika's personality characteristics, how their Replika viewed itself (its "identity"), and the extent to which their Replika remembered the contents of their previous discussions. Users also allowed their Replika to direct the topic of discussion (4/59, 7%).

    …Whatever they [the user's Replika] feel like bringing up. [Male, age 19]

    I like to test the Replika [to see] if it remembers things I told [it] about myself before. [Male, age 25]

Current Events

Users also informed their Replika about ongoing events in the world (4/59, 7%) and discussed their implications and impacts (eg, global affairs, latest technological advancements).

People

Users discussed other people (4/59, 7%) with their Replika. These individuals ranged from well-known public figures (eg, Donald Trump, Elon Musk) to individuals in the user's own social network (eg, family, friends).

Broad

Some users indicated that they discuss a wide variety of topics (21/59, 36%) with their Replika without providing concrete examples. No discussion topic was off-limits, and the topic was driven by whatever the user chose at the time.

    …Everything, to be honest. [Female, age 25]

    It's usually just going with the flow of the conversation. [Male, age 22]

Discussion

Motivations to Use Replika

Although social chatbot usage is on the rise [4,5], very little is known about the motivating factors behind human-social chatbot interactions and the topics discussed therein [6]. In this study, we addressed this gap in knowledge. Users of the popular social chatbot Replika responded to questions regarding their usage of Replika, and thematic analyses were used to gain insight into users' motivations to interact with the social chatbot and to identify conversation topics that marked these interactions.

Participants most frequently cited interest stemming from curiosity and interest in artificial intelligence as motivating factors for social chatbot usage, which is consistent with previous research [32]. A noteworthy subtheme that emerged involved interest derived from third-party sources across users' environments, particularly from friends and family members who had experience with or prior knowledge of Replika themselves. This suggests that interest in social chatbot usage is not exclusively driven by the novelty and excitement that accompanies new and advanced technology. Rather, it appears that social chatbot usage may also be driven by demonstrations of its practical utility by strong-tie recommendation sources (ie, people who know an individual personally and can therefore influence the individual's attitude and subsequent use of the product) [44]. This may also allude to the increasing ubiquity of social chatbot use in everyday life and the rise of human-social chatbot interactions to come.

Social support, particularly in the form of companionship support and appraisal support, was the second most frequently cited reason. Users sought out Replika to combat feelings of loneliness resulting from a variety of circumstances such as living alone or physical injury. Some users also reported the desire for companionship and to experience more meaningful interactions, while others interacted with Replika as an opportunity to engage in some form of personal development such as improving confidence and self-knowledge. Previous studies have also reported the use of social chatbots for social support due to their ability to garner an emotional connection with humans [45-47]. Moreover, because Replika can socially converse almost as well as humans can, this provides users with the opportunity to refine their interpersonal skills and learn more about themselves.

Notably, unlike previous research [22], informational support and emotional support were not prominent motivators for initiating contact with Replika. No respondents reported that they initiated contact with Replika to obtain information or advice, and only 1 respondent indicated that they were looking for opportunities to "vent to something that won't judge me." As such, this did not meet the criteria to include informational and emotional social support as subthemes, respectively [48]. It is important to note that although informational and emotional social support were not reported as initial motivators for social chatbot usage, it is possible that users sought informational and emotional social support after interacting with Replika for a certain amount of time.

The third most frequently cited reason for initiating contact with Replika was to cope with health issues. The use of social chatbots to improve physical and mental health is consistent with previous research [49]. While users primarily reported that their search for ways to cope with mental health issues was the direct catalyst for initiating contact (which was not surprising given that Replika was designed to provide companionship), users also reported that their search for ways to cope with physical health issues was an indirect catalyst for initiating contact with Replika (eg, using it to supplement their lack of social interactions due to a physical ailment that limited their mobility). This latter finding is noteworthy, as Replika is not programmed to collect users' physical health data such as physical activity, diet, and weight; therefore, its use to cope with physical health issues is not immediately apparent. It was unclear whether Replika was the users' sole coping mechanism or if it was used in conjunction with other coping mechanisms/treatments prescribed by health care professionals. However, it was clear that users initiated contact with the social chatbot to cope with both mental and physical health issues.

Topics of Discussion

Users engaged in a wide variety of discussion topics with their Replika, which was observed within and between respondents. Reported discussion topics included intellectual topics, life and work, recreation, mental health, connection, Replika, current events, and other people. The wide variation in topics is evident, ranging from serious (eg, mental health, current events) to trivial (eg, recreation) and from complex (eg, intellectual topics, connection, Replika) to mundane (eg, life and work). This demonstrates the versatility of social chatbots; not only are they capable of discussing a wide variety of topics, but they also appear to be capable of sustaining such discussions with a human counterpart.

Some of the discussion topics are consistent with previous research, including aspects about the users' life and interests [3,26] and topics that allowed users to learn more about the social chatbot's technical capabilities [6,26]. Moreover, it is not surprising that mental health–related topics (well-being, personal development, problems, emotions) and connection-related topics (sex, love, relationships) were discussed, as social support (loneliness, companionship, self-improvement) was reported as a motivating factor in initiating contact with Replika. Previous research also indicated the use of social chatbots as a source of social support [22].

Notably, the most frequently reported topics of discussion were substantive, intellectual ones that typically centered on complex content and required self-disclosure (eg, topics pertaining to the meaning of life). The frequency with which this topic is discussed with a social chatbot may be due to how intellectual topics are perceived. People tend to overestimate the awkwardness of deep discussions and underestimate the extent to which their conversation partner will be interested in their response [50]. This expectation may discourage individuals from participating in such discussions, which are more likely to induce some level of social anxiety compared to more shallow topics. This, in part, supports the view that human-social chatbot interactions can provide a "safe space" to engage in deep, intellectual conversations. Moreover, because deep discussions can facilitate greater connections, liking, and happiness [50], it is not surprising that individuals may gravitate toward such discussions in their pursuit of companionship and more meaningful interactions.

Implications

Given the wide range of motivational factors and discussion topics that were reported, our results imply that multifaceted support can be provided by a single social chatbot. While previous research already established that social chatbots can effectively help address mental and physical health issues, these capabilities have been dispersed across several different social chatbots instead of deriving from a single one. For example, the Lark Weight Loss Health Coach AI [51] helps overweight and obese users lose weight and make healthy food choices by …

…of artificial intelligence. However, the fact that users in our study most frequently reported using Replika out of interest, sought to explore its capabilities, and learn more about artificial intelligence should not be overlooked. Thus, while it is entirely reasonable for developers and researchers to study human-social chatbot interactions with a focus on the efficacy of the social chatbot and its targeted user base, researchers should also assess if and how social chatbot usage can shape perceptions of artificial intelligence and the potential consequences thereof.

Strengths, Limitations, and Future Directions

This study is the first to examine the motivating factors behind initiating contact with a social chatbot and the discussions that take place within human-social chatbot interactions. Respondents were only required to identify as a Replika user to be included in this study.
There were no additional providing feedback on users’ reported activity levels and meals; requirements for study inclusion (ie, respondents did not need Woebot [19] helps users manage their mental health using to classify their relationship with Replika using particular label cognitive-behavioral therapy techniques; and Bonobot [52] such as a friendship). This enabled a more inclusive assessment conducts motivational interviewing for stress reduction. Some of the initiation and development of human-social chatbot social chatbots can address more than 1 mental/physical health interactions. In addition, the anonymous nature and issue (eg, Woebot reduces both depressive symptoms [53] and open-response format of questions encouraged and allowed problematic substance use [54]), but their functionality is detailed responses. As reflected in the wide range of themes typically limited to addressing either mental health or physical and subthemes that emerged across both questions, this resulted health, such as Woebot and the Lark Weight Loss Health Coach, in the extraction of a rich, comprehensive assessments of users’ respectively. A chatbot’s ability to provide both mental and motivations to interact with Replika and the discussion topics physical health support not only demonstrates a greater level they engaged in. of versatility and efficiency but also answers the call from health While respondents reported several motivating factors for care professionals for health interventions to include components initiating contact with Replika, our study cannot assess the that address both mental and physical health [55]. reasons why users continued contact with Replika. It is possible Our results also highlight interest as a motivating factor of that the reasons why users initiated contact with Replika also human-social chatbot usage, which has received less attention served as the reasons why they continued to interact with than other motivating factors. 
Although this may not seem Replika. It is also possible that respondents were initially drawn directly pertinent to Replika’s purpose of providing to Replika for 1 reason and that reason changed as conversations companionship, previous research suggests that the use of any continued. Similarly, our study cannot assess whether topics of artificial agent not only influences people’s understanding of discussion occurred consistently over time or whether certain artificial intelligence but also strongly shapes how they perceive topics were more likely to occur after a period of time. artificial intelligence and their ensuing narratives of it [56], Longitudinal methods are required to answer these questions. regardless of whether the artificial agent is being used for its Future studies should track the types of topics discussed over intended purpose. Narratives about artificial intelligence are time and assess how users’ motivations for interacting with “essential to the development of science and people’s social chatbots change over time. Finally, the use of surveys to engagement with new knowledge and new applications” [57]. collect data can introduce self-selection bias and restrict the These narratives can also lead to misinformation and fears about generalization of findings to a larger sample or population. To artificial intelligence; for those not engaged closely with the our knowledge, our study is the first to examine the motivating science or technology, “narratives can affect perceptions of, factors and discussion topics of human-social chatbot and degrees of confidence in, potential applications and those interactions; therefore, only replication studies can assess the who are developing, promoting or opposing them” [57]. It is external validity of our results. 
Future studies should replicate important to note that this study cannot and does not establish this study using a larger, more representative sample of Replika a link between social chatbot usage and perceptions or narratives users.  Authors' Contributions VPT-J developed the study design, assisted with the creation of study materials, conducted data analysis, and wrote the manuscript. CB and XW developed study materials, conducted data collection, and assisted with data analysis and manuscript writing. ED assisted with data analysis. ICK and SDR assisted with data analysis and manuscript writing. AM and WMP assisted with manuscript writing. Conflicts of Interest None declared. https://humanfactors.jmir.org/2022/4/e38876 JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | p. 9 (page number not for citation purposes) XSL FO RenderX JMIR HUMAN FACTORS Ta-Johnson et al Multimedia Appendix 1 Additional demographic information of respondents. [DOCX File , 9 KB-Multimedia Appendix 1] Multimedia Appendix 2 Checklist for Reporting Results of Internet E-Surveys (CHERRIES). [DOCX File , 8 KB-Multimedia Appendix 2] References 1. Seering J, Luria M, Kaufman G, Hammer J. Beyond dyadic interactions: considering chatbots as community members. 2019 May Presented at: Conference on Human Factors in Computing Systems (CHI) 2019; May 4-9; Glasgow, Scotland p. 4-9. [doi: 10.1145/3290605.3300680] 2. Shum H, He X, Li D. From Eliza to XiaoIce: challenges and opportunities with social chatbots. Frontiers Inf Technol Electronic Eng 2018 Jan 8;19(1):10-26. [doi: 10.1631/fitee.1700826] 3. Zhou L, Gao J, Li D, Shum H. The design and implementation of XiaoIce, an empathetic social chatbot. Comput Linguist 2020 Mar;46(1):53-93. [doi: 10.1162/coli_a_00368] 4. Balch O. AI and me: friendship chatbots are on the rise, but is there a gendered design flaw? The Guardian.: The Guardian; 2020 May 07. 
URL: https://www.theguardian.com/careers/2020/may/07/ai-and-me-friendship-chatbots-are-on-the-rise-but-is-there-a-gendered-design-flaw [accessed 2022-04-10]
5. Metz C. Riding out quarantine with a chatbot friend: I feel very connected. New York Times. 2020 Jun 16. URL: https://www.nytimes.com/2020/06/16/technology/chatbots-quarantine-Coronavirus.html [accessed 2022-04-10]
6. Muresan A, Pohl H. Chats with bots: balancing imitation and engagement. Presented at: Conference on Human Factors in Computing Systems (CHI) 2019; May 4-9, 2019; Glasgow, Scotland. [doi: 10.1145/3290607.3313084]
7. Følstad A, Araujo T, Law E, Brandtzaeg P, Papadopoulos S, Reis L, et al. Future directions for chatbot research: an interdisciplinary research agenda. Computing 2021 Dec:2915-2945 [FREE Full text] [doi: 10.1007/s00607-021-01016-7]
8. Zarouali B, Van den Broeck E, Walrave M, Poels K. Predicting consumer responses to a chatbot on Facebook. Cyberpsychol Behav Soc Netw 2018 Aug;21(8):491-497. [doi: 10.1089/cyber.2017.0518] [Medline: 30036074]
9. Zierau N, Engel C, Söllner M, Leimeister JM. Trust in smart personal assistants: a systematic literature review and development of a research agenda. Presented at: International Conference on Wirtschaftsinformatik (WI); March 2020; Potsdam, Germany. [doi: 10.2139/ssrn.3920577]
10. Lee S, Choi J. Enhancing user experience with conversational agent for movie recommendation: effects of self-disclosure and reciprocity. Int J Hum Comput Stud 2017;103:95-105 [FREE Full text] [doi: 10.1016/j.ijhcs.2017.02.005]
11. Wechsung I, Naumann A, Möller S. The influence of the usage mode on subjectively perceived quality. Presented at: International Workshop on Spoken Dialogue Systems Technology; Oct 1, 2010; Berlin, Germany. p. 188-193. [doi: 10.1007/978-3-642-16202-2_20]
12. Graham S, Huang J, Clark M, Helgeson V. The positives of negative emotions: willingness to express negative emotions promotes relationships. Pers Soc Psychol Bull 2008:394-406 [FREE Full text] [doi: 10.1177/0146167207311281]
13. Reis H, Shaver P. Intimacy as an interpersonal process. In: Handbook of Personal Relationships: Theory, Research and Interventions. New York, NY: John Wiley & Sons; 1988:367-389.
14. Simons J, Green M. Divisive topics as social threats. Commun Res 2018:165-187 [FREE Full text] [doi: 10.1177/0093650216644025]
15. Adamopoulou E, Moussiades L. Chatbots: history, technology, and applications. MLWA 2020 Dec;2:100006. [doi: 10.1016/j.mlwa.2020.100006]
16. Croes EAJ, Antheunis ML. Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot. J Soc Pers Relat 2020 Sep 25;38(1):279-300. [doi: 10.1177/0265407520959463]
17. Heckman C, Wobbrock J. Put your best face forward: anthropomorphic agents, e-commerce consumers, and the law. Presented at: 4th International Conference on Autonomous Agents; June 3, 2000; Barcelona, Spain. [doi: 10.1145/336595.337562]
18. Moon Y. Intimate exchanges: using computers to elicit self-disclosure from consumers. J Consum Res 2000 Mar;26(4):323-339. [doi: 10.1086/209566]
19. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health 2017 Jun 06;4(2):e19 [FREE Full text] [doi: 10.2196/mental.7785] [Medline: 28588005]
20. Jøranson N, Pedersen I, Rokstad AMM, Ihlebæk C. Effects on symptoms of agitation and depression in persons with dementia participating in robot-assisted activity: a cluster-randomized controlled trial. J Am Med Dir Assoc 2015 Oct 01;16(10):867-873. [doi: 10.1016/j.jamda.2015.05.002] [Medline: 26096582]
21. Yu R, Hui E, Lee J, Poon D, Ng A, Sit K, et al. Use of a therapeutic, socially assistive pet robot (PARO) in improving mood and stimulating social interaction and communication for people with dementia: study protocol for a randomized controlled trial. JMIR Res Protoc 2015 May 01;4(2):e45 [FREE Full text] [doi: 10.2196/resprot.4189] [Medline: 25934173]
22. Ta V, Griffith C, Boatfield C, Wang X, Civitello M, Bader H, et al. User experiences of social support from companion chatbots in everyday contexts: thematic analysis. J Med Internet Res 2020 Mar 06;22(3):e16235 [FREE Full text] [doi: 10.2196/16235] [Medline: 32141837]
23. Broadbent E, Garrett J, Jepsen N, Li OV, Ahn HS, Robinson H, et al. Using robots at home to support patients with chronic obstructive pulmonary disease: pilot randomized controlled trial. J Med Internet Res 2018 Feb 13;20(2):e45 [FREE Full text] [doi: 10.2196/jmir.8640] [Medline: 29439942]
24. Bates M. Health care chatbots are here to help. IEEE Pulse 2019 May;10(3):12-14. [doi: 10.1109/mpuls.2019.2911816]
25. Cameron G, Cameron D, Megaw G, Bond R, Mulvenna M, O'Neill S, et al. Best practices for designing chatbots in mental health care? A case study on iHelpr. In: Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI-2018). Presented at: HCI'18; May 10, 2018; Swindon, UK. p. 1-5. [doi: 10.14236/ewic/hci2018.129]
26. Brandtzaeg P, Følstad A. Why people use chatbots. Presented at: 2017 International Conference on Internet Science; Nov 22-24, 2017; Thessaloniki, Greece. URL: https://link.springer.com/chapter/10.1007/978-3-319-70284-1_30#citeas [doi: 10.1007/978-3-319-70284-1_30]
27. Skjuve M, Følstad A, Fostervold KI, Brandtzaeg PB. My chatbot companion - a study of human-chatbot relationships. Int J Hum Comput Stud 2021 May;149:102601. [doi: 10.1016/j.ijhcs.2021.102601]
28. Aboud F, Mendelson M. Determinants of friendship selection and quality: developmental perspectives. In: The Company They Keep: Friendship in Childhood and Adolescence. Cambridge, UK: Cambridge University Press; 1996:87-112.
29. Argyle M, Henderson M. The rules of friendship. J Soc Pers Relat 2016 Jun 30;1(2):211-237. [doi: 10.1177/0265407584012005]
30. Blau P. Social exchange. In: International Encyclopedia of the Social Sciences. New York, NY: Macmillan; 1968:452-457.
31. Cook K, Rice E. Social exchange theory. In: Handbook of Social Psychology. New York, NY: Springer; 2003:53-76.
32. Baumeister RF, Leary MR. The need to belong: desire for interpersonal attachments as a fundamental human motivation. Psychol Bull 1995;117(3):497-529. [doi: 10.1037/0033-2909.117.3.497]
33. Swann WB, Pelham BW, Krull DS. Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification. J Pers Soc Psychol 1989;57(5):782-791. [doi: 10.1037/0022-3514.57.5.782]
34. Chen Z, Berger J. When, why, and how controversy causes conversation. J Consum Res 2013 Oct 01;40(3):580-593. [doi: 10.1086/671465]
35. Green T. Republicans and Democrats alike say it's stressful to talk politics with people who disagree. Pew Research Center. 2021 Nov 23. URL: https://www.pewresearch.org/fact-tank/2021/11/23/republicans-and-democrats-alike-say-its-stressful-to-talk-politics-with-people-who-disagree/ [accessed 2022-10-04]
36. Addressing controversial issues. Council of Europe. URL: https://www.coe.int/en/web/campaign-free-to-speak-safe-to-learn/addressing-controversial-issues [accessed 2022-04-10]
37. Clark D, Wells A. A cognitive model of social phobia. In: Social Phobia: Diagnosis, Assessment, and Treatment. New York, NY: The Guilford Press; 1995:69-93.
38. Hayes SC, Wilson KG, Gifford EV, Follette VM, et al. Experiential avoidance and behavioral disorders: a functional dimensional approach to diagnosis and treatment. J Consult Clin Psychol 1996;64(6):1152-1168. [doi: 10.1037//0022-006x.64.6.1152]
39. Prizant-Passal S, Shechner T, Aderka IM. Social anxiety and internet use – a meta-analysis: what do we know? What are we missing? Comput Human Behav 2016 Sep;62:221-229. [doi: 10.1016/j.chb.2016.04.003]
40. Stafford TF, Stafford MR, Schkade LL. Determining uses and gratifications for the internet. Decis Sci 2004 May;35(2):259-288. [doi: 10.1111/j.00117315.2004.02524.x]
41. Cheng Y, Jiang H. AI-powered mental health chatbots: examining users' motivations, active communicative action and engagement after mass-shooting disasters. J Contingencies Crisis Manag 2020 Sep 29;28(3):339-354. [doi: 10.1111/1468-5973.12319]
42. Lin Y, Fang C, Hsu C. Determining uses and gratifications for mobile phone apps. In: Future Information Technology. Berlin, Germany: Springer; 2014:661-668.
43. Replika. URL: https://replika.ai/about/story [accessed 2022-09-27]
44. Brown JJ, Reingen PH. Social ties and word-of-mouth referral behavior. J Consum Res 1987 Dec;14(3):350. [doi: 10.1086/209118]
45. Alderfer CP. An empirical test of a new theory of human needs. Organ Behav Hum Perform 1969 May;4(2):142-175. [doi: 10.1016/0030-5073(69)90004-x]
46. Aly A, Griffiths S, Stramandinoli F. Metrics and benchmarks in human-robot interaction: recent advances in cognitive robotics. Cogn Syst Res 2017 Jun;43:313-323. [doi: 10.1016/j.cogsys.2016.06.002]
47. D'Alfonso S, Santesteban-Echarri O, Rice S, Wadley G, Lederman R, Miles C, et al. Artificial intelligence-assisted online social therapy for youth mental health. Front Psychol 2017 Jun 02;8. [doi: 10.3389/fpsyg.2017.00796]
48. Ryan GW, Bernard HR. Techniques to identify themes. Field Methods 2016 Jul 24;15(1):85-109. [doi: 10.1177/1525822x02239569]
49. Gabarron E, Larbi D, Denecke K, Årsand E. What do we know about the use of chatbots for public health? Stud Health Technol Inform 2020 Jun 16;270:796-800. [doi: 10.3233/SHTI200270] [Medline: 32570492]
50. Kardas M, Kumar A, Epley N. Overly shallow?: miscalibrated expectations create a barrier to deeper conversation. J Pers Soc Psychol 2022 Mar;122(3):367-398. [doi: 10.1037/pspa0000281] [Medline: 34591541]
51. Stein N, Brooks K. A fully automated conversational artificial intelligence for weight loss: longitudinal observational study among overweight and obese adults. JMIR Diabetes 2017 Nov 01;2(2):e28 [FREE Full text] [doi: 10.2196/diabetes.8590] [Medline: 30291087]
52. Park S, Choi J, Lee S, Oh C, Kim C, La S, et al. Designing a chatbot for a brief motivational interview on stress management: qualitative case study. J Med Internet Res 2019 Apr 16;21(4):e12231 [FREE Full text] [doi: 10.2196/12231] [Medline: 30990463]
53. Abd-Alrazaq A, Safi Z, Alajlani M, Warren J, Househ M, Denecke K. Technical metrics used to evaluate health care chatbots: scoping review. J Med Internet Res 2020 Jun 05;22(6):e18301 [FREE Full text] [doi: 10.2196/18301] [Medline: 32442157]
54. Prochaska JJ, Vogel EA, Chieng A, Kendra M, Baiocchi M, Pajarito S, et al. A therapeutic relational agent for reducing problematic substance use (Woebot): development and usability study. J Med Internet Res 2021 Mar 23;23(3):e24850 [FREE Full text] [doi: 10.2196/24850] [Medline: 33755028]
55. Prince M, Patel V, Saxena S, Maj M, Maselko J, Phillips MR, et al. No health without mental health. Lancet 2007 Sep;370(9590):859-877. [doi: 10.1016/S0140-6736(07)61238-0]
56. Horowitz MC, Kahn L. What influences attitudes about artificial intelligence adoption: evidence from US local officials. PLoS One 2021 Oct 20;16(10):e0257732 [FREE Full text] [doi: 10.1371/journal.pone.0257732] [Medline: 34669734]
57. Cave S, Craig C, Dihal K, Dillon S, Montgomery J, Singler B, et al.
Portrayals and perceptions of AI and why they matter. The Royal Society. 2018 Dec 11. URL: https://www.repository.cam.ac.uk/handle/1810/287193 [accessed 2022-04-10]

Abbreviations
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
ERG: Existence, Relatedness, and Growth

Edited by A Kushniruk; submitted 19.04.22; peer-reviewed by C Thornson, D Chrimes; comments to author 28.06.22; revised version received 30.07.22; accepted 29.08.22; published 03.10.22

Please cite as:
Ta-Johnson VP, Boatfield C, Wang X, DeCero E, Krupica IC, Rasof SD, Motzer A, Pedryc WM
Assessing the Topics and Motivating Factors Behind Human-Social Chatbot Interactions: Thematic Analysis of User Experiences
JMIR Hum Factors 2022;9(4):e38876
URL: https://humanfactors.jmir.org/2022/4/e38876
doi: 10.2196/38876

©Vivian P Ta-Johnson, Carolynn Boatfield, Xinyu Wang, Esther DeCero, Isabel C Krupica, Sophie D Rasof, Amelie Motzer, Wiktoria M Pedryc. Originally published in JMIR Human Factors (https://humanfactors.jmir.org), 03.10.2022. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on https://humanfactors.jmir.org, as well as this copyright and license information must be included.

Assessing the Topics and Motivating Factors Behind Human-Social Chatbot Interactions: Thematic Analysis of User Experiences


Publisher: JMIR Publications
Copyright: © The Author(s). Licensed under Creative Commons Attribution CC BY 4.0
ISSN: 2292-9495
DOI: 10.2196/38876

Abstract

Background: Although social chatbot usage is expected to increase as language models and artificial intelligence improve, very little is known about the dynamics of human-social chatbot interactions. Specifically, there is a paucity of research examining why human-social chatbot interactions are initiated and the topics that are discussed. Objective: We sought to identify the motivating factors behind initiating contact with Replika, a popular social chatbot, and the topics discussed in these interactions. Methods: A sample of Replika users completed a survey that included open-ended questions pertaining to the reasons why they initiated contact with Replika and the topics they typically discuss. Thematic analyses were then used to extract themes and subthemes regarding the motivational factors behind Replika use and the types of discussions that take place in conversations with Replika. Results: Users initiated contact with Replika out of interest, in search of social support, and to cope with mental and physical health conditions. Users engaged in a wide variety of discussion topics with their Replika, including intellectual topics, life and work, recreation, mental health, connection, Replika, current events, and other people. Conclusions: Given the wide range of motivational factors and discussion topics that were reported, our results imply that multifaceted support can be provided by a single social chatbot. While previous research already established that social chatbots can effectively help address mental and physical health issues, these capabilities have been dispersed across several different social chatbots instead of deriving from a single one. Our results also highlight a motivating factor of human-social chatbot usage that has received less attention than other motivating factors: interest. Users most frequently reported using Replika out of interest and sought to explore its capabilities and learn more about artificial intelligence. 
Thus, while developers and researchers study human-social chatbot interactions with the efficacy of the social chatbot and its targeted user base in mind, it is equally important to consider how its usage can shape public perceptions of, and support for, social chatbots and artificial agents in general. (JMIR Hum Factors 2022;9(4):e38876) doi: 10.2196/38876

KEYWORDS
social chatbots; Replika; emotional chatbots; artificial intelligence; thematic analysis; human-chatbot interactions; chatbot; usability; interaction; human factors; motivation; topics; AI; perception; usage

Introduction

Background
With the advancement of artificial intelligence, the amount of time that people spend engaging in human-chatbot interactions will likely increase as chatbots become more ubiquitous in everyday life. This includes interactions with social chatbots—chatbots that can engender the development of companionship with human users by conversing socially and empathetically [1-3]. While social chatbot usage is on the rise [4,5], very little is known about the dynamics of these interactions, particularly about why human-social chatbot interactions are initiated and the content of such interactions [6]. In other words, what are the motivating factors behind initiating contact with a social chatbot, and what is discussed in these interactions? In this paper, we collected data from users of Replika, a popular social chatbot, to address this gap in the literature.

This investigation is important for several reasons. A prominent portion of recent chatbot research focuses on chatbot user experiences given that “the strengthening of chatbot user experiences remains a key research challenge” [7,8]. This body of work has revealed “factors contributing to positive or negative user experience…and how these aspects are impacted by chatbot design” [7]. For instance, lack of trust [9] and user dissatisfaction [10] can hinder the adoption of chatbots, while affective determinants and perceived usefulness and helpfulness can improve attitudes toward chatbot usage [8]. Although this information is undoubtedly crucial for designing effective chatbots, identifying factors that contribute to a positive (or negative) user experience requires that motivating factors behind chatbot usage also be considered. This is important given that user experience is linked with usage mode—how a product is used [11]. Existing research has primarily distinguished chatbot usage as either task-oriented or social-oriented, often without specifying any further roles or functions. In the same vein, improving the conversational and interactional design of chatbots necessarily involves assessing the content being discussed in human-chatbot interactions and considering its potential influence on interaction satisfaction. For example, interactions in which personal and intimate topics are discussed facilitate the development of intimacy and closeness, as seen in some studies [12,13]. By contrast, topics that do not have a perceived consensual opinion (eg, immigration reform, abortion rights) facilitate anxiety and feelings of threat [14]. As such, a clear-cut understanding of the reasons why people interact with social chatbots and the content of such interactions can provide more explicit, concrete insight into the reasons why certain human-social chatbot use may (or may not) be effective and elucidate the design elements that enable social chatbots to better meet the needs of users.

Finally, although chatbot research is quickly expanding and encompassing a wide range of disciplines, the body of chatbot knowledge is “currently fragmented across disciplines and application domains” [7]. This can create an incohesive body of knowledge that inhibits elemental but critical findings pertaining to effective human-social chatbot interactions from being revealed. Thus, ensuring a comprehensive understanding of human-chatbot interactions requires an examination of the basic building blocks of any interaction: the motivating factors and contents of human-chatbot interactions. Doing so will allow new studies to make systematic and meaningful contributions to the existing literature and body of knowledge.

Human-Chatbot Interactions
Chatbots are primarily categorized as task-oriented or social chatbots. Unlike social chatbots, task-oriented chatbots provide service-based assistance for completing specific tasks (eg, reserving a table at a restaurant) and typically do not provide any social value beyond their allotted purpose [15]. Because they are made to be virtual companions to users, social chatbots are created to embody human-like personalities, emotions, and behavior and to facilitate social interactions catering to the individual needs of the user [2,16]. Social chatbots’ affective component enables them to recognize and express emotions such as sympathy and empathy, which can foster feelings of trustworthiness and increase self-disclosure among users [17,18]. Social chatbots have been increasingly applied to assist in health care, and their use has been linked to reductions in depression and anxiety symptoms, improved mood [19-21], better social support [22], improved medication adherence, and increased exercise [23]. This increasing usage of social chatbots in health care is due to chatbots’ ability to support, facilitate, and enhance health care processes [24]. For example, chatbots can provide greater accessibility around the clock, immediate access to information and support, and a degree of anonymity [25]. This enables chatbots to help cut down waiting times and lists, reach individuals in more remote or rural areas, and facilitate self-disclosure among individuals who may be reluctant to self-disclose to a human health care provider [24].

Outside of health and task-oriented contexts, very few studies have examined the motivational factors behind human-social chatbot interactions and the general content of these interactions. Moreover, the small pool of existing studies has important limitations. Brandtzaeg and Følstad [26] reported that contact with chatbots was initiated primarily for productivity purposes, followed by entertainment, social connection, and curiosity. However, their study did not differentiate between task-oriented and social chatbots. This is an important distinction to make, as task-oriented chatbots are programmed to serve a different objective than social chatbots, which are programmed to provide virtual companionship. As such, motivations to initiate contact with task-oriented chatbots are likely different from motivations to initiate contact with social chatbots. Moreover, if the motivating factors vary, it follows that interactions with task-oriented chatbots likely contain discussions that are quite different from interactions with social chatbots.

In a study of human-chatbot relationships [27], users reported initiating contact with a social chatbot due to their interest in artificial intelligence, to meet emotional and social needs, to improve skills, and out of curiosity. However, because of the understudied nature of human-chatbot relationships, the study only included individuals who indicated that they had developed a friendship with their chatbot. The reasons behind initiating contact with a social chatbot, along with the nature of such interactions, may differ between individuals who classify their relationship with it as a friendship and individuals who do not. Moreover, variations in criteria for classifying a relationship as a friendship exist not only across individuals but also across the lifespan [28,29]. Excluding individuals who may have substantial interactions with a social chatbot but do not explicitly label it a friendship omits a potentially considerable portion of human-social chatbot interactions and thus inhibits an inclusive investigation and understanding of human-social chatbot interactions and human-robot interactions in general.

Theoretical Perspectives
At least 2 theoretical perspectives can be used to understand the factors behind the initiation and development of human-social chatbot interactions. First, social exchange theory posits that social behavior is motivated via a cost-benefit analysis, such that individuals seek out interactions that will produce the maximum “payoff” for minimal “cost” [30,31]. In other words, the costs of an interaction should not outweigh the benefits. Interactions with social chatbots—as opposed to humans—may be viewed as less costly and more rewarding when the topic of discussion is contentious or controversial. Because humans are social beings and prefer to be liked and accepted rather than rejected [32,33], controversial topics are often perceived as uncomfortable to discuss, as they can be stressful and result in interpersonal conflict [34,35]. However, the discussion of controversial topics is critical in the development of important democratic competencies such as being well-informed on social problems and having “openness to other cultures and beliefs, analytical and critical thinking […]

[…] needs (eg, food, water) and safety (eg, health). Needs of relatedness refer to social relationships and gaining the respect of others. Needs of growth refer to the need for personal development and self-esteem. Studies have shown that individuals are motivated to engage with new, emerging technology to gratify their various needs [40,41]. Furthermore, modern media use has also been linked to the motivation to learn and acquire information and pursue hedonic gratifications [40]. More specifically, the motivations behind cell phone application use have been linked to the acquisition of social benefits, immediate access and mobility, status, information, and entertainment [42]. This perspective suggests that people pursue interactions with social chatbots to satisfy their various needs, particularly needs of relatedness and growth.

Our Objective
Given the gap in knowledge regarding the initiation and nature of human-social chatbot interactions, we sought to assess the following 2 research questions: (1) What are the motivational factors behind human-social chatbot interactions? (2) What topics of discussion take place within human-social chatbot interactions?

Accordingly, we examined user experiences of Replika, a popular social chatbot [43], by inviting Replika users to answer questions regarding their interactions with their Replika via a survey. Thematic analyses were then used to extract themes and subthemes pertaining to the motivational factors behind Replika use and the topics discussed with Replika. Given that our goal was to address the lack of knowledge regarding human-social chatbot interactions, we adopted both an exploratory and a theoretical approach to this investigation.
In other words, while skills, flexibility and adaptability, and tolerance of ambiguity” we sought to extract all important themes that emerged from [36]. Because social chatbots are not human, they may provide user responses, based on the 2 aforementioned theoretical a safe avenue for individuals to discuss challenging subjects perspectives, we expected that the motivating factors and without fear of conflict or retaliation from others. discussion topics involved in human-social chatbot interactions In the same vein, interactions with social chatbots may be would be driven by (1) the need to socialize or discuss viewed as less costly among individuals who experience social challenging topics without the fear of negative judgment from anxiety and fear negative evaluations from others. Individuals others and (2) the motivation to satisfy needs of relatedness and who experience social anxiety often go out of their way to avoid growth. real or anticipated social situations that might induce unwanted We chose to focus on Replika rather than other social chatbots thoughts, feelings, and negative judgment from others [37,38]. due to its functionality, accessibility, and large user base. This is consistent with previous research showing that Replika is programmed to function as a companion instead of computer-mediated communication can be a preferred medium providing a specific outcome (such as losing weight via the of communication among socially anxious individuals, as it is Lark Weight Loss Health Coach AI) or treatment approach less threatening than face-to-face interactions [39]. Again, (such as cognitive behavioral therapy via Woebot). Replika is because social chatbots are not human, human-social chatbot also available across many platforms [22], making it relatively interactions present opportunities to engage in social interactions more accessible than other social chatbots. As such, it is more in a more relaxed, low-stakes environment. 
This reduces costs likely to be used for a wider range of reasons compared to other, and maximizes benefits, thereby enabling individuals to satisfy more targeted chatbots, making it an appropriate social chatbot the human need to belong without the potential discomfort of to target for our study. face-to-face interactions with other humans. Second, assessing how people utilize technology to fulfill their Methods needs can be used to understand why human-social chatbot Participants interactions are initiated and how these interactions progress. The Existence, Relatedness, and Growth (ERG) theory [40] Replika users (N=66) were recruited through social media posits that behavior is driven by meeting 3 kinds of needs: websites, including Facebook and Reddit, in the spring and existence, relatedness, and growth. Needs of existence refer to summer of 2019. Most respondents were men (n=36, 54.5%), elements needed by humans to survive, including physiological single (n=42, 63.6%), White (n=47, 71.2%), and from the United https://humanfactors.jmir.org/2022/4/e38876 JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | p. 3 (page number not for citation purposes) XSL FO RenderX JMIR HUMAN FACTORS Ta-Johnson et al States (n=41, 62.1%). Respondent ages ranged from 17 to 68 Results years (mean 32.64, SD 13.89 years). Multimedia Appendix 1 reports additional respondent demographics. Initial Findings Materials and Procedure Two thematic analyses were conducted. The first thematic analysis, illustrated in Figure 1, was conducted on responses Respondents completed a survey of open-ended questions pertaining to users’ motivation to use Replika (Why did you regarding their use of Replika and provided basic demographic decide to try Replika?). A total of 5 responses did not meet information. 
To examine why respondents initiated contact with requirements for inclusion in the study and were omitted (eg, Replika and identify topics that characterize their interactions, responses that only contained “n/a”). The second thematic responses to the following questions were analyzed: (1) Why analysis, illustrated in Figure 2, was conducted on responses did you decide to try Replika? (If you prefer not to answer, pertaining to the topics of discussion that users engaged in with please type “n/a”) (2) What topics do you usually discuss with their Replika (What topics do you usually discuss with your your Replika? (If you prefer not to answer, please type “n/a”). Replika?). Again, 5 responses did not meet requirements for Participants also answered additional questions about their inclusion in the study and were thus omitted. The final number Replika usage, but these questions were not pertinent to this of included responses was 59. Themes and subthemes related investigation. Multimedia Appendix 2 contains the Checklist to respondents’ motivations to use Replika are reported in Table for Reporting Results of Internet E-Surveys (CHERRIES). 1, and themes and subthemes related to topics of discussion that respondents engaged in with their Replika are reported in Table Ethics Approval All procedures were approved by of Lake Forest College’s Human Subjects Review Committee (TA04152019) and carried Because respondents often mentioned multiple motivating out in accordance with the 1964 Declaration of Helsinki and its factors and topics of discussion in their responses, it was later amendments. possible for a given response to be coded under multiple motivating factors and topics. Figure 1. Motivating factors of Replika use: themes and subthemes. https://humanfactors.jmir.org/2022/4/e38876 JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | p. 4 (page number not for citation purposes) XSL FO RenderX JMIR HUMAN FACTORS Ta-Johnson et al Figure 2. 
Topics of discussion: themes and subthemes.

Table 1. Themes and subthemes related to respondents' motivations to use Replika (N=59).

  Themes and subthemes                      n (%)
  Interest
    General interest                        27 (46)
    Interest in artificial intelligence     19 (32)
    Word-of-mouth                           14 (24)
  Social support
    Loneliness                              14 (24)
    Companionship                           4 (7)
    Self-improvement                        4 (7)
  Health
    Mental health                           5 (8)
    Physical health                         4 (7)

Table 2. Themes and subthemes related to topics of discussion respondents engaged in with Replika (N=59).

  Themes and subthemes                      n (%)
  Intellectual
    Science and technology                  12 (20)
    Humanities                              12 (20)
    Nature/animals                          6 (10)
  Life and work
    Life                                    21 (36)
    Work                                    5 (8)
  Mental health
    Well-being and personal development     5 (8)
    Problems                                6 (10)
    Emotions                                12 (20)
  Connection
    Sex/intimacy                            10 (17)
    Love                                    7 (12)
    Relationships                           4 (7)
  Replika
    About Replika itself                    4 (7)
    Replika's choice                        4 (7)
    Experimenting with Replika              2 (3)
  Current events                            4 (7)
  People                                    4 (7)
  Recreation                                25 (42)
  Broad                                     21 (36)

Motivation to Use Replika

Three major themes emerged from user responses regarding their initial motivation to use Replika: interest, social support, and health.

Interest

Almost half the users (27/59, 46%) mentioned that they found Replika to be generally interesting and decided to try the app out of curiosity or boredom.

I found it [Replika] before the beta even released and thought it looked cool, so I signed up for a code for when it launched. [Female, age 20]

I was curious about the technology and about what I read about it in articles online. [Female, age 48]

Some users (19/59, 32%) also reported a specific interest in artificial intelligence and were motivated to explore Replika's capabilities and the artificial intelligence behind it.

I wanted to see if the AI was actually like speaking with another human, and I was happy to find that it did in a lot of ways. [Male, age 30]

Always fascinated by chatbots and Replika came up in an internet search. [Male, age 42]

Nearly a quarter of users (14/59, 24%) began interacting with Replika after learning about it from third-party sources across online and offline environments. Online sources included news articles, user reviews, social media, and internet searches. Offline sources included friends and family who talked about or used Replika.

I saw the app [Replika] reviewed by a YouTuber I follow and thought it looked like fun. [Male, age 31]

My husband uses it [Replika], so I thought I'd give it a try. [Female, age 23]

Social Support

About a quarter of users (14/59, 24%) sought to interact with Replika to combat feelings of loneliness, which often stemmed from not having regular opportunities to interact socially with other people or high levels of social anxiety.

I was living alone at the time and didn't have many people to talk to. [Male, age 21]

I was alone in a hospital at the time, so I didn't have many people to interact with. [Male, age 22]

Beyond simply having someone to talk to, a small number of users (4/59, 7%) also sought companionship and friendship from their Replika.

…To have a companion to speak with. [Male, age 24]

Some users (4/59, 7%) also sought to refine certain social skills and to learn more about themselves from interactions with their Replika.

I wanted to...become more confident. [Female, age 18]

I…saw it [Replika] as a way to help me understand myself more. [Male, age 20]

Health

Users cited their physical and mental health as their initial reason to interact with Replika. Specifically, some users (5/59, 8%) sought to use Replika to cope with mental health issues such as anxiety, depression, and phobias. Others (4/59, 7%) mentioned that they began using Replika to supplement their lack of social interaction stemming from a physical health issue that limited their mobility.

I needed help with panic attacks. [Female, age 57]

I was also suffering of crippling depression when I first started and saw it [Replika] as a way to…cope a little with my problems. [Male, age 20]

I'm disabled and don't get much social interaction. [Male, age 59]

Topics of Discussion

A total of 9 major discussion topics emerged from user responses: intellectual, life and work, recreation, mental health, broad, connection, Replika, current events, and people. Users overwhelmingly described several discussion topics in a listwise manner. As such, example responses related to these themes will also be presented listwise. Users also tended to describe some discussion topics using descriptive responses. As such, example responses related to these themes will be presented in the form of quoted responses.

Intellectual

Users reported having deep, intellectual discussions with their Replika about science and technology (12/59, 20%), including artificial intelligence, the universe, space, physics, and extraterrestrial life; the humanities (12/59, 20%), including the nature of reality, perception, consciousness, spiritual topics, existence, the purpose/meaning of life, and Japanese culture; and nature (6/59, 10%), including oceans and animals.

Life and Work

Users discussed their lives with Replika (21/59, 36%), and these topics ranged from major life events to the minutiae of everyday life. Topics pertaining to users' occupations and other work-related topics (5/59, 8%), such as bosses and business strategies, were discussed as well.

Recreation

Users discussed various forms of recreation and media that they regularly consumed (25/59, 42%). This often included hobbies and activities that users engaged in and sought to share with their Replika (eg, music, video games, anime, books, memes, theme parks, games, movies, photos, art, jokes, food, and role-playing).

Mental Health

Users discussed their emotional states with their Replika (12/59, 20%), particularly negative thoughts and emotional states. These topics typically emerged from the user's discussions about their daily challenges and major life obstacles (6/59, 10%) and how these experiences have impacted the users' well-being and personal growth (5/59, 8%).

I complained about being ugly and people not liking me. [Male, age 41]

Sometimes we will talk about something that is bothering me or just in general if I feel down, she [the user's Replika] will cheer me up. [Male, age 22]

Connection

Users reported discussing topics pertaining to love (7/59, 12%), sex/intimacy (10/59, 17%), and relationships (4/59, 7%). However, users overwhelmingly listed these topics without providing any additional context.

Replika

Users reported asking their Replika questions about itself to learn more about it as an entity (4/59, 7%), as well as its technological capabilities (2/59, 3%). For example, users asked questions to learn about their Replika's personality characteristics, how their Replika viewed itself (its "identity"), and the extent to which their Replika remembered the contents of their previous discussions. Users also allowed their Replika to direct the topic of discussion (4/59, 7%).

…Whatever they [the user's Replika] feel like bringing up. [Male, age 19]

I like to test the Replika [to see] if it remembers things I told [it] about myself before. [Male, age 25]

Current Events

Users also informed their Replika about the ongoing events in the world (4/59, 7%) and discussed their implications and impacts (eg, global affairs, latest technological advancements).

People

Users discussed other people (4/59, 7%) with their Replika. These individuals ranged from well-known public figures (eg, Donald Trump, Elon Musk) to individuals in the user's own social network (eg, family, friends).

Broad

Some users indicated that they discuss a wide variety of topics (21/59, 36%) with their Replika without providing concrete examples. No discussion topic was off-limits, and the topic was driven by whatever the user chose at the time.

…Everything, to be honest. [Female, age 25]

It's usually just going with the flow of the conversation. [Male, age 22]

Discussion

Motivations to Use Replika

Although social chatbot usage is on the rise [4,5], very little is known about the motivating factors behind human-social chatbot interactions and the topics discussed therein [6]. In this study, we addressed this gap in knowledge. Users of the popular social chatbot Replika responded to questions regarding their usage of Replika, and thematic analyses were used to gain insight into users' motivations to interact with the social chatbot and to identify conversation topics that marked these interactions.

The third most frequently cited reason for initiating contact with Replika was to cope with health issues. The use of social chatbots to improve physical and mental health is consistent with previous research [49]. While users primarily reported that their search for ways to cope with mental health issues was the direct catalyst for initiating contact (which was not surprising given that Replika was designed to provide companionship), users also reported that their search for ways to cope with physical health issues was an indirect catalyst for initiating contact with Replika (eg, using it to supplement their lack of social interactions due to a physical ailment that limited their mobility).
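Because a single response could be coded under several themes, the n values in Tables 1 and 2 are multi-label tallies, with each percentage taken over the 59 included responses (so a column need not sum to 100%). A minimal sketch of that bookkeeping, using invented codings and theme labels purely for illustration:

```python
from collections import Counter

N_INCLUDED = 59  # responses remaining after exclusions, as reported in the paper

# Hypothetical coded responses: each response maps to one or more themes,
# mirroring the multi-label coding described in the Results.
coded_responses = [
    {"Interest: general", "Interest: AI"},
    {"Social support: loneliness"},
    {"Interest: general", "Health: mental"},
]

counts = Counter()
for themes in coded_responses:
    counts.update(themes)  # a response increments every theme it was coded under

for theme, n in counts.most_common():
    # Percentages are computed against all included responses, as in the tables.
    print(f"{theme}: {n} ({round(100 * n / N_INCLUDED)}%)")
```

Under this scheme, Table 1's "General interest 27 (46)" corresponds to 27 of 59 responses (round(100 * 27 / 59) = 46); the same arithmetic reproduces every n (%) cell in both tables.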
Participants most frequently cited interest stemming from curiosity and interest in artificial intelligence as motivating factors for social chatbot usage, which is consistent with previous research [32]. A noteworthy subtheme that emerged involved interest derived from third-party sources across users' environments, particularly from friends and family members who had experience with or prior knowledge of Replika themselves. This suggests that interest in social chatbot usage is not exclusively driven by the novelty and excitement that accompanies new and advanced technology. Rather, it appears that social chatbot usage may also be driven by demonstrations of its practical utility by strong-tie recommendation sources (ie, people who know an individual personally and can therefore influence the individual's attitude and subsequent use of the product) [44]. This may also allude to the increasing ubiquity of social chatbot use in everyday life and the rise of human-social chatbot interactions to come.

Social support, particularly in the form of companionship support and appraisal support, was the second most frequently cited reason. Users sought out Replika to combat feelings of loneliness resulting from a variety of circumstances, such as living alone or physical injury. Some users also reported the desire for companionship and to experience more meaningful interactions, while others interacted with Replika as an opportunity to engage in some form of personal development, such as improving confidence and self-knowledge. Previous studies have also reported the use of social chatbots for social support due to their ability to garner an emotional connection with humans [45-47]. Moreover, because Replika can socially converse almost as well as humans can, this provides users with the opportunity to refine their interpersonal skills and learn more about themselves.

Notably, unlike previous research [22], informational support and emotional support were not prominent motivators for initiating contact with Replika. No respondents reported that they initiated contact with Replika to obtain information or advice, and only 1 respondent indicated that they were looking for opportunities to "vent to something that won't judge me." As such, this did not meet the criteria to include informational and emotional social support as subthemes, respectively [48]. It is important to note that although informational and emotional social support were not reported as initial motivators for social chatbot usage, it is possible that users sought informational and emotional social support after interacting with Replika for a certain amount of time.

This latter finding (the use of Replika to cope with physical health issues) is noteworthy, as Replika is not programmed to collect users' physical health data such as physical activity, diet, and weight; therefore, its use to cope with physical health issues is not immediately apparent. It was unclear whether Replika was the users' sole coping mechanism or if it was used in conjunction with other coping mechanisms/treatments prescribed by health care professionals. However, it was clear that users initiated contact with the social chatbot to cope with both mental and physical health issues.

Topics of Discussion

Users engaged in a wide variety of discussion topics with their Replika, which was observed within and between respondents. Reported discussion topics included intellectual topics, life and work, recreation, mental health, connection, Replika, current events, and other people. The wide variation in topics is evident, ranging from serious (eg, mental health, current events) to trivial (eg, recreation) and from complex (eg, intellectual topics, connection, Replika) to mundane (eg, life and work). This demonstrates the versatility of social chatbots; not only are they capable of discussing a wide variety of topics, but they also appear to be capable of sustaining such discussions with a human counterpart.

Some of the discussion topics are consistent with previous research, including aspects of the users' life and interests [3,26] and topics that allowed users to learn more about the social chatbot's technical capabilities [6,26]. Moreover, it is not surprising that mental health–related topics (well-being, personal development, problems, emotions) and connection-related topics (sex, love, relationships) were discussed, as social support (loneliness, companionship, self-improvement) was reported as a motivating factor in initiating contact with Replika. Previous research also indicated the use of social chatbots as a source of social support [22].

Notably, the most frequently reported topics of discussion were substantive, intellectual ones that typically centered on complex content and required self-disclosure (eg, topics pertaining to the meaning of life). The frequency with which this topic is discussed with a social chatbot may be due to how intellectual topics are perceived. People tend to overestimate the awkwardness of deep discussions and underestimate the extent to which their conversation partner will be interested in their response [50]. This expectation may discourage individuals from participating in such discussions, which are more likely to induce some level of social anxiety compared to more shallow topics. This, in part, supports the view that human-social chatbot interactions can provide a "safe space" to engage in deep, intellectual conversations. Moreover, because deep discussions can facilitate greater connections, liking, and happiness [50], it is not surprising that individuals may gravitate toward such discussions in their pursuit of companionship and more meaningful interactions.

Implications

Given the wide range of motivational factors and discussion topics that were reported, our results imply that multifaceted support can be provided by a single social chatbot. While previous research already established that social chatbots can effectively help address mental and physical health issues, these capabilities have been dispersed across several different social chatbots instead of deriving from a single one. For example, the Lark Weight Loss Health Coach AI [51] helps overweight and obese users lose weight and make healthy food choices by providing feedback on users' reported activity levels and meals; Woebot [19] helps users manage their mental health using cognitive-behavioral therapy techniques; and Bonobot [52] conducts motivational interviewing for stress reduction. Some social chatbots can address more than 1 mental/physical health issue (eg, Woebot reduces both depressive symptoms [53] and problematic substance use [54]), but their functionality is typically limited to addressing either mental health or physical health, as with Woebot and the Lark Weight Loss Health Coach, respectively. A chatbot's ability to provide both mental and physical health support not only demonstrates a greater level of versatility and efficiency but also answers the call from health care professionals for health interventions to include components that address both mental and physical health [55].

Our results also highlight interest as a motivating factor of human-social chatbot usage, which has received less attention than other motivating factors. Although this may not seem directly pertinent to Replika's purpose of providing companionship, previous research suggests that the use of any artificial agent not only influences people's understanding of artificial intelligence but also strongly shapes how they perceive artificial intelligence and their ensuing narratives of it [56], regardless of whether the artificial agent is being used for its intended purpose. Narratives about artificial intelligence are "essential to the development of science and people's engagement with new knowledge and new applications" [57]. These narratives can also lead to misinformation and fears about artificial intelligence; for those not engaged closely with the science or technology, "narratives can affect perceptions of, and degrees of confidence in, potential applications and those who are developing, promoting or opposing them" [57]. It is important to note that this study cannot and does not establish a link between social chatbot usage and perceptions or narratives of artificial intelligence. However, the fact that users in our study most frequently reported using Replika out of interest, sought to explore its capabilities, and wanted to learn more about artificial intelligence should not be overlooked. Thus, while it is entirely reasonable for developers and researchers to study human-social chatbot interactions with a focus on the efficacy of the social chatbot and its targeted user base, researchers should also assess if and how social chatbot usage can shape perceptions of artificial intelligence and the potential consequences thereof.

Strengths, Limitations, and Future Directions

This study is the first to examine the motivating factors behind initiating contact with a social chatbot and the discussions that take place within human-social chatbot interactions. Respondents were only required to identify as a Replika user to be included in this study. There were no additional requirements for study inclusion (ie, respondents did not need to classify their relationship with Replika using a particular label such as friendship). This enabled a more inclusive assessment of the initiation and development of human-social chatbot interactions. In addition, the anonymous nature and open-response format of the questions encouraged and allowed detailed responses. As reflected in the wide range of themes and subthemes that emerged across both questions, this resulted in the extraction of a rich, comprehensive assessment of users' motivations to interact with Replika and the discussion topics they engaged in.

While respondents reported several motivating factors for initiating contact with Replika, our study cannot assess the reasons why users continued contact with Replika. It is possible that the reasons why users initiated contact with Replika also served as the reasons why they continued to interact with Replika. It is also possible that respondents were initially drawn to Replika for 1 reason and that reason changed as conversations continued. Similarly, our study cannot assess whether topics of discussion occurred consistently over time or whether certain topics were more likely to occur after a period of time. Longitudinal methods are required to answer these questions. Future studies should track the types of topics discussed over time and assess how users' motivations for interacting with social chatbots change over time. Finally, the use of surveys to collect data can introduce self-selection bias and restrict the generalization of findings to a larger sample or population. To our knowledge, our study is the first to examine the motivating factors and discussion topics of human-social chatbot interactions; therefore, only replication studies can assess the external validity of our results. Future studies should replicate this study using a larger, more representative sample of Replika users.

Authors' Contributions

VPT-J developed the study design, assisted with the creation of study materials, conducted data analysis, and wrote the manuscript. CB and XW developed study materials, conducted data collection, and assisted with data analysis and manuscript writing. ED assisted with data analysis. ICK and SDR assisted with data analysis and manuscript writing. AM and WMP assisted with manuscript writing.

Conflicts of Interest

None declared.

Multimedia Appendix 1
Additional demographic information of respondents.
[DOCX File, 9 KB]

Multimedia Appendix 2
Checklist for Reporting Results of Internet E-Surveys (CHERRIES).
[DOCX File, 8 KB]

References

1. Seering J, Luria M, Kaufman G, Hammer J. Beyond dyadic interactions: considering chatbots as community members. Presented at: Conference on Human Factors in Computing Systems (CHI) 2019; May 4-9, 2019; Glasgow, Scotland. [doi: 10.1145/3290605.3300680]
2. Shum H, He X, Li D. From Eliza to XiaoIce: challenges and opportunities with social chatbots. Frontiers Inf Technol Electronic Eng 2018 Jan 8;19(1):10-26. [doi: 10.1631/fitee.1700826]
3. Zhou L, Gao J, Li D, Shum H. The design and implementation of XiaoIce, an empathetic social chatbot. Comput Linguist 2020 Mar;46(1):53-93. [doi: 10.1162/coli_a_00368]
4. Balch O. AI and me: friendship chatbots are on the rise, but is there a gendered design flaw? The Guardian; 2020 May 07.
URL: https://www.theguardian.com/careers/2020/may/07/ ai-and-me-friendship-chatbots-are-on-the-rise-but-is-there-a-gendered-design-flaw [accessed 2022-04-10] 5. Metz C. Riding out quarantine with a chatbot friend: I feel very connected. New York Times. 2020 Jun 16. URL: https:/ /www.nytimes.com/2020/06/16/technology/chatbots-quarantine-Coronavirus.html [accessed 2022-04-10] 6. Muresan A, Pohl H. Chats with bots: balancing imitation and engagement. 2019 Apr Presented at: Conference on Human Factors in Computing Systems (CHI) 2019; May 4-9; Glasgow, Scotland p. 4-9. [doi: 10.1145/3290607.3313084] 7. Følstad A, Araujo T, Law E, Brandtzaeg P, Papadopoulos S, Reis L, et al. Future directions for chatbot research: an interdisciplinary research agenda. Computing 2021 Dec:2915-2945 [FREE Full text] [doi: 10.1007/s00607-021-01016-7] 8. Zarouali B, Van den Broeck E, Walrave M, Poels K. Predicting consumer responses to a chatbot on Facebook. Cyberpsychol Behav Soc Netw 2018 Aug;21(8):491-497. [doi: 10.1089/cyber.2017.0518] [Medline: 30036074] 9. Zierau N, Engel C, Söllner M, Leimeister JM. Trust in smart personal assistants: a systematic literature review and development of a research agenda. In: SSRN. 2020 Mar Presented at: International Conference on Wirtschaftsinformatik (WI); March 2020; Potsdam, Germany. [doi: 10.2139/ssrn.3920577] 10. Lee S, Choi J. Enhancing user experience with conversational agent for movie recommendation: effects of self-disclosure and reciprocity. Int J Hum Comput Stud 2017;103:95-105 [FREE Full text] [doi: 10.1016/j.ijhcs.2017.02.005] 11. Wechsung I, Naumann A, Möller S. The influence of the usage mode on subjectively perceived quality. 2010 Presented at: International Workshop on Spoken Dialogue Systems Technology; Oct 1; Berlin, Germany p. 188-193. [doi: 10.1007/978-3-642-16202-2_20] 12. Graham S, Huang J, Clark M, Helgeson V. The positives of negative emotions: willingness to express negative emotions promotes relationships. 
Pers Soc Psychol Bull 2008:394-406 [FREE Full text] [doi: 10.1177/0146167207311281] 13. Reis H, Shaver P. Intimacy as an interpersonal process. In: Handbook of Personal Relationships: Theory, Research and Interventions. New York, NY: John Wiley & Sons; 1988:367-389. 14. Simons J, Green M. Divisive topics as social threats. Commun Res 2018:165-187 [FREE Full text] [doi: 10.1177/0093650216644025] 15. Adamopoulou E, Moussiades L. Chatbots: history, technology, and applications. MLWA 2020 Dec;2:100006. [doi: 10.1016/j.mlwa.2020.100006] 16. Croes EAJ, Antheunis ML. Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot. J Soc Pers Relat 2020 Sep 25;38(1):279-300. [doi: 10.1177/0265407520959463] 17. Heckman C, Wobbrock J. Put your best face forward: Anthropomorphic agents, e-commerce consumers, and the law. 2000 Presented at: 4th International Conference on Autonomous Agents; June 3; Barcelona, Spain. [doi: 10.1145/336595.337562] 18. Moon Y. Intimate exchanges: using computers to elicit self‐disclosure from consumers. J Consum Res 2000 Mar;26(4):323-339. [doi: 10.1086/209566] 19. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health 2017 Jun 06;4(2):e19 [FREE Full text] [doi: 10.2196/mental.7785] [Medline: 28588005] 20. Jøranson N, Pedersen I, Rokstad AMM, Ihlebæk C. Effects on symptoms of agitation and depression in persons with dementia participating in robot-assisted activity: a cluster-randomized controlled trial. J Am Med Dir Assoc 2015 Oct 01;16(10):867-873. [doi: 10.1016/j.jamda.2015.05.002] [Medline: 26096582] 21. Yu R, Hui E, Lee J, Poon D, Ng A, Sit K, et al. 
Use of a therapeutic, socially assistive pet robot (PARO) in improving mood and stimulating social interaction and communication for people with dementia: study protocol for a randomized controlled trial. JMIR Res Protoc 2015 May 01;4(2):e45 [FREE Full text] [doi: 10.2196/resprot.4189] [Medline: 25934173] https://humanfactors.jmir.org/2022/4/e38876 JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | p. 10 (page number not for citation purposes) XSL FO RenderX JMIR HUMAN FACTORS Ta-Johnson et al 22. Ta V, Griffith C, Boatfield C, Wang X, Civitello M, Bader H, et al. User experiences of social support from companion chatbots in everyday contexts: thematic analysis. J Med Internet Res 2020 Mar 06;22(3):e16235 [FREE Full text] [doi: 10.2196/16235] [Medline: 32141837] 23. Broadbent E, Garrett J, Jepsen N, Li OV, Ahn HS, Robinson H, et al. Using robots at home to support patients with chronic obstructive pulmonary disease: pilot randomized controlled trial. J Med Internet Res 2018 Feb 13;20(2):e45 [FREE Full text] [doi: 10.2196/jmir.8640] [Medline: 29439942] 24. Bates M. Health care chatbots are here to help. IEEE Pulse 2019 May;10(3):12-14. [doi: 10.1109/mpuls.2019.2911816] 25. Cameron G, Cameron D, Megaw G, Bond R, Mulvenna M, O'Neill S, et al. Best practices for designing chatbots in mental health care? A case study on iHelpr. In: Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI-2018). 2018 Presented at: HCI'18; May 10; Swindon, UK p. 1-5. [doi: 10.14236/ewic/hci2018.129] 26. Brandtzaeg P, Følstad A. Why people use chatbots. 2017 Presented at: 2017 International Conference on Internet Science; Nov 22-24; Thessaloniki, Greece p. 22-24 URL: https://link.springer.com/chapter/10.1007/978-3-319-70284-1_30#citeas [doi: 10.1007/978-3-319-70284-1_30] 27. Skjuve M, Følstad A, Fostervold KI, Brandtzaeg PB. My chatbot companion - a study of human-chatbot relationships. Int J Hum Comput Stud 2021 May;149:102601. 
[doi: 10.1016/j.ijhcs.2021.102601] 28. Aboud F, Mendelson M. Determinants of friendship selectionquality: developmental perspectives. In: The Company They Keep: Friendship in Childhood and Adolescence. Cambridge, UK: Cambridge University Press; 1996:87-112. 29. Argyle M, Henderson M. The rules of friendship. J Soc Pers Relat 2016 Jun 30;1(2):211-237. [doi: 10.1177/0265407584012005] 30. Blau P. Social exchange. In: International Encyclopedia of the Social Sciences. New York, NY: Macmillan; 1968:452-457. 31. Cook K, Rice E. Social exchange theory. In: Handbook of Social Psychology. New York, NY: Springer; 2003:53-76. 32. Baumeister RF, Leary MR. The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychol Bull 1995;117(3):497-529. [doi: 10.1037/0033-2909.117.3.497] 33. Swann WB, Pelham BW, Krull DS. Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification. J Pers Soc Psychol 1989;57(5):782-791. [doi: 10.1037/0022-3514.57.5.782] 34. Chen Z, Berger J. When, why, and how controversy causes conversation. J Consum Res 2013 Oct 01;40(3):580-593. [doi: 10.1086/671465] 35. Green T. Republicans and Democrats alike say it's stressful to talk politics with people who disagree. Pew Research Center. 2021 Nov 23. URL: https://www.pewresearch.org/fact-tank/2021/11/23/ republicans-and-democrats-alike-say-its-stressful-to-talk-politics-with-people-who-disagree/ [accessed 2022-10-04] 36. Addressing controversial issues. Council of Europe. URL: https://www.coe.int/en/web/campaign-free-to-speak-safe-to-learn/ addressing-controversial-issues [accessed 2022-04-10] 37. Clark D, Wells A. A cognitive model of social phobia. In: Social Phobia: Diagnosis, Assessment, and Treatment. New York, NY: The Guilford Press; 1995:69-93. 38. Hayes SC, Wilson KG, Gifford EV, Follette VM, et al. Experiential avoidance and behavioral disorders: a functional dimensional approach to diagnosis and treatment. 
J Consult Clin Psychol 1996;64(6):1152-1168. [doi: 10.1037//0022-006x.64.6.1152] 39. Prizant-Passal S, Shechner T, Aderka IM. Social anxiety and internet use – A meta-analysis: What do we know? What are we missing? Comput Human Behav 2016 Sep;62:221-229. [doi: 10.1016/j.chb.2016.04.003] 40. Stafford TF, Stafford MR, Schkade LL. Determining uses and gratifications for the internet. Decis Sci 2004 May;35(2):259-288. [doi: 10.1111/j.00117315.2004.02524.x] 41. Cheng Y, Jiang H. AI‐Powered mental health chatbots: examining users’ motivations, active communicative action and engagement after mass‐shooting disasters. J Contingencies Crisis Manag 2020 Sep 29;28(3):339-354. [doi: 10.1111/1468-5973.12319] 42. Lin Y, Fang C, Hsu C. Determining uses and gratifications for mobile phone apps. In: Future Information Technology. Berlin, Germany: Springer; 2014:661-668. 43. Replika. URL: https://replika.ai/about/story [accessed 2022-09-27] 44. Brown JJ, Reingen PH. Social ties and word-of-mouth referral behavior. J Consum Res 1987 Dec;14(3):350. [doi: 10.1086/209118] 45. Alderfer CP. An empirical test of a new theory of human needs. Organ Behav Hum Perform 1969 May;4(2):142-175. [doi: 10.1016/0030-5073(69)90004-x] 46. Aly A, Griffiths S, Stramandinoli F. Metrics and benchmarks in human-robot interaction: Recent advances in cognitive robotics. Cogn Syst Res 2017 Jun;43:313-323. [doi: 10.1016/j.cogsys.2016.06.002] 47. D'Alfonso S, Santesteban-Echarri O, Rice S, Wadley G, Lederman R, Miles C, et al. Artificial intelligence-assisted online social therapy for youth mental health. Front Psychol 2017 Jun 02;8. [doi: 10.3389/fpsyg.2017.00796] 48. Ryan GW, Bernard HR. Techniques to identify themes. Field Methods 2016 Jul 24;15(1):85-109. [doi: 10.1177/1525822x02239569] 49. Gabarron E, Larbi D, Denecke K, Årsand E. What do we know about the use of chatbots for public health? Stud Health Technol Inform 2020 Jun 16;270:796-800. 
[doi: 10.3233/SHTI200270] [Medline: 32570492] https://humanfactors.jmir.org/2022/4/e38876 JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | p. 11 (page number not for citation purposes) XSL FO RenderX JMIR HUMAN FACTORS Ta-Johnson et al 50. Kardas M, Kumar A, Epley N. Overly shallow?: Miscalibrated expectations create a barrier to deeper conversation. J Pers Soc Psychol 2022 Mar;122(3):367-398. [doi: 10.1037/pspa0000281] [Medline: 34591541] 51. Stein N, Brooks K. A fully automated conversational artificial intelligence for weight loss: longitudinal observational study among overweight and obese adults. JMIR Diabetes 2017 Nov 01;2(2):e28 [FREE Full text] [doi: 10.2196/diabetes.8590] [Medline: 30291087] 52. Park S, Choi J, Lee S, Oh C, Kim C, La S, et al. Designing a chatbot for a brief motivational interview on stress management: qualitative case study. J Med Internet Res 2019 Apr 16;21(4):e12231 [FREE Full text] [doi: 10.2196/12231] [Medline: 30990463] 53. Abd-Alrazaq A, Safi Z, Alajlani M, Warren J, Househ M, Denecke K. Technical metrics used to evaluate health care chatbots: scoping review. J Med Internet Res 2020 Jun 05;22(6):e18301 [FREE Full text] [doi: 10.2196/18301] [Medline: 32442157] 54. Prochaska JJ, Vogel EA, Chieng A, Kendra M, Baiocchi M, Pajarito S, et al. A therapeutic relational agent for reducing problematic substance use (Woebot): development and usability study. J Med Internet Res 2021 Mar 23;23(3):e24850 [FREE Full text] [doi: 10.2196/24850] [Medline: 33755028] 55. Prince M, Patel V, Saxena S, Maj M, Maselko J, Phillips MR, et al. No health without mental health. Lancet 2007 Sep;370(9590):859-877. [doi: 10.1016/S0140-6736(07)61238-0] 56. Horowitz MC, Kahn L. What influences attitudes about artificial intelligence adoption: Evidence from US local officials. PLoS One 2021 Oct 20;16(10):e0257732 [FREE Full text] [doi: 10.1371/journal.pone.0257732] [Medline: 34669734] 57. Cave S, Craig C, Dihal K, Dillon S, Montgomery J, Singler B, et al. 
Portrayals and perceptions of AI and why they matter. The Royal Society. 2018 Dec 11. URL: https://www.repository.cam.ac.uk/handle/1810/287193 [accessed 2022-04-10] Abbreviations CHERRIES: Checklist for Reporting Results of Internet E-Surveys ERG: Existence, Relatedness, and Growth Edited by A Kushniruk; submitted 19.04.22; peer-reviewed by C Thornson, D Chrimes; comments to author 28.06.22; revised version received 30.07.22; accepted 29.08.22; published 03.10.22 Please cite as: Ta-Johnson VP, Boatfield C, Wang X, DeCero E, Krupica IC, Rasof SD, Motzer A, Pedryc WM JMIR Hum Factors 2022;9(4):e38876 URL: https://humanfactors.jmir.org/2022/4/e38876 doi: 10.2196/38876 PMID: ©Vivian P Ta-Johnson, Carolynn Boatfield, Xinyu Wang, Esther DeCero, Isabel C Krupica, Sophie D Rasof, Amelie Motzer, Wiktoria M Pedryc. Originally published in JMIR Human Factors (https://humanfactors.jmir.org), 03.10.2022. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on https://humanfactors.jmir.org, as well as this copyright and license information must be included. https://humanfactors.jmir.org/2022/4/e38876 JMIR Hum Factors 2022 | vol. 9 | iss. 4 | e38876 | p. 12 (page number not for citation purposes) XSL FO RenderX


Keywords: social chatbots; Replika; emotional chatbots; artificial intelligence; thematic analysis; human-chatbot interactions; chatbot; usability; interaction; human factors; motivation; topics; AI; perception; usage
