

“How does that make you feel?”

In the isolation of the COVID-19 pandemic, many people are missing a sympathetic ear. Would a response like that make you feel heard, less alone, even if it were a machine writing back to you?

The pandemic has contributed to chronic loneliness[1]. Digital tools like video chat and social media help connect people who live or quarantine far apart. But when those friends or family members are not readily available, artificial intelligence can step in.

Millions of isolated people have found comfort[2] by chatting with an AI bot. Therapeutic bots have improved users’ mental health for decades[3]. Now, psychiatrists are studying how these AI companions can improve mental wellness during the pandemic and beyond.

How AI became a therapy tool

Artificial intelligence systems are computer programs that can perform tasks that people would normally do, like translating languages or recognizing objects in images. AI chatbots are programs that simulate human conversation. They have become common in customer service[4] because they can provide quick answers to basic questions.

The first chatbot was modeled on mental health practitioners. In 1966, computer scientist Joseph Weizenbaum created ELIZA[5], which he programmed to sound like a Rogerian psychotherapist[6]. Rogerian approaches encourage psychotherapists to ask open-ended questions, often mirroring patients’ phrases back to them to prompt the patients to elaborate. Weizenbaum did not expect that his psychotherapist-like AI could have any therapeutic benefit for users. Training ELIZA to translate users’ comments into questions was simply a practical, if somewhat ironic, template for the AI’s dialogue.
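ELIZA’s core trick is simple enough to sketch in a few lines of code. The following is a minimal, hypothetical reconstruction of that pattern-and-reflection idea, not Weizenbaum’s actual 1966 script: it swaps first-person words for second-person ones and turns a statement back into an open-ended question.

```python
import re

# Hypothetical word swaps for reflecting a user's statement back at them.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones, word by word."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(user_input: str) -> str:
    # Rule: a statement of the form "I am X" becomes "Why do you feel you are X?"
    match = re.match(r"i am (.+)", user_input.strip(), re.IGNORECASE)
    if match:
        return f"Why do you feel you are {reflect(match.group(1))}?"
    # Fallback: the kind of generic, open-ended prompt Rogerian therapy favors.
    return "Can you tell me more about that?"

print(respond("I am lonely these days"))
# -> Why do you feel you are lonely these days?
```

Everything the bot “knows” lives in those patterns; there is no understanding behind the mirror.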

Weizenbaum was amazed when his test subjects actually confided in ELIZA[7] as they would a flesh-and-blood psychotherapist. Many study participants believed that they were sharing vulnerable thoughts with a live person[8]. Some of these participants refused to believe that the seemingly attentive ELIZA, who asked so many questions during each conversation, was actually a computer.

However, ELIZA did not need to trick users to help them. Even Weizenbaum’s secretary, who knew that ELIZA was a computer program, asked for privacy[9] so she could have her own personal conversations with the chatbot.

In the decades since ELIZA stunned its inventor, computer scientists have worked with medical professionals to explore how AI can support mental health. Some of the biggest therapy bots in the business have astounding reach, especially during times of sociopolitical uncertainty[10], when people tend to report higher levels of isolation and fatigue.

Since the COVID-19 pandemic struck, the demand for telehealth options, including AI chatbots, has skyrocketed[11]. Replika, an app famous for its lifelike, customizable avatars, has reported a 35% increase in traffic[12]. With mental health facilities overwhelmed with weekslong waitlists[13], millions of people are supplementing their mental health routines with therapy chatbots.

As mental wellness needs have changed over time, coders and therapists have collaborated to build new AIs[14] that can meet those challenges.

[Image: A woman texts on her phone. Millions of people have downloaded AI therapy apps during the COVID-19 pandemic. Jhaymesisviphotography/Flickr]

The digital doctor is in

How can a chatbot seem so human? If you were to dissect an AI, you would find algorithms and scripts: rules, essentially, that humans use to direct the AI’s behavior. With chatbots, coders train the AI to automatically produce certain phrases in response to a user’s message. Coders then work with writers to determine what kind of punctuation, emojis and other stylistic elements the bot will use.

These scripts ultimately provide a sense of the bot’s “attitude.” For example, a coder can train an AI to recognize the word “depressed” so that, whenever a user types a phrase like “I am feeling tired and depressed today,” the chatbot may respond with “I hear that you are feeling depressed. Can you explain why?” Or a writer may code the bot to produce a more colloquial tone: “Wow, I’m sorry you’re feeling this way. Why do you think you might be feeling depressed?”
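As a concrete illustration, here is a hypothetical Python sketch of that keyword-and-script pattern. The keyword list, tone labels and response templates are invented for this example; real therapy bots rely on far richer language processing.

```python
# Hypothetical scripts keyed by emotion word, each with two "attitudes."
SCRIPTS = {
    "depressed": {
        "clinical": "I hear that you are feeling depressed. Can you explain why?",
        "colloquial": ("Wow, I'm sorry you're feeling this way. "
                       "Why do you think you might be feeling depressed?"),
    },
    "anxious": {
        "clinical": "It sounds like you are feeling anxious. What is on your mind?",
        "colloquial": "That sounds rough. What do you think is making you anxious?",
    },
}

def reply(message: str, tone: str = "clinical") -> str:
    """Return the scripted line for the first known keyword in the message."""
    lowered = message.lower()
    for keyword, lines in SCRIPTS.items():
        if keyword in lowered:
            return lines[tone]
    return "Tell me more about how you're feeling."  # generic fallback

print(reply("I am feeling tired and depressed today", tone="colloquial"))
```

Swapping the tone argument changes the bot’s “attitude” without touching the matching logic, mirroring the division of labor between coders and writers described above.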

These scripts replicate a common tactic in cognitive behavioral therapy: asking questions[15]. AI therapy bots encourage people to vent frustrations and then ask them to more closely reflect on those experiences or emotions. Even when an AI’s responses are broad or unspecific, a patient may find the process of typing out their thoughts to someone – even an artificial “someone” – cathartic[16].

Who benefits and how

Do chatbots actually work to relieve loneliness or anxiety? More research is needed, but it seems so[17]. Several studies provide promising results[18]. For example, young adults who regularly messaged a therapy chatbot reported less loneliness and anxiety[19] than did their peers who did not use the AI. Elderly users may also benefit[20] from communicating with chatbots, especially if those elders live alone or do not have regular contact with loved ones.

A chatbot’s therapeutic power – and its Achilles’ heel – is its script. The dialogue is predetermined, the same lines delivered to multiple users. These scripted responses allow a chatbot to communicate with numerous users simultaneously. Chatbots are especially helpful for people who want to express themselves quickly and anonymously, without judgment. Users can immediately pull up a chatbot to offload stress from their day when they may not want or be able to share such thoughts with family or friends.

However, these same scripts prevent AI from being a serious replacement for human therapists. AI bots respond to certain keywords, so they sometimes misunderstand users. When Vice tested the popular therapy app Woebot, the app produced a cringeworthy response[21]:

User: “I’m super anxious and can barely sleep.”

Woebot: “Ah, I can’t wait to hop into my jammies later” followed by a series of sleepy “z” emojis

Would a user in the throes of a panic attack find this scripted comment helpful or comforting? Not likely. Unlike human therapists, AI bots are poor at interpreting social context and at intervening in a crisis. While an AI may seem lifelike, it isn’t always an appropriate tool[22] to use when someone’s life is on the line. And unlike trained crisis counselors, chatbots cannot recommend specific safety plans or connect users with health resources and support in their community.
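Mechanically, the Woebot exchange above is what naive keyword matching produces. In a hypothetical script table like the one sketched earlier, if “sleep” is bound to a lighthearted bedtime line and checked before “anxious,” the anxiety goes unaddressed:

```python
# Hypothetical script table in which "sleep" happens to be checked first.
SCRIPTS = {
    "sleep": "Ah, I can't wait to hop into my jammies later. Zzz",
    "anxious": "It sounds like you are feeling anxious. What is on your mind?",
}

def reply(message: str) -> str:
    lowered = message.lower()
    for keyword, line in SCRIPTS.items():
        if keyword in lowered:
            return line  # first match wins; the rest of the message is ignored
    return "Tell me more."

print(reply("I'm super anxious and can barely sleep"))
# -> Ah, I can't wait to hop into my jammies later. Zzz
```

Both keywords appear in the message, but the script has no sense of which one matters, so the chipper bedtime line wins.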

Despite these real limitations, AI chatbots provide a much-needed platform for open communication and self-expression. With therapy apps like Replika[23], Tess[24] and Woebot[25] raking in millions in funding and user downloads, people have more options than ever if they want to try chatting with a bot to process their emotions between therapy appointments or to make a digital friend during a pandemic[26].

References

  1. chronic loneliness (time.com)
  2. found comfort (www.cnn.com)
  3. improved users’ mental health for decades (doi.org)
  4. common in customer service (www.businessinsider.com)
  5. ELIZA (archive.org)
  6. Rogerian psychotherapist (www.health.harvard.edu)
  7. confided in ELIZA (confluence.gallatin.nyu.edu)
  8. a live person (cse.buffalo.edu)
  9. asked for privacy (spectrum.ieee.org)
  10. sociopolitical uncertainty (www.healthline.com)
  11. has skyrocketed (www.vox.com)
  12. a 35% increase in traffic (www.theguardian.com)
  13. overwhelmed with weekslong waitlists (www.cnn.com)
  14. build new AIs (www.diygenius.com)
  15. asking questions (emerj.com)
  16. even an artificial “someone” – cathartic (www.wired.com)
  17. but it seems so (doi.org)
  18. promising results (doi.org)
  19. reported less loneliness and anxiety (dx.doi.org)
  20. Elderly users may also benefit (doi.org)
  21. produced a cringeworthy response (youtu.be)
  22. it isn’t always an appropriate tool (www.utsa.edu)
  23. Replika (replika.ai)
  24. Tess (www.x2ai.com)
  25. Woebot (woebothealth.com)
  26. make a digital friend during a pandemic (www.nytimes.com)

Author: Laken Brooks, Doctoral Student of English, University of Florida

Read more https://theconversation.com/covid-19-has-made-americans-lonelier-than-ever-heres-how-ai-can-help-152445

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more