The Loop by Twill

How Conversational AI Can Help Engage Hard-to-Reach Populations like Medicaid

Written by Ran Zilca | April 20, 2022 at 7:43 PM

A growing number of artificial intelligence (AI) tools are making their way into patient care, prompting clinicians and health technology experts to ask how these tools were built and what they are capable of.

The White House’s State of the Union pledge to invest more than $6 billion in mental health services to combat depression, with an emphasis on populations that have been historically hard to engage, like Medicaid beneficiaries, presents new opportunities for AI chatbots to play a larger and more impactful role in providing mental healthcare.

The Mental Health Inflection Point, Lack of Providers, & Need for Therapeutic AI

The global demand for mental healthcare is so severe that President Biden pledged more than $6 billion in his 2023 budget to bolster mental health support services in the U.S.

The administration emphasized their focus on serving “Black and Brown communities [who] are disproportionately undertreated even as their burden of mental illness has continued to rise.” These groups have higher levels of unrecognized depression and account for more than half (61%) of all those covered by the Medicaid insurance program.

With the demand for mental health services at an all-time high and providers already strained, the need for clinically trained AI to provide behavioral health support has never been more urgent.

 A paper I co-wrote with the Twill Labs team details the impact that our AI-powered therapeutic assistant, Anna, had on users. The results and implications are both promising and inspiring: 

  • During a therapeutic interaction on Twill, people wrote twice as much, and their writing showed greater adherence to the activity’s intention, when Anna guided them through the activity than when she was not present.

  • People who interacted with Anna made a connection with her; 3 out of 4 users said Anna listened to them, was curious about them, and gave them useful insights.

  • Studies have shown that some people feel more comfortable confiding in a bot than in a human because they feel less judged. This is critical when considering how to engage populations that have traditionally had difficulty accessing care and that distrust the healthcare system; they stand to benefit the most from AI-based therapeutic agents like Anna, who can build trust and personalize the care journey.

The Chatbot (R)evolution

Artificial intelligence in health tech is nothing new; it has been deployed across numerous software-enabled platforms to power everything from content personalization and diagnostic tools to digital phenotyping and biomarker/symptom monitoring.

The desire to build a conversational AI agent has always been there, but the technology to support it matured only recently. Back in the early 2000s, when I worked in the speech and natural language group at IBM Research, we frequently discussed the notion of creating “helping conversations” with machines. In 2009, as founder and CEO of a company that released the first positive psychology app, I wrote two blog posts about this topic in Psychology Today.

In the early 2010s, the deep learning research community introduced a series of breakthrough algorithms that finally made it possible to understand natural language in depth and to generate natural-language responses. This was the innovation that made it possible to create what we now call chatbots.
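To make that shift concrete, here is a minimal, hypothetical sketch of a single chatbot turn assembled from off-the-shelf components: one model to understand the user's message and another to generate a reply. This is not Anna's architecture or Twill's stack; the Hugging Face pipelines and the GPT-2 model below are stand-ins chosen purely to illustrate the understand-then-generate pattern.

```python
# A minimal "understand, then respond" turn built from off-the-shelf
# Hugging Face pipelines. The models are illustrative stand-ins only.
from transformers import pipeline

understand = pipeline("sentiment-analysis")           # NLU: how does the user feel?
respond = pipeline("text-generation", model="gpt2")   # NLG: draft a reply

user_message = "I've been feeling stretched thin at work lately."

# Step 1: understand the message (here, just its overall sentiment).
sentiment = understand(user_message)[0]               # e.g. {"label": "NEGATIVE", "score": 0.99}

# Step 2: generate a natural-language reply conditioned on what the user said.
prompt = f'The user said: "{user_message}" A supportive reply:'
draft = respond(prompt, max_new_tokens=40, num_return_sequences=1)[0]["generated_text"]

print(sentiment["label"], "->", draft)
```

Production systems layer a great deal on top of this loop, such as safety checks, dialogue state, and clinically authored content, but the understand-then-generate structure is the core that those deep learning breakthroughs unlocked.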

But are people willing to be vulnerable and talk about sensitive issues with a bot? Surprisingly, the research says yes. Compared to human beings, chatbots are perceived as less judgmental; in fact, some people prefer to interact with chatbots over mental health professionals, which is especially important for populations that may be reluctant to seek out in-person therapy.

 Creating Anna & the Importance of the Therapeutic Alliance 

When I joined Twill in 2015, I had already developed a prototype and filed a patent for a conversational AI agent called “Liz,” and I immediately got to work building Anna. I envisioned Anna both forming a true therapeutic alliance with users and delivering interventions effectively and with fidelity, in order to maximize benefits and improve outcomes.

The most potent ingredient in talk therapy is the therapeutic alliance: the bond formed between the therapist or coach and the client. We recognized very early on that this was both the biggest challenge and the biggest opportunity in designing Anna, so we focused much of our energy there.

We also knew that only so much could be borrowed from the research on the therapeutic alliance between two humans, because the human-to-machine relationship is different.

As a starting point, we defined the following statements we’d like users to be able to say about Anna: 1) she listens to me; 2) she cares about me; 3) she is curious about me; and 4) she knows me well.

 We are now in the midst of a study to quantify the essential components of human-to-machine therapeutic alliance by analyzing an exhaustive list of possible relationship characteristics.

Therapeutic Intelligence 

Anna also had to be trained in the multiple therapeutic disciplines that Twill’s interventions are drawn from. Her dialogues are authored by a large transdisciplinary team of clinicians, scientists, psychotherapists, and chronic disease experts across the company. This is why when we talk about Anna’s capabilities, “Therapeutic Intelligence” is more precise than AI. 

"AI" is too broad, and "empathic AI" or "artificial emotional intelligence" are too narrow. “Therapeutic Intelligence” encapsulates the full cycle of health: prevention, promotion, recovery, and treatment, and also describes the nature of the relationship Anna facilliates (a therapeutic alliance).

We launched Anna on Twill’s consumer platform in late 2019, incorporating her into some of the most popular tracks (4-week self-guided modules). The next step was to find out from users how they felt about her and if their engagement on the platform changed as a result. 

 Initial Perceptions of Anna & Her Impact on Engagement

In the paper authored by the AI & Research teams at Twill, we share findings from a pilot test we ran on users interacting with Anna. 

We were excited to see that 89.6% of the 203 surveyed users rated Anna as helpful. When these users were asked to evaluate Anna on a series of attributes, these were the results:

  • 74.9% agreed/strongly agreed with the statement “Anna listens to me”
  • 73.3% agreed/strongly agreed with the statement “Anna is curious about me”
  • 76.8% agreed/strongly agreed with the statement “Anna gives me insights that I can use”

In another pilot study, we explored how Anna influences engagement within Twill activities. We found that participants who received versions of Twill activities led by Anna wrote lengthier, more elaborate responses than those who completed the activities without her guidance. These are preliminary results, but still promising and incredibly useful as we look to further optimize Anna and expand her capabilities across the Twill platform. 

Research Implications 

The mental health fallout from the pandemic, coupled with the provider shortage, has increased demand for digital mental health solutions. Unfortunately, many medical and health apps suffer from low engagement, with retention after 30 days in the low single digits.

The existing literature, which we review in the paper, suggests that AI, and conversational AI agents in particular, can have dramatic effects on engagement with digital mental health interventions (DMHIs). AI-based agents like Anna can add a deep level of personalization that comes from being familiar with users’ lives over time. When a user shares details with Anna, like the name of a spouse or a favorite outdoor activity, she remembers and references them appropriately in subsequent conversations. This is just one relatively simple example of how Anna forms relationships with users, which can drive sustained engagement.
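To illustrate how that kind of lightweight memory could work, here is a hypothetical Python sketch. The UserMemory class and personalize_reply function are invented for this example and are not Twill's implementation; they simply show the idea of storing facts a user has shared and weaving one back into a later reply.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class UserMemory:
    """Hypothetical long-term memory for a conversational agent.

    Stores simple facts a user has shared (e.g., a spouse's name or a
    favorite activity) so later replies can reference them.
    """
    facts: Dict[str, str] = field(default_factory=dict)

    def remember(self, key: str, value: str) -> None:
        self.facts[key] = value

    def recall(self, key: str) -> Optional[str]:
        return self.facts.get(key)


def personalize_reply(memory: UserMemory, base_reply: str) -> str:
    """Weave a remembered detail into a reply when one is available."""
    activity = memory.recall("favorite_activity")
    if activity:
        return f"{base_reply} Maybe set aside some time for {activity} this week?"
    return base_reply


if __name__ == "__main__":
    memory = UserMemory()
    # Facts captured in earlier conversations (the extraction step is not shown).
    memory.remember("spouse_name", "Sam")
    memory.remember("favorite_activity", "hiking")

    print(personalize_reply(memory, "It sounds like this week felt heavy."))
```

The hard design questions sit above this toy version: deciding which details are worth keeping, and when it is appropriate, and welcome, to bring one back up.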

 For groups of people that have been historically difficult to activate and/or are likely to miss or ignore their depressive symptoms, the implications here are potentially game-changing. 

Consider that Twill’s analysis of the National Health and Wellness Survey found that one in six people has unrecognized depression, and that the prevalence of depression is higher among BIPOC and other non-white ethnic groups. In the Medicaid population, 34%, or more than a third, have undiagnosed mild to severe symptoms of depression.

The average person will spend eight years living with depressive symptoms before addressing them. Many individuals on Medicaid deeply distrust the healthcare system and may not know they have access to mental health services.

If destigmatized AI-powered solutions can reach some of these people, engage them sooner, and triage them to the right level of care, we could potentially save a lot of suffering and relieve some of the burden on overwhelmed clinicians.

 Anna 2.0

We are currently working to deploy Anna more broadly, embedding her not just within tracks but throughout the Twill user experience, beginning at onboarding. We’re also constantly adding new “ears” to Anna so that her listening skills can generate more reflective and attuned responses. This will make her more sensitive and better able to pick up on the subtle ways users express themselves and on the psychological cues that can be inferred from their input.
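As a toy illustration of what one such "ear" might look like, here is a hypothetical Python sketch that scans a message for a few cue phrases and chooses a more reflective follow-up when it finds one. Anna's actual models are far richer than this; the CUES table, detect_cues, and reflective_reply below are invented solely for illustration.

```python
import re
from typing import List

# A deliberately simple, illustrative "ear": each cue maps a few phrases to a
# reflective follow-up. Real models would be learned from data, not hand-written.
CUES = {
    "overwhelm": {
        "patterns": [r"\btoo much\b", r"\boverwhelm", r"\bcan'?t keep up\b"],
        "reflection": "It sounds like a lot is piling up right now. What feels heaviest?",
    },
    "loneliness": {
        "patterns": [r"\balone\b", r"\blonely\b", r"\bno one\b"],
        "reflection": "Feeling disconnected is hard. Who do you wish you could lean on?",
    },
}


def detect_cues(message: str) -> List[str]:
    """Return the names of cues whose phrases appear in the message."""
    found = []
    for name, cue in CUES.items():
        if any(re.search(p, message, re.IGNORECASE) for p in cue["patterns"]):
            found.append(name)
    return found


def reflective_reply(message: str) -> str:
    """Answer with a reflective template for the first detected cue, if any."""
    cues = detect_cues(message)
    if cues:
        return CUES[cues[0]]["reflection"]
    return "Tell me more about that."


if __name__ == "__main__":
    print(reflective_reply("Work has been too much lately and I can't keep up."))
```

A real system would learn these cues from data rather than a hand-written table, but the shape of the loop is the same: detect a cue, then respond in a way that reflects it back.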

We’re also contemplating Anna's unique, expressive voice and how we represent her to the different people she'll interact with. That includes teaching her multiple languages (she is very close to speaking LatAm Spanish!) and considering how she incorporates non-verbal cues, like facial expressions and vocal tone subtleties.  

 Those types of capabilities can only enhance the depth of the therapeutic alliance Anna forms with people as she guides them towards healthier behaviors.

Our preliminary data on Anna aligns with the existing research on chatbots in healthcare settings. We believe Anna (and other therapeutic agents like her) may be the key to high activation and retention on digital health platforms, especially in hard-to-reach populations like Medicaid members. Investment in therapeutic AI is a crucial component of addressing the global mental health crisis and bridging the care gaps that impact our most vulnerable communities.