Eliza explores the dangers of on-demand digital therapy

Accessing mental health care without added stress can be challenging.

Eliza, the first visual novel from Zachtronics, the developer of Exapunks and Opus Magnum, explores a reality where digital therapy and AI merge to solve healthcare at scale. It’s a departure from the studio’s other titles, which felt more like puzzle games for programmers. Instead, Zachtronics’ new title is a fully voice-acted, conversation-based game where players role-play a character in a story that changes based on their decisions.

The titular technology in the game is a therapy AI that listens to patients and uses machine learning to reply with recommendations in conversational language. Think of it like an advanced chatbot you might see on an e-commerce site, but for mental health.

It’s not a ridiculous idea. Some real-world companies have already taken a modern approach by allowing patients to text their therapists at their own pace. This is how I used to do it, but I always found the process to be impersonal. The asynchronous nature of the discussion always made me feel disconnected from my therapist. Not being able to speak to someone in real time sucked much-needed empathy out of the conversation. It made me wonder if adding a digital layer to therapy was more of a hindrance than a help.

My character on the right, Evelyn, speaks with her coworker at an Eliza counseling center
Zachtronics

But sessions with Eliza are special because, instead of a machine spitting out its recommendations in a robotic voice, Eliza uses “proxies,” actual humans, who read the AI’s reactions to its patients. This is meant to add a personal touch to the sessions. It’s an interesting idea; an advanced technology meant to replace personal connections that still relies on humans to be effective. One machine can replace as many therapists as its CPU can handle, and the words themselves can be delivered by anyone who can read a script.

I take control of Evelyn, a former Eliza engineer who recently returned from a three-year hiatus from work. After being one of the people behind the AI’s design, she returns to the software not as a programmer but as one of the tool’s proxies. Why she returns to her creation at the ground level is slowly revealed throughout the story.

As in other visual novels, speaking with various characters is the main gameplay element. My responses in conversation help me determine how I role-play Evelyn, and the questions I ask help me learn about the other characters. This is standard for the visual novel genre, but Eliza introduces an interesting juxtaposition.

As a proxy, I take part in several one-on-one counseling sessions. These appointments are one of the few times in the game where I have no control over what I get to say. Eliza’s relationship with proxies is designed that way.

The Eliza AI uses conversational data to help patients
Zachtronics

When my session begins in my tiny office, a startup screen washes over the display Evelyn wears. Think of it like Google Glass, but I see a wealth of information about each patient instead of my text messages or driving directions. A display to my left shows how Eliza is interpreting everything my patients say, mostly categorizing word sentiment on a binary positive and negative scale. On the right, a complex chart shows the variations in my patient’s speech patterns, their heart rate, how much they are sweating, and more.
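The word-sentiment readout described above can be imagined as a simple lookup-based tagger. This is a hypothetical sketch, not anything from the game’s actual code; the word lists and function name are invented for illustration.

```python
# Hypothetical sketch of a binary word-sentiment readout like the
# one Eliza's display shows. Word lists here are invented examples.
POSITIVE = {"hope", "better", "calm", "proud", "happy"}
NEGATIVE = {"anxious", "alone", "tired", "afraid", "nobody"}

def tag_sentiment(utterance: str) -> list[tuple[str, str]]:
    """Label each word in an utterance as positive, negative, or neutral."""
    tags = []
    for word in utterance.lower().split():
        word = word.strip(".,!?\"'")  # drop surrounding punctuation
        if word in POSITIVE:
            tags.append((word, "positive"))
        elif word in NEGATIVE:
            tags.append((word, "negative"))
        else:
            tags.append((word, "neutral"))
    return tags

print(tag_sentiment("I feel anxious and alone."))
```

A real system would use a trained model rather than word lists, but the binary positive/negative framing the game depicts reduces to roughly this kind of per-word labeling.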

The Eliza software considers all of this data when listening to its patients, but seeing this information for myself doesn’t matter. I’m not allowed to make any decisions based on the information. My only job is to read the script given to me and try to make it sound human and warm.

When a patient named Maya talks about her inability to break into the art industry, Eliza listens and interprets her story not as the journey of a struggling artist desperate to break out of obscurity, but as data points used to craft responses that will hopefully bring in more data points. For all of its advancements, Eliza mostly seems to listen to its patients a bit before asking leading questions to get them to speak more. I don’t notice much insight being offered about their problems.

When Maya goes off on a self-defeating rant about how “no one wants her at an industry party” she attended the night before, Eliza responds by asking “How do you know it’s true that nobody wanted you there?”
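Reversing a patient’s statement into a question is the signature trick of the original 1966 ELIZA program, which the game’s AI is named after. Here is a minimal sketch of that reflect-and-question technique; the pattern and responses are invented to mirror the exchange above, not taken from the game or the original program.

```python
import re

# Minimal sketch of ELIZA-style reflection: match a fragment of the
# patient's statement, swap pronouns, and hand it back as a question.
# The single pattern below is an invented example.
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are"}

def reflect(phrase: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in phrase.lower().split())

def respond(statement: str) -> str:
    m = re.search(r"no one wants (.+)", statement.lower())
    if m:
        return f"How do you know it's true that nobody wants {reflect(m.group(1))}?"
    return "Tell me more about that."

print(respond("No one wants me at an industry party"))
# → How do you know it's true that nobody wants you at an industry party?
```

The leading questions the game’s AI asks work the same way: no understanding is required, only pattern matching and pronoun substitution, which is exactly why the sessions can feel hollow.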

Eliza takes a patient’s comment and reverses it on them as a question
Zachtronics

The writing and voice acting in all the counseling sessions are fantastic, with each patient feeling like a fully realized human being. Their anxiety, frustrations, and pain come through in each conversation. They pause mid-sentence to consider something before speaking again, get sidetracked, and catch themselves and return to their main points, like most people do when explaining how they feel for the first time.

And the more human the patients seem, the more frustrating each counseling session becomes. I’m a proxy, so I can’t react like a human being, or offer any thoughts about what’s going on. I can only read the script. Not being able to react how I want creates an internal struggle within me as a player that eventually becomes important to the game’s story.

Evelyn’s journey with Eliza weaves through the complicated past she tried to leave behind and ultimately arrives at a huge decision she needs to make about her future. Thankfully, Eliza ends with the ability to replay each of its chapters. This chance to take a different route is especially important late in the game, when I’m finally given the option to deviate from Eliza’s script, something I’ve been told not to do for hours. Being able to do so is a major turning point in the game, and it’s fascinating to see how my counseling sessions play out differently once I get to respond to my patients as Evelyn, not Eliza.

I came to this game imagining it would be a clever take on the Turing Test, a scenario designed to see if a machine can pass for a human intelligence. Instead, I explored the possible outcomes of trying to treat mental health problems at scale. Removing the humanity from the act of helping humans cope is a dystopian way of solving the problem; Eliza’s story highlights the potential pitfalls of generalizing a deeply personal process.
