A Chatbot That Can Comfort Dying Patients

A chatbot that comforts patients

Timothy Bickmore, a human-computer interaction expert in the College of Computer and Information Science at Northeastern University, discusses the concept behind his tablet-based chatbot, designed for terminally ill patients in the last years of their lives.


Patients with a serious illness may receive palliative care, a type of medical care focused on relief from the symptoms and stress of the illness. The goal is to improve quality of life and alleviate suffering for both the patient and the family. But this care is often introduced only late in a patient's life. The primary motivation of Bickmore's project is to offer this kind of help and counseling months earlier, and to determine whether patients need palliative care services and, if so, refer them.


Over the past 10 years, Bickmore's lab has developed animated health counselors, and this chatbot is the latest iteration of that ongoing effort. The chatbot can act as a non-judgmental, non-threatening virtual friend that comforts patients and provides a range of services, from advice on medications for, say, pain management, to meditation for dealing with stress, to symptom tracking and alerting.


He said, “We worked on this project in collaboration with Boston Medical Center. We had more than a year of weekly meetings with BMC where we designed all the modules of this system.” Among the range of topics it can discuss, the chatbot also provides spiritual counseling and physical activity promotion.


The chatbot is an animated character backed by a dialogue engine, and it runs on a tablet. When started, the character addresses the patient by name, saying something like “Hi, Bob, how are you doing this morning?” and “What can I help you with right now?” Throughout the dialogue, user input is constrained to multiple-choice options on the touch screen. Bickmore says, “We did that for safety reasons. Since patients use this as a health oracle, we don’t ever want to get into a situation where it’s giving inappropriate advice that could cause harm.”
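The constrained-input design described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not Bickmore's actual system: the dialogue is modeled as a graph of prompts, and the only inputs the engine accepts are the choices offered by the current node, so the agent can never be steered off its vetted script.

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    prompt: str  # what the animated character says at this step
    choices: dict = field(default_factory=dict)  # choice label -> next node id

class ConstrainedDialogue:
    """A dialogue engine that only accepts predefined multiple-choice input."""

    def __init__(self, nodes, start):
        self.nodes = nodes
        self.current = start

    def prompt(self):
        return self.nodes[self.current].prompt

    def options(self):
        # The UI would render these as touch-screen buttons.
        return list(self.nodes[self.current].choices)

    def select(self, label):
        # Free text is rejected outright: only vetted choices advance
        # the dialogue, which is the safety property the article describes.
        choices = self.nodes[self.current].choices
        if label not in choices:
            raise ValueError(f"{label!r} is not one of the offered choices")
        self.current = choices[label]
        return self.prompt()

# Hypothetical dialogue graph for illustration only.
nodes = {
    "greet": DialogueNode("Hi, Bob, how are you doing this morning?",
                          {"I'm in pain": "pain", "I feel stressed": "stress"}),
    "pain": DialogueNode("Let's review your pain medication schedule.", {}),
    "stress": DialogueNode("Would you like to try a short meditation?", {}),
}
bot = ConstrainedDialogue(nodes, "greet")
```

Because every transition is authored in advance, clinicians can review exactly what advice the system is able to give, at the cost of the open-ended conversation a free-text chatbot would allow.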


Anisha Naidu


iamanishanaidu@gmail.com

A strong believer in karma. Loves music and indulges in deep thoughts. Prefers the company of dogs over humans and wishes to be a person who speaks many languages.
