Pediatric Chatbot

Developing an AI chatbot that supports the communication of health information between children with cancer and their caregivers

01. Project Overview

The University of Michigan School of Information (UMSI) conducts cutting-edge research on a wide range of topics at the intersection of information, people, and technology. In the fall of my second year of college, I joined the UMSI research faculty as a research assistant, with Woosuk Seo as my mentor, to help enhance pediatric healthcare.

Role

Research Assistant

Background

In pediatric care, effective communication between child patients (ages 6-12), parental caregivers, and healthcare providers is critical for treatment and recovery. However, caring for child patients requires different communication strategies than caring for adults. Thus, researchers at the University of Michigan School of Information are designing a chatbot to bridge communication gaps in the pediatric healthcare context.

The Problem

Communicating health-related topics with children effectively poses many challenges. In 2021, our team interviewed children with cancer and their parents at the University of Michigan (CSCW'21). We discovered that children and their parents face three main communication challenges.

Research Questions

Inspired by this potential of AI-driven chatbots in pediatric communication, we aimed to answer the following research questions:

Q1

How should we design AI-driven chatbots to support the communication needs of children with cancer?

Q2

What are the expected roles and potential challenges of such chatbots in supporting communication between children, parents, and healthcare providers?

Methodology

We followed a loose interpretation of the double diamond design process to develop the chatbot system.
01

🤔💭 Children and parents have different perspectives on life with cancer.

Parents feel “endless anxiety” about their child’s state of health, leading them to place many restrictions on the child. The child does not fully understand the parent’s anxiety and does not realize the importance of the restrictions.

“At his age of 4, he [P2] assumed that everybody had hair loss and everybody had moon face and everybody had hospital stays and ports.” -P2 Mom

Duration

Sept 2022 - Sept 2024

Tools

Figma, Illustrator, Miro, Atlas.ti

Team

Woosuk Seo (PhD Candidate), Sun Young Park (Associate Professor), Mark Ackerman (Professor)

02

😡💬 There is a disconnect between the parent and child's preferred communication method.

Parents prefer to simplify difficult-to-understand medical terminology. However, children can find this misleading and even manipulative, which can lead to mistrust in their relationship.

“We [CG7 and husband] pretty much tell her [P7] everything on an age appropriate level. But she [P7] does not like to be talked to like she’s a little kid.” -P7 Mom

03

🤕💭 There is a discrepancy between the parent's understanding of the child's emotions and their actual feelings.

Children choose to hide their true feelings from their parents to avoid being treated differently or to keep their parents from worrying. Our findings revealed that they will conceal both physical and emotional pain.

“If he brings about it, we talk about it or else we don’t.” -P3 Mom

02. User Interviews

We recruited 12 child-parent pairs to interview concurrently. The child participants were between the ages of 6 and 12 and had received medical treatment for a chronic illness such as cancer, and their parents identified as primary caregivers. Each session lasted about 60 minutes and had two parts: a scenario-based interview and a design workshop.

Part I: Scenario-based Interviews

We started with a scenario-based approach, providing each pair with various scenarios that each demonstrated a different communication challenge. In particular, we used the scenarios as prompts to explore how each dyad navigates the specific contexts and how they would mitigate the communication challenges. I illustrated the comics to be easy to understand while being fun and engaging for the kids.

Part II: Design Workshop

After the scenario-based interviews, we introduced the participants to a design activity where the child-parent pairs collaborated to ideate various features of a chatbot that could enhance their communication and bridge any communication gaps. We provided the kids with colored pencils to imagine their ideal chatbot.

Data Analysis

The design sessions were audio recorded with the participant’s consent, and we used the transcripts to collect data that guided the development of the Child Bot and Expert Bot prototypes. Below are the main themes and patterns we discovered.

Children want a safe place where they can share their true emotions without judgement.

❤️ Emotional Outlet

Children just want to be listened to, reassured, and validated. They expect the chatbot to take on the role of a supportive friend, rather than a parent or doctor.

🫂 Emotional Support

Children expect the chatbot to be about the same age or slightly older. They also want the chatbot to have also undergone cancer treatment so it is easier for the chatbot to understand the child’s experience.

😊 Chatbot Persona

Children and parents expect to trust the chatbot’s ability to keep conversations confidential. The chatbot should ask for the child’s consent before sharing any information with parents or health providers.

🙊 Consent & Confidentiality

Parents expect the chatbot to alert them if there is a serious problem, regardless of consent.

⚠️ Safety First

Children and parents both expect the chatbot to act as an intermediary that helps relay health-related information.

🤖 Chatbot Role

Parents want the chatbot to be appropriate for their child’s age. They do not expect the chatbot to be completely honest about medical treatments and procedures, especially if the child asks about treatment success rates or death. Instead, parents want the chatbot to redirect this type of conversation to a trusted adult.

⚖️ Appropriate Content

03. Prototype

Meet ChaCha! Our chatbot model, ChaCha, combines a state machine and large language models (LLMs). It was developed by research scientists at NAVER Cloud AI Lab in collaboration with members from our team.

ChaCha Conversation Flow

ChaCha is trained to carry free-form conversations and support children as they share their emotions. ChaCha does this by 1) identifying an emotion the child has through conversation, 2) connecting the emotion to a reason, 3) empathizing with the emotion, and then 4) encouraging the child to share the emotion with a trusted adult and helping them gain the courage to do so.
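The four-step flow can be pictured as a small state machine that selects a phase-specific instruction for the underlying LLM. The sketch below is purely illustrative, not ChaCha's actual implementation: the phase names, prompt texts, and the `goal_met` signal (which, in a real system, some classifier or heuristic would supply) are all assumptions.

```python
from enum import Enum, auto

class Phase(Enum):
    EXPLORE = auto()    # 1) identify an emotion through conversation
    LABEL = auto()      # 2) connect the emotion to a reason
    EMPATHIZE = auto()  # 3) empathize with the emotion
    SHARE = auto()      # 4) encourage sharing with a trusted adult

# Hypothetical per-phase instructions handed to the LLM as a system prompt.
PHASE_PROMPTS = {
    Phase.EXPLORE: "Ask gentle questions to help the child name an emotion.",
    Phase.LABEL: "Help the child connect the named emotion to a reason.",
    Phase.EMPATHIZE: "Validate the emotion like a supportive friend.",
    Phase.SHARE: "Encourage the child to share this feeling with a trusted adult.",
}

ORDER = [Phase.EXPLORE, Phase.LABEL, Phase.EMPATHIZE, Phase.SHARE]

class ChatFlow:
    def __init__(self, llm):
        self.llm = llm  # any callable: (system_prompt, user_msg) -> reply
        self.idx = 0

    @property
    def phase(self):
        return ORDER[self.idx]

    def step(self, child_message, goal_met):
        """Reply in the current phase; advance once the phase's goal is met."""
        reply = self.llm(PHASE_PROMPTS[self.phase], child_message)
        if goal_met and self.idx < len(ORDER) - 1:
            self.idx += 1  # move to the next phase; stay put in SHARE
        return reply
```

The state machine constrains an otherwise free-form conversation: the LLM generates each reply, but only under the instruction for the current phase, which is one way to combine the two components mentioned above.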

Adapting ChaCha

For this project’s purposes, ChaCha was adapted to fit the pediatric healthcare context. We used ChaCha’s original language model to create two new chatbots: Child Bot and Expert Bot.

Child Bot is trained to ask a child with cancer about their health concerns and provide medical-related advice.

Child Bot starts the conversation with the child by introducing itself as a child of the same age with the same diagnosis. The Child Bot also asks about hobbies to build rapport so that the child feels more comfortable.

Child Bot then uses a scenario-based approach to understand the child’s experience and discover any communication challenges the child has been experiencing.

The Child Bot provides guidance and resources to the child and encourages them to communicate with their caregivers.

🧒🏻 Child Bot

Expert Bot is trained to provide health insight and communication advice to caregivers of children with cancer.

Expert Bot starts the conversation by introducing itself to the caregiver as a chatbot that can help with pediatric healthcare.

Expert Bot then provides the caregiver with the same scenarios the child receives and helps the caregiver brainstorm different solutions from the caregiver’s perspective.

👩🏻 Expert Bot

Chatbot System

The chatbot system acts as a bridge between the child and parent. Children can use the Child Bot to gain personalized guidance and feel empowered to communicate health information with adults. Parents can use the Expert Bot to better understand their child’s perspective and receive personalized guidance and healthcare resources. Health providers can also use the chatbot data (AI-generated summaries) to support clinical sessions.
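Any such bridge has to reconcile two of the interview themes above: consent and confidentiality (the child must agree before anything is shared) and safety first (parents are alerted to serious problems regardless of consent). The actual sharing policy is not something our prototype spells out; the snippet below is a hypothetical sketch of how those two rules could compose, with the function name and recipient labels invented for illustration.

```python
def route_summary(summary, child_consents, safety_flag):
    """Decide who may see an AI-generated session summary.

    Hypothetical policy: a safety concern overrides consent so the
    caregiver is always alerted; otherwise nothing leaves the chat
    without the child's consent.
    """
    recipients = []
    if safety_flag:
        recipients.append("caregiver")  # safety first, regardless of consent
    elif child_consents:
        recipients.extend(["caregiver", "provider"])
    # no consent and no safety concern: the summary stays private
    return recipients
```

Note that the safety branch deliberately alerts only the caregiver rather than broadly sharing the summary, reflecting the parents' expectation of being told about serious problems without fully voiding the child's confidentiality.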

04. Exploratory Testing

We recruited 15 health professionals (social workers, psychologists, therapists, and nurses) who have experience working with children ages 6-12 to review our chatbot system. These exploratory tests were conducted in a private room at the hospital, and each session lasted about 60 minutes. The tests aimed to identify pediatric healthcare experts’ perspectives on and expectations of our prototype for supporting communication with child patients in pediatric care contexts.

Testing Procedure

The interviews involved an introduction, a prototype review, and a debriefing. Each participant freely chatted with the Child Bot and was asked to think aloud about their impressions of the Child Bot’s responses during the interaction. After interacting with the Child Bot for 10-15 minutes, the participant conversed with the Expert Bot, pretending to be the parent of the child persona. Lastly, during the debrief, we asked questions about how our chatbot system could enhance child-parent communication and how it would support or hinder their practices. Through these questions, we aimed to explore the participants’ perspectives and potential concerns about AI-driven chatbots in pediatric care.

Data Analysis

We analyzed the transcripts to identify the professionals’ perspectives and expectations. We coded the transcripts and organized the codes by theme. Through group discussions, we compared, discussed, and revised the recurring themes until agreement was reached.

Themes & Patterns

We encountered many recurring themes and patterns. Below are a few of the main themes and patterns we discovered.

Participants agree that the Child Bot provides children a safe place to express their feelings without judgement or bias. Children with cancer are often isolated from their peers and rarely get the opportunity to share how they feel with someone who can listen and help them feel less lonely.

❤️ Emotional Outlet

Participants find it impactful that the Child Bot takes on the persona of a pediatric cancer patient, which makes the child more comfortable opening up. The Child Bot also starts the conversation by discussing hobbies, a good strategy for building a stronger connection that leads to more open conversations.

🤝 Building Trust

The Child Bot is online 24/7, so it is available to chat whenever the child needs. The timing of addressing a child’s needs is critical, as they often experience emotional fluctuations throughout the day.

🕓 Accessibility

Some participants expressed concern that a child could become over-reliant on the Child Bot, which could actually hinder the child’s communication with their parent. They suggested having the Child Bot set boundaries, such as a time limit.

⚠️ Over-Reliance

Some participants expressed concerns about the Child Bot not being able to appropriately handle a child’s emotions. It is crucial to draw out, address, and resolve a child’s emotions.

❤️‍🩹 Unresolved Emotions

All participants agreed that the Expert Bot should help parents reflect on their communication practices with their children. The Expert Bot needs to ask thought-provoking questions, which is important because parents rarely get the chance to think about communication issues. The scenario-based approach lets parents see an unbiased perspective.

💭 Tool for Reflection

Participants believe the Expert Bot can help parents confirm or improve their communication behaviors. Parents often seek confirmation, so it is important for the Expert Bot’s guidance to be clear.

✅ Tool for Verification

The Expert Bot may cause parents to feel guilty if they have not been addressing their child’s pain appropriately.

😓 Intimidation

Healthcare providers can use the chatbot system to evaluate the communication practices of child-parent pairs. All participants viewed the chatbot system as a supplementary source of data for their practice: the chatbots can gather data to use as a starting point for clinical sessions.

🔍 Tool for Assessment

05. Conclusion

Overall, our research has shown a lot of potential and promise in using AI chatbots for communication between children with chronic illnesses, their caregivers, and health providers. We believe AI chatbots can address the unmet needs of these stakeholders. We recently wrote and submitted our research paper to CHI 2025, where it is currently under review. Fingers crossed! Here is a link to the PDF, which includes much more info on our process and findings.

Rose

I am really proud of my contributions to this research project. I created many of the interview materials, such as the scenario illustrations, which helped make the interviews extremely successful. We collected a wealth of qualitative data on children’s and parents’ needs, preferences, and expectations for the chatbot, which guided the development of the prototype. I am also extremely excited to be listed as a contributor on the paper. Hopefully this will be my first published research paper!

Thorn

I found designing interviews for our user base challenging. Because we were working with kids, we struggled with how to communicate complex and serious concepts in an understandable and engaging way. Many of them did not really grasp the concept of effective communication, which led us to design a set of comic-style scenarios that break the communication challenges down into their simplest form. In addition, many of our participants had disabilities, such as cognitive delay, so we wanted to support each child as best we could throughout the interview.

Bud

This project was the first time I conducted user interviews in the real world. At first, I only took notes during the interviews, but after a couple of sessions, my mentor had me take the lead as facilitator. I was extremely nervous to guide the conversations, especially with children, but it turned out to go pretty well. Still, there is a lot of room for improvement in how I conduct interviews, like making conversation shifts and follow-up questions feel more natural. The hardest part is definitely thinking of meaningful follow-up questions on the spot.