Can AI help future social workers enhance their counselling skills?
Categories: Amar Ghelani, Faculty, Programs + Teaching
From left to right: Master of Social Work students Aiisha Rishi, Priyanka Mahey, Kyra M., and Sucdi Addow, with Assistant Professor Amar Ghelani
The client’s name is Alex. A senior in high school, Alex is stressed about mounting assignments, university applications and meeting their parents’ expectations of getting into a top school.
“Everyone keeps talking about where they’re applying and what scores they got,” Alex writes to their social worker in an online chat. “My parents really want me to get into a good school, and I do too, but it’s like the harder I try, the more I freak out. Sometimes I just… don’t even start things because I’m scared I’ll mess them up.”
If you were a social worker, how would you respond?
For social work students new to counselling, considering what to say next can be a challenge. The good news is, they don’t have to worry about saying the wrong thing. Alex is a simulated AI client created through a prompt fed into Microsoft Copilot, a generative AI-powered tool that all University of Toronto students, faculty and staff can access through their U of T accounts.
The idea of creating a simulated client that students could interact with in the classroom came to Amar Ghelani one day after thinking about the growing use of online chat-based counselling services. Kids Help Phone, for example, now offers young people the option of reaching out to a counsellor (a real one, not a chat bot) through a secure messaging system. Other crisis lines have been following suit. The Centre for Addiction and Mental Health (CAMH) recently posted job opportunities for social workers who could support individuals from marginalized communities who would prefer to interact through an online chat or text messaging format.
“My thought was ‘how do we teach students to engage with clients through text?’” said Ghelani, an Assistant Professor, Teaching Stream at the University of Toronto’s Factor-Inwentash Faculty of Social Work (FIFSW). “And then I thought ‘Copilot is very capable. Could we get it to simulate a client, like a young person in crisis, that students could engage with in class?’”
The answer, he discovered with help from colleagues across the university and his students, is yes.
Master of Social Work student Priyanka Mahey said the idea of counselling an AI bot sounded unnatural at first.
“Walking into it, I didn’t really think that I would learn anything or be engaged in the process,” she said. “But after the first five minutes, I found myself asking more questions, and I felt like I was talking to an actual individual.”
Ghelani and course instructors piloted the program across three sections of the Social Work Practice in Mental Health course, marking the first time an AI-simulated client has been used as a teaching tool in U of T’s social work program. The exercise provides a new twist on simulated learning, an experiential program that typically engages actors to play the role of clients with whom students can practice their clinical skills. The Toronto Simulation Model, developed at the Faculty by the late Marion Bogo, is revered and used worldwide.
Ghelani is now focused on evaluating the effectiveness of the AI-based simulation he used in class. Professor Eunjung Lee and Michael Cournoyea, Assistant Professor, Teaching Stream with the Health Sciences Writing Centre, provided support with design and implementation along the way.
“One interesting thing about chat bots is that students can go home and try it out themselves,” says Cournoyea, who also once played the role of a simulated client himself for in-person simulations. “So there’s really an opportunity, in any moment, whenever they have the capacity, to test out their skills in interviewing, and to see what that might be like.”
To create a simulated client, Ghelani and his collaborators first used AI to help them put together a detailed case study for 17-year-old Alex — the fabricated high school student struggling with anxiety. They then asked Copilot to use the information from the case study to help them write a detailed prompt that each student could use to start the conversation.
While all the students started with the same prompt, their conversations with Alex went in different directions based on the questions that they subsequently asked. For 20 minutes, each student worked independently on their own laptops, immersed in therapeutic engagement and assessment, as instructors supervised and provided guidance.
Prior to engaging in the simulation, the students in Ghelani’s class learned about anxiety: what it is and different ways to assess it. Putting these learnings into practice through simulation helped them better understand anxiety in context, he says. “Through the simulation, they were really able to examine the situations that contributed to the simulated client’s anxiety, what anxiety meant to them, the behaviours, the physical feelings, and the thoughts related to it.”
When the 20 minutes were up, the students got into groups of three or four to discuss their experiences, guided by four short questions that their instructor provided: How did you know Alex was distressed? What assessment techniques were effective? What did you learn about your counselling approach? And importantly, how did you feel during the simulation?
The small group discussions later gave way to a broader class discussion, during which the students were asked to provide feedback on their general experience of the exercise and whether or not they found it helpful.
“I found it to be really good in terms of practicing how to ask the right questions,” said Sucdi Addow, who was in Ghelani’s class. “We were practicing how to do a mental status exam, and it was helpful to pull from the kind of questions that are part of the exam and ask them directly in a way that felt natural.”
Kyra M., another student in the class, described the experience as “pretty low stakes, overall. It was a good way to practice skills without being overly concerned about how you’re going to do it.”
The biggest critique the students had was that even though the prompt asked the AI to be resistant to sharing information (as a teenager with anxiety seeking help from a stranger might be), “Alex” was particularly forthcoming, providing paragraph-long, detailed responses. The information that the AI chat bot shared was realistic; the ease with which it shared it was not. “A real young person might just provide one-word answers. Or it might take them five minutes to respond. Copilot was very quick and detailed,” said Ghelani. “That’s something we would try to fix in the future.”
But even with that issue, Ghelani says the experience still provided the opportunity for students to practice their social work skills. “My hope is that the experience can help enhance their assessment competencies while encouraging them to think deeply about the questions they ask their clients,” he says. “And hopefully it can spark some discussion about the ethics of using AI in social work education and practice.”
So what is next? Ghelani hopes to follow up with a proper study to assess whether or not the exercise can have a positive impact on students’ confidence and competence to assess anxiety and depression.
More broadly, as AI usage grows among students in general, Ghelani is keen to discover how it can be used to enhance learning rather than undermine it. “I’m hoping that if students are using AI, they’re using it productively and in a way that improves their critical thinking abilities,” he says. “This technology is not going anywhere. It’s just going to get more capable and more pervasive, and the people who are using it are going to advance in their AI skills much more quickly.”