
Reflecting on AI and its use in CBT

21 Aug 25

Psychological Wellbeing Practitioner and trainee Health Psychologist Antonia Ames reflects on her work using generative AI as part of delivering CBT

Antonia is a qualified PWP and trainee health psychologist, currently working to design and deliver psychological training across her local NHS Trust. She has also worked clinically with ieso.

 

As a millennial, I grew up in the era of emerging technology – as a young teen I learnt HTML coding on Myspace, and I excitedly compared the first smartphones with my friends at school.

Since I entered mental health care, we've gone from paper notes to iPads on wards. When I started work as a PWP (Psychological Wellbeing Practitioner) in 2015, we would manually print or photocopy booklets for the majority of patients.

Technology has changed a LOT since then, and even more so in the last few years since AI emerged. I find myself, once again, excited about the potential but apprehensive about its appropriate use.

Here I reflect on my use of it so far, and my thoughts on its future.

I started work with ieso, a private company providing text-based cognitive behavioural therapy (CBT), in 2022, interested in using my PWP skills in a new way. Actually, the basic work with ieso was more similar to standard PWP work than I expected. Instead of verbally asking patients how they were, or discussing an intervention, I’d type it. Live typed therapy allows a different way of accessing help that really benefits some people.

In 2023 I joined an exciting pilot – patients would be offered the opportunity to use an app for anxiety, which notably included an AI chatbot. I was to assess patients and discuss the app as a potential intervention. Participants were then assigned to the app, and I would contact them again several weeks later once the app work was done. They had contact with study staff in the interim, but in terms of my work it was much more 'hands-off'.

People liked the content of the app but found the AI part clunky and frustrating. It didn't 'get' what they were saying. This is echoed in the published results (1) – clinical outcomes are good, but the participant interviews do highlight some of these difficulties.

I left ieso in 2024 when I commenced my doctorate, but I was keen to keep my ear to the ground about emerging technologies, especially AI in therapy. I then started working with Limbic, a company that offers AI assistants for patient referral and assessment, as well as guided CBT interventions.

I’ve taken part in various tasks with them, acting as both a PWP and a patient, to test AI at different points. Some tested a triage service, whilst others were closer to a full assessment or session with the AI agent. I’ve found them really interesting, especially taking the role of the patient.

I recall, in the early days of my PWP training, learning about 'empathy dots'. These can be used when you are concentrating so hard on the process that your natural empathy is pushed aside. An empathy dot is a mark on your assessment sheet that reminds you to express empathy at various stages. It sounds very forced (because it is), but it works well whilst you learn, until you are able to both do the work and be an empathetic person.

The AI responses at times reminded me of these – they were expressing empathy but I didn’t always believe it. It felt false.

It also sometimes misunderstood what I was saying. If people are making their first contact with mental health services, it is important they are responded to kindly and accurately. We have all probably felt the frustration when an automated phone service doesn't understand our response. If an AI fails to understand a patient, it could lead to the same frustration, potentially worsening risk or leading someone to disconnect from services.

An alternative space for AI is in administrative tasks. PWPs, like other healthcare staff, are extremely busy and need excellent organisational skills to manage their heavy caseloads. I know that many people are already using AI to support their personal organisation in many ways, and it is now being explored professionally, for example using Microsoft Copilot to summarise meeting notes.

It can be a fabulously useful tool, especially for staff who are neurodivergent or who may struggle for other reasons with the 'organisation' of the role.

However, we must consider the safety of data, particularly its storage and access. Unless a specific AI tool has been checked and approved by the appropriate governing bodies, we shouldn't be inputting clinical data into it. It is important to reflect on the pros, cons and ethical issues of using AI across the different areas of your work, and to encourage discussion amongst your teams about how you are using it appropriately to support that work. Keeping up with updates in organisational advice and the latest research in the area can also help to ensure safe and effective use of these new tools.

 

If you are interested in joining the conversation about the use of generative AI in the psychological professions, register for our PPN South West event on 16 September here.

 

1. Palmer, C. E., Marshall, E., Millgate, E., Warren, G., Ewbank, M., Cooper, E., ... & Blackwell, A. D. (2025). Combining Artificial Intelligence and Human Support in Mental Health: Digital Intervention With Comparable Effectiveness to Human-Delivered Care. Journal of Medical Internet Research, 27, e69351. https://www.jmir.org/2025/1/e69351
