Can AI really help your mental health? 🤨
Some tools, prompts and extra resources for anyone who needs them.

Hi Non-Techies,
I’ve spoken about mental health in this newsletter a fair bit before. Some of our long-time readers might remember this email, in which I talked about how my custom anxiety GPT once calmed me down during an anxiety attack in the back of a taxi.
Well, things move fast in AI (have I mentioned that before?), so today I’m sharing some more tools, resources, prompts and thoughts on how AI-powered mental health support is evolving.
With the exception of some of the tools (which have paid tiers), everything mentioned in this newsletter is free.

The AI for Mental Health webinar 🎬
I hosted a webinar last week about AI for mental health. On the panel were three experts in this department:
Jake Mills, the CEO of the mental health charity Chasing the Stigma, which runs the mental health support directory Hub of Hope.
Nick Jemetta, an AI coach and mental health speaker.
Chris Reinberg, the CEO at Mindsera (an AI-powered journal).

The four of us spoke about our own mental health stories, but also discussed the tools we use (and avoid), some prompts and, perhaps most importantly, the risks and limitations to using AI for mental health.
If you’re the type that prefers to watch rather than read, you can watch the webinar here:

Three AI tools for mental health ⚒️
AI isn’t just for creating fun pics and drafting delicate emails. The development of AI-powered mental health support tools has been a fantastic application of AI’s immense potential.
Here are three tools we’d recommend for slightly different needs:
1) Headspace (feat. Ebb)
Headspace has been around since 2010, long before AI became the thing even your Nan talks about.
Predominantly, it functions as a meditation and sleep assistant, but its new AI co-pilot, Ebb, can do much more.
As Nick shared in the webinar, he was impressed by the signposting to external, real-world services that Ebb provided.

2) Mindsera
I love Mindsera, in part because I think journaling is seriously underrated. It’s like going to the gym, but for your mind (way less sweaty, most of the time).
But Mindsera isn’t just a journal. As its homepage declares, it’s a journal that reflects back. It encourages you to explore particular feelings, invites you to dig deeper, and guides you towards getting more out of your journaling.
Super simple on the surface, but with some features that make great use of the science of mental health.
3) Pi
Don’t worry, I’m not talking about the mathematical concept here. Pi.ai is a chatbot not dissimilar to Claude or ChatGPT, but with an emphasis on having empathetic, supportive conversations.
If you’re wrestling with something and you need a conversation about it, Pi is a great option.
I know a lot of people using Pi, and its simple, elegant interface makes it super Non-Techie-friendly too.


The risks of using AI for mental health 🧐
“Any AI tools you’re using outside of your core, you wanna be looking for whether they’re talking about safety”.
I couldn’t agree more with this point from Nick during the webinar. As with all burgeoning tech, the speed of Techie progress in AI is outpacing the speed with which effective safety measures can be put in place. That’s why I love it when tools are built with a safety-first approach from the start.
I’m using “safety” as a bit of a catch-all term here. In the literal sense, it means ensuring that the tools and resources you’re using are dedicated to safeguarding your well-being above all else, signposting to other services when needed.
But it can also mean the way your data is used, how often a tool hallucinates or the baked-in biases that are so common in even the most sophisticated AI software.
And not every tool is getting this right. I tested Replika, a tool designed for young people that lets users create avatars with which to discuss mental health issues. It’s a great idea in theory, but in practice, the responses to my test questions were concerningly dismissive in my view.

As Jake from Chasing the Stigma explained, mental health awareness is increasing in the UK, which is amazing, but services are stagnating. AI tools can bridge that gap, but that’s a huge responsibility for burgeoning tech, and it’s critical they get it right.
Remember: Every responsible AI tool for mental health will have a dedicated, comprehensive section on its website about safety.

Some more helpful resources:
💝 My mental health prompts sheet. These work for me, but feel free to customise them for your own needs!
🧠 My custom anxiety GPT. Again, this was designed for me, so…
🔧 My custom GPT builder tool, for those of you who want to build your own custom GPTs.

I love that AI is being used to provide easy-to-access, affordable mental health support to anyone who needs it, and I’m excited to track its progress.
But whilst AI tools can provide a great starting point, it’s critical that they’re built responsibly, with signposting to other services as standard. Anything less, and the consequences could be devastating.
Thanks for reading - my replies are always open!
See you next week,
Heather and the AIFNT team.
