Why AI Therapy Appeals to Gen Z — Implications for the Future of Mental Health
Artificial intelligence isn’t just powering tech companies anymore — it’s becoming a surprising new player in the mental health and wellness world. From robot “pets” that ease loneliness to apps that help track your moods, the latest AI tools are designed to make self-care feel more personal, supportive and accessible.
One of the buzziest new inventions is Kakaloom’s AI-powered companion, a cute robotic “pet” that hums, reacts to touch, and even adjusts its mood based on yours. It’s designed to offer the calming presence of a real animal, helping to ease loneliness and stress.
Another standout is LoomMind, an AI mental health platform that acts like a digital wellness coach. It offers personalized journaling prompts, mood tracking and mindfulness routines, all tailored to your emotional patterns. Therapists and wellness pros can also use it to better understand clients and support long-term resilience.
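For the technically curious, here’s a rough idea of what “tailored to your emotional patterns” can mean in practice. This is a minimal, hypothetical sketch, not LoomMind’s actual code: the MoodEntry class, the 1-to-5 score scale and the prompt lists are all invented for illustration.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean

@dataclass
class MoodEntry:
    """One self-reported check-in, scored 1 (very low) to 5 (very good)."""
    day: date
    score: int

# Hypothetical prompt pools keyed by recent mood trend.
PROMPTS = {
    "low": "What felt heaviest today, and what is one small thing that helped?",
    "steady": "What is one moment from today you want to remember?",
    "high": "What is going well right now, and how can you keep it going?",
}

def pick_prompt(entries: list[MoodEntry]) -> str:
    """Choose a journaling prompt based on the average of recent mood scores."""
    if not entries:
        return PROMPTS["steady"]
    avg = mean(e.score for e in entries[-7:])  # look at roughly the past week
    if avg < 2.5:
        return PROMPTS["low"]
    if avg > 3.5:
        return PROMPTS["high"]
    return PROMPTS["steady"]

if __name__ == "__main__":
    week = [MoodEntry(date(2024, 5, d), s) for d, s in [(1, 2), (2, 2), (3, 3)]]
    print(pick_prompt(week))  # recent average is low, so the gentler prompt is chosen
```

Real platforms presumably use far richer signals than a weekly average, but the basic loop is the same: track how you’ve been feeling, then adjust what the app offers next.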
Mental health struggles are rising worldwide. Even before COVID-19, nearly one billion people were living with a mental health or substance-use condition. During the pandemic, global rates of anxiety and depression jumped by 25–27%, according to the World Health Organization.
At the same time, there simply aren’t enough mental health workers to meet demand. Worldwide, there are only about 13 mental health professionals for every 100,000 people, and high-income countries can have up to 40 times more of them than low-income ones.
This shortage means that in some regions, up to 85% of people with mental health conditions receive no treatment at all.
That’s where AI is stepping in — not to replace therapists, but to fill huge gaps in access.
So would people actually turn to AI for mental health support? Surprisingly… yes.
A survey across 16 countries found that 32% of people would try AI for mental health support, especially if their symptoms were mild. In places with fewer mental health providers, that number jumps even higher — in India, 51% of people said they’d be open to it.
Younger generations, Gen Z in particular, are even more willing to give it a try.
Some people even said AI felt like a more reliable confidant than a human because it’s always available and consistently “empathetic” in its responses.
Still, professionals warn that AI should not replace human care, especially in severe cases. Apps need built-in safety checks so that if someone’s symptoms worsen, they are directed to a real person immediately.
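What might such a safety check look like under the hood? Here’s a deliberately simplified, hypothetical sketch: the risk threshold, the keyword list and the hand-off message are made up for illustration, and a real app would rely on clinically validated screening and trained humans, not a few lines of code.

```python
# Illustrative only: a real safety system needs clinically validated
# screening tools and human oversight behind the hand-off.

RISK_THRESHOLD = 7  # hypothetical cut-off on a 0-10 self-reported scale
CRISIS_KEYWORDS = {"hopeless", "can't go on", "hurt myself"}

def needs_human(risk_score: int, journal_text: str) -> bool:
    """Return True if this check-in should be routed to a real person."""
    text = journal_text.lower()
    return risk_score >= RISK_THRESHOLD or any(k in text for k in CRISIS_KEYWORDS)

def handle_check_in(risk_score: int, journal_text: str) -> str:
    if needs_human(risk_score, journal_text):
        # In a real product this would page a crisis line or clinician on call.
        return "We're connecting you with a trained counselor right now."
    return "Thanks for checking in. Here's today's exercise."

if __name__ == "__main__":
    print(handle_check_in(8, "Feeling a bit tired but okay."))      # escalates on score
    print(handle_check_in(3, "Lately everything feels hopeless."))  # escalates on wording
```

The point isn’t the specific rules; it’s that the escalation path to a human is designed in from the start rather than bolted on later.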
There are also privacy concerns, since mental health data is extremely sensitive, and regulators are already cracking down on how it’s collected and shared.
The takeaway? AI can be incredibly helpful, but only when used responsibly.
The real goal isn’t to swap out therapists; it’s to support them. When AI handles mood tracking, journaling prompts or early intervention, mental health professionals have more time to provide thoughtful, human connection.