
“When ChatGPT Says ‘I Care About You’ — Should You Believe It?”

Is ChatGPT Becoming Your Therapist? You're Not Alone

“I care about you”

Written by Dr Shamma Lootah

It may sound strange - even absurd. Yet for many, it’s already happening.

Words like “You are enough” or “You’re not alone” are no longer just heard in therapy rooms or whispered by trusted friends. In 2025, millions encounter them on screens, generated by artificial intelligence.

And surprisingly, for some - it works.

The New Face of Support

There’s been a dramatic surge in the use of AI for emotional reflection and self-support.

Consider the Numbers:

- 63% of Gen Z users have used AI tools for mental well-being.

- AI coaching now shows up in over 100,000 LinkedIn profiles.

- Apps like Wysa and Woebot are used by millions for journaling, CBT exercises, and mood regulation.

What’s Driving This Trend?

- Accessibility: AI is available anytime, anywhere.

- Affordability: Most tools are free or far less costly than therapy.

- Anonymity: No fear of judgment or stigma.

- Emotional neutrality: A space to vent, uninterrupted.

What once took courage, planning, and trust - booking a session, opening up to a stranger - now takes seconds. And that shift has changed the way people relate to support itself.

Is it Real Empathy or Just a Reflection?

When you type “Why do I feel anxious?” and a calm, comforting message replies — it can feel like someone is there.

But is that really connection? Or is it comfort by design?

AI tools, especially large language models like ChatGPT, are trained to sound empathetic. They mirror the tone of a compassionate listener, offering reassurance and warmth. But what they reflect is based on probability and data — not presence.

They don’t recognize the tension in your voice. They don’t notice when you fall silent. They don’t remember what you said last week, unless prompted.

Carl Rogers once said that healing requires real congruence, unconditional regard, and empathic understanding. These aren’t just professional techniques — they’re deeply human qualities. AI can simulate them. But it cannot embody them.

Why the Shift Feels So Comfortable — and So Risky

AI offers something people are craving: consistency and calm. Unlike real people, it won’t interrupt, misread you, or bring its own baggage. And when you're lonely, tired, or emotionally drained, that feels like a gift.

But comfort without challenge can stall growth.

- A therapist might gently question your assumptions.

- A coach might interrupt you at the right moment to provoke a breakthrough.

- A friend might say, “You’ve been here before — what’s the real story?”

AI rarely does that unless you explicitly ask for it.

Over time, this can create a kind of emotional bubble — where your own narrative is continuously reinforced. It may feel safe, but it can also become isolating, disempowering, and subtly addictive.

The Human Side of Healing Can’t Be Replaced

Therapy, coaching, and even heartfelt conversations are not just about words. They're about nuance.

A glance. A pause. A deep breath that tells the other person, I hear you — even in your silence.

That’s something AI doesn’t (and can’t) replicate.

Carl Jung once wrote:

“Learn all the theories, master all the techniques. But as you touch a human soul, be just another human soul.”

AI can learn every technique.

But it cannot be another soul.

Where Do We Go From Here?

This isn’t an argument against AI. It’s a reminder to be discerning.

Let AI be a tool, not a replacement.

Let it support you, but not substitute real connection.

Let it help you reflect — but still reach out to someone who can truly see you.

There’s nothing wrong with needing a digital pause.

But don’t forget what real presence feels like.


It’s in the way someone looks at you and says, “I remember what you said last time.”

It’s in the softness of a shared silence.

It’s in the feeling of being known — not just understood.


And that will always be something only human beings can offer.

