Why people are turning to ChatGPT for emotional support

Let’s talk about ChatGPT.
Because let’s face it, it isn’t going anywhere.
And there are deeper reasons why people are using AI to fill emotional voids.

Fear, AI and the Therapist Conversation

There has been a lot of discussion about AI lately, along with a lot of fear-mongering, even within therapist communities. I have heard people say ChatGPT is taking our jobs, followed by posts insisting that could never happen.

To be honest, it is confronting that it is here, and it makes sense that people are reacting strongly.

What I am more curious about are the deeper reasons people are leaning into AI for emotional support, particularly around soothing emotions and self-understanding, rather than shaming them for it.

Emotional Validation, Loneliness, and Self-Protection

The biggest theme I have witnessed is that people who are more inclined to use ChatGPT and AI for emotional validation or self-understanding are often lonely, afraid, and protecting their vulnerabilities. These vulnerabilities may have been exploited in the real world.

If you are familiar with parts work, AI can easily become a powerful protector for exiled parts.

If someone grows up expected to be perfect and emotionally contained, and a tool comes along that helps them present flawlessly, articulate themselves perfectly, and never get it wrong, bingo. Why would you not use it?

If someone is not allowed to be wrong in conflict, and being exposed, challenged, or vulnerable feels unsafe, and AI can help construct airtight arguments and flawless points, bingo. Why would you not use it?

If someone has a hard time understanding social cues such as tone, subtext, and unspoken rules, and AI can slow things down and break these moments into something more understandable and predictable, bingo. Why would they not use it?

If someone is deeply lonely, with no real felt sense of connection, and a system comes along that validates their feelings, mirrors their inner world, and helps build self-esteem, again, bingo. Why would they not lean into that?

A Note on Harm and Misuse

I have also noticed a rise in perpetrators using AI as a tool within domestic violence contexts. I am curious how long it will take for research to catch up with what many of us are already observing in practice.

AI, Attachment, and What It Cannot Replace

AI becomes especially attractive when someone has learned they cannot be messy, needy, wrong, or exposed, and when there has been no safe guidance or place to turn to (a safe adult).

It offers guidance, parental love and care, and relief from loneliness.

AI can only take you so far; proper therapy can support this more deeply. However, until we, as a community, address the deeper reasons why so many people are turning to AI for guidance, for the parental love and care that was not present growing up, and for relief from loneliness that has existed in families for generations, it will continue to serve this purpose.

Shaming and fear-mongering do not stop people from using it. They just push people further away from support.

Final Thoughts

If any of this resonates with you, it makes sense why you may be using tools like ChatGPT. It can be helpful, supportive, and regulating at times, but it can only take you so far.

Therapy can help you explore these patterns in a deeper way and support you to understand where these needs come from, how they show up for you, and how they can be met and integrated over time, so you don't need to rely on ChatGPT and can build the connections you actually need.

If you would like support with this, you are welcome to reach out to book an initial session. I’d love to hear from you.
