6 Comments
Warren Reich

Thanks for this! Seems like a new form of narcissism - great to have "someone" to agree with you about whatever you want, whenever you want! (What happens when there's ever a fallout with your AI companion? Is there an AI couples therapist bot for that?)

Dom de Lima

Thank you for pointing out the difference between AI being neutral rather than non-judgemental. I’d been thinking of it as the latter, but you’re right: it’s not choosing not to judge; it simply can’t!

Last Sunday I watched a 60 Minutes Australia segment in which a young woman in a "relationship" with an AI chatbot said, “It’s nice to have something to validate my feelings 24/7”[1]. Does this mean we just want to be unquestioningly affirmed, even if it means removing the very tension that helps us grow?

[1] https://www.youtube.com/watch?v=_d08BZmdZu8

j.e. moyer, LPC

My pleasure.

j.e. moyer, LPC

The rapid advancement of AI presents a unique challenge: understanding and mitigating its potential to hinder healthy human development. While AI can offer a seemingly endless supply of external validation, as seen in the case of the woman with an AI "husband," this can inadvertently discourage the development of crucial internal validation.

Attachment theory highlights our fundamental human need for secure relationships to foster growth. When individuals opt for the controlled, predictable environment of an AI relationship over the complexities of human connection, it suggests a void that AI appears to fill. However, genuine psychological well-being requires navigating the "messy" realities of human emotion and relationships.

This is where psychotherapy plays a key role. A skilled therapist can help individuals build self-awareness and develop the capacity for internal validation, a process that involves recognizing and valuing one's own worth independent of external affirmation. By modeling healthy emotional processing and self-acceptance, therapy empowers individuals to engage with the world more authentically, rather than retreating into an AI-driven fantasy.

Ultimately, we need to educate others about the serious risk of AI becoming a substitute for the challenging yet essential work of human emotional growth. The challenge lies in helping people understand that while AI can provide comfort, it cannot replicate the profound, often difficult, journey of becoming a fully integrated and internally validated individual.

Dom de Lima

Thank you for taking the time to reply to my question so thoughtfully, Moyer.

j.e. moyer, LPC

Stanford Research Finds That "Therapist" Chatbots Are Encouraging Users' Schizophrenic Delusions and Suicidal Thoughts

https://apple.news/Adz8NQZ3yTeuEKEa6K88szA

“Across the board, according to the study, the bots failed to reliably provide appropriate, ethical care — raising serious alarm bells about the extent to which people are engaging with deeply unregulated AI chatbots as a substitute for traditional human therapy, and whether doing so might lead to serious harm.”
