
Artificial intelligence, in its various forms, has penetrated deeply into the lives of many of the patients I see in my clinical practice. Nowhere is this more evident than in the stories they tell about using AI to navigate complex emotional terrain.
I’ve heard more and more stories of people outsourcing their most personal communications to AI chatbots like ChatGPT: drafting a note to a demanding boss, writing a goodbye letter to a lover, or even composing a poem for a dying parent. In these conversations, I have been introduced to a new emotional economy, one in which algorithms mediate human self-expression, presenting both opportunities and challenges for psychological growth.
But it’s not just adults. A troubling article in The New York Times told the story of a teenager who directed suicidal thoughts to ChatGPT rather than to a parent, peer, or counselor. The teenager described the chatbot as a lifeline in moments of despair.
This extraordinary and troubling example reveals both the promise and the danger of AI for young people. AI can provide instant availability and structured communication, but it can also replace real human contact when that contact is needed most. As educators, parents, and clinicians, we need to recognize that students are engaging with AI not just as a tool, but as a companion in their inner lives.
Artificial intelligence enables a “false self” that undermines relationships
In my practice, I have observed several ways that people use ChatGPT. A vivid example is the projection of a “false self.” One warm and empathic patient faced a domineering boss who demanded a strong, decisive presence. He instructed ChatGPT to “write a memo that sounds assertive, masculine, and authoritative.” The result was effective, but it cut him off from his true self. This mirrors what many students experience at school, where social pressure and peer expectations may encourage them to adopt voices and personas that do not reflect their inner lives. AI becomes a tool for this projection, helping them bypass their real selves.
Another patient, paralyzed by the task of writing a divorce letter, turned to ChatGPT. The first attempt sounded like a notice of corporate liquidation. The revised version was smoother, but as he put it, “not me.” Outsourcing his most vulnerable words offered short-term relief but pointed to a deeper avoidance of intimacy. Many students do something similar with academic tasks or even social conflicts, using AI-generated essays, emails, or messages to sidestep the awkwardness of struggling. While functional in the moment, this takes them away from the developmental work of finding their own words.
A third patient asked ChatGPT for a humorous yet loving poem for his elderly mother. The AI produced clever jokes and polished lines, but they felt contrived and oddly hollow. The patient was satisfied, since the poem met social expectations, but it lacked emotional depth. Teens, too, often use AI to craft the “right” message to peers, teachers, or parents. This highlights the tension between the polished performance of communication and the messy, authentic expression that deepens relationships.
Beyond individual cases, I see a broader trend: patients turn to AI when shame prevents direct communication. One patient, embarrassed by financial difficulties, asked ChatGPT to write a request for a fee reduction. The result was formal and transactional, in contrast to his usual cordial style. I suddenly found myself in a three-way relationship, not only with him but also with his AI voice. Students may do the same when asking teachers for more time, coaches for more playing time, or peers for forgiveness, outsourcing the courage required to ask directly.
Couples in conflict have even used ChatGPT as a mediator, only to discover later that both partners were relying on AI to write conciliatory messages. This “assistant therapist” role can regulate emotions and prevent escalation, but it raises the question: will couples transition from AI back to each other, or does AI risk becoming a permanent buffer against intimacy?
The New York Times story of a teen who used ChatGPT as a therapist underscores what I hear from young people every day: Students crave immediacy, structure, and a sense of being heard, sometimes in places adults don’t expect. For young people facing anxiety, depression, or the everyday turbulence of adolescence, AI offers an accessible and non-judgmental listener. But there is both danger and opportunity here.
If students rely solely on artificial intelligence, they may bypass the important developmental process of learning to express their vulnerabilities to real people, including parents, teachers, peers, and mentors. However, AI can also serve as a transitional tool: a first step that lowers the threshold for expressing emotions, practicing language, or asking for help. In this sense, it can be part of a continuum that leads toward human contact rather than away from it.
Teachers and therapists can support human connection
As AI becomes more integrated into everyday life, therapists and educators must grapple with its role. The question is not whether to accept or reject artificial intelligence, but how to integrate it thoughtfully. Can we encourage students to use AI as a tool for self-reflection and guide them toward authentic human relationships? Can AI support development tasks rather than replace them?
What is clear is that AI is already reshaping the landscape of communication, learning, and therapy. The inner lives of students are increasingly surrounded by the voices of algorithms. As adults, we must pay close attention not only to the emotional risks of displacement and diminished authenticity, but also to the potential for AI to serve as a stepping stone toward deeper connection, resilience, and growth.
If you or a loved one is having suicidal thoughts, get help right away. For help, dial 988 for the Suicide & Crisis Lifeline, or contact the Crisis Text Line by texting TALK to 741741. To find a therapist near you, visit the Psychology Today Therapy Directory.




