Growing Use of AI Tools for Emotional and Life Advice
AI chatbots have become a common outlet for people looking for emotional support. Their availability and low cost make them attractive, particularly for teens and young adults who are comfortable with technology. Surveys show that a growing number of people turn to AI when they feel overwhelmed, lonely, or uncertain. While the appeal is clear, psychologists caution against treating these tools as a substitute for licensed care.
AI can supplement support, but it cannot replace the guidance and accountability of professional therapy.
David Tzall, Psy.D., a clinical psychologist, raises concerns about this growing trend and encourages anyone struggling with their mental well-being to seek proven, effective methods of care.

Psychologists Warn About Confirmation Bias in AI Responses
Confirmation bias occurs when people seek out or accept only information that supports what they already believe. AI systems, which are designed to please users, often reinforce those beliefs rather than challenge them.
The risk is clear: people may walk away from an AI exchange feeling validated but without any real perspective shift. Over time, that can limit problem-solving, increase rigidity, and create blind spots.
AI may feel supportive, but it can quietly reinforce unhealthy patterns.
According to a recent Kantar survey, 54% of consumers worldwide have used AI for emotional or mental well-being, with younger generations reporting the highest usage.
Lack of Regulation and Clinical Oversight in AI Emotional Tools
No consumer AI chatbot is FDA-approved to diagnose or treat mental health conditions. There are no formal licensing systems, safety standards, or ethical guidelines governing their use in emotional support. Without oversight, users risk receiving misleading or even harmful advice. For someone in crisis, relying on AI instead of reaching out for professional help can carry serious consequences.
Benefits and Limits of AI in Mental Health Support
AI tools can support mental health in helpful ways. They can remind users to practice coping skills, track moods, or provide prompts that encourage reflection. For some, these features create structure and accountability between therapy sessions.
Yet the role of AI stops there. It cannot replace therapeutic work, which depends on human connection, trust, and professional judgment.
Real healing requires a human connection that AI cannot replicate.

How to Approach AI Wisely for Emotional Support
AI should be used as a supplement, not a substitute. The most responsible use involves pairing technology with professional care. To use AI safely:
- Use AI for supportive tools, not as your main source of guidance
- Seek licensed professionals when deeper issues surface
- Stay mindful of confirmation bias and how AI may echo existing beliefs
- Balance screen time with relationships that provide real support
Therapy remains the best setting for lasting growth, evidence-based strategies, and safe exploration.
David Tzall’s Perspective on the Responsible Role of Therapy
Dr. David Tzall emphasizes that therapy provides strategies grounded in research, consistent accountability, and emotional safety that AI cannot replace. Technology may play a role, but it should always complement—not substitute—the work done with a trained professional.
For inquiries or to learn more about Dr. Tzall’s practice, visit www.davidtzall.com or contact his Brooklyn office directly.