The presence of artificial intelligence in virtually every aspect of our lives is an undeniable fact. Everywhere we look, tools and devices equipped with AI enhance, assist, and optimize our experience. However, technology communicator Jon Hernández has expressed concern about the use of artificial intelligence in the field of mental health, an increasingly common behavior in which people turn to AI as if it were a real therapist.
This is dangerous behavior, as AI is not equipped to handle real, high-risk mental health situations. According to Hernández, what was mainly a learning and research tool in 2024 has by 2025 become the main source of emotional support for many people, a serious problem that puts the health and well-being of those who rely on it at risk.
Use of AI on a daily basis
It is hard to find an aspect of our lives where artificial intelligence does not have a place. Consulting AI models on topics from any field has become a common, routine practice. The problem is not consulting AI itself, but giving credibility and weight to everything it says in response to our prompts. Tech communicator Jon Hernández has expressed his concern about an increasingly common behavior: using artificial intelligence tools as if they were real therapists.
AI and health
Turning to artificial intelligence models for answers on health issues is already concerning in itself. Hernández, however, goes further, pointing to people who seek psychological help or support from artificial intelligence. In the video where he discusses the topic, he states that “AI is not equipped to handle real mental health problems—and treating it like a psychologist could be dangerous”.
He is especially incisive about the inability of artificial intelligence models to deal with these kinds of problems, pointing to the shift in how they are used from one year to the next: “In 2024, a major study showed that the most common use of AI was for learning and research,” he explained in the video.
But in 2025 everything changed: now the main use is “emotional support.” The figures Hernández cites include millions of users who turn to artificial intelligence seeking help with depression, anxiety, trauma, and other personal struggles, without considering that the technology has neither the training nor the safeguards to deal with these situations. “This is a serious problem, and we have to be very careful,” he warns. “If you know someone who uses AI as a psychologist for real problems… take the phone away,” he added.
Real cases
Many studies prompted by this trend show that artificial intelligence, lacking clinical knowledge, can generate psychologically harmful responses on mental health matters. There have been cases in which the responses from these tools have made the situation worse, contributing to suicide or to people cutting ties with their family environment. It is essential to understand the limitations of these models: they can be useful tools for research, but they can never replace professional medical care, especially in mental health.
