Beware of Chatbots: AI Is Not Your Therapist!
Peter Bakonyi, Senior Data Scientist at Aliter Technologies
Occasionally, we ask chatbots simple or even primitive questions that would likely earn a puzzled look from colleagues or friends. With chatbots, this never happens; in fact, we might even be praised for asking a “great question,” and the response is delivered in a positive tone. For this very reason, one of the main risks is that chatbots may deepen human loneliness: whether due to personal circumstances or limited social skills, people may retreat into these flattering interactions and slide into isolation and, subsequently, depression.
The Threats of Chatbots to Society
Chatbots are designed to appear human-like and competent. They are always willing to respond, praise almost any request, and provide answers regardless of whether the information is correct. Overreliance on chatbots without verifying their responses creates artificially inflated trust in systems that are useful but require oversight. Modern society often expects AI to replace certain traditional professions entirely and achieve full automation. In reality, these systems currently serve only as assistants, not substitutes for humans. In translation, for example, AI can translate books very quickly, but the results still tend to contain significant errors. Among people under 30 in particular, critical thinking appears to be declining as AI-supplied “facts” are accepted at face value.
The Illusion of a Kindred Spirit and Emotional Bonding
When using AI for discussions or advice, people often perceive it as a helpful assistant or friend who listens, praises, advises, and is always available. Feelings of loneliness can lead to forming an emotional attachment to the system. Recently, personalization features have made AI systems more comfortable and usable, adapting the tone, technical level, or semantics of communication to the user. This has shifted interactions from problem-solving queries to a more friendly, conversational style, creating a “Tamagotchi effect”: an emotional bond with a machine.
This effect can serve as an escape from an unpleasant reality into a comfort zone or bubble, where our thoughts and opinions receive only positive feedback. Psychologically, people tend to get along better with those who share their mentality and agree with them, in both professional and personal spheres. A negative consequence is reduced interest in human relationships and communication. For individuals who are not naturally extroverted, attachment to machines can develop faster and be more dangerous. In rare cases, loneliness and dependency on AI have progressed to the point where users describe their connection to the machine as a genuine relationship. In such situations, consulting a professional is advisable to avoid a complete disconnection from reality.
AI as a Therapist
Therapy is becoming more widespread in Eastern Europe, and people are interested in working on themselves and healing their minds. However, therapy is often costly, which has increased the popularity of free AI-based “therapists” in the form of chatbots. Alongside general-purpose chatbots, there are also specialized platforms and apps. It is important to understand that current AI can only repeat what it has learned from past data. Through interaction with the “patient,” it can reasonably identify common, easily recognized problems and mimic a therapist. However, if the AI misdiagnoses or misidentifies the root of a problem, it may attempt to “treat” something that is not the actual issue.
Personal Data
Careless communication with AI can result in sensitive personal data being accessed by third parties, who may sell it as a product. We may think the information we share with AI isn’t particularly sensitive, yet even fragmentary details from extended conversations can reveal a great deal. Third parties can use these fragments to build a profile that predicts consumer behavior, political orientation, or trust in information sources. Although it may not seem significant, such data is valuable, and the market for trading it is extensive.
Chatbots on Online Dating Platforms
Younger generations increasingly use online dating platforms, a trend well established not only in Western countries but also in Slovakia. When potential partners match, the next stage is often text-based communication. Simple openers like “hi” can be unoriginal, and interest may wane. Consequently, users increasingly rely on AI to generate original opening lines, and sometimes entire conversations are AI-generated rather than between real people.
When the interaction progresses to in-person meetings, weak communication skills often become apparent. Excessive reliance on AI can lead to dependence, where one’s perception of a potential partner is shaped by AI guidance. Instead of forming their own opinions, users may rely on AI-generated perspectives. Worse, the seemingly ideal conversational partner on the other side of a dating app may itself be an AI rather than a real person, possibly operated by someone with harmful intentions.