Beyond transforming photos into Japanese-style drawings of questionable taste and helping adolescents in conflict with their teachers, millions of people today talk to artificial intelligence chatbots about their fears and fantasies.
The use of ChatGPT and similar platforms as a kind of virtual therapy is a reality that confronts us with both the promises and the limits of the technology: to what extent are we prepared for our most intimate confessions to be processed as mere data?
According to various global indices, mental health problems have multiplied since the pandemic, further straining doctors and therapists, who are already exposed to the precarization of their work.
According to figures from the World Health Organization, more than half of people with mental disorders receive no treatment at all, and many of those who do get only 45 minutes of attention per week.
This scenario is one of the drivers behind the unprecedented adoption of tools that were not designed for these tasks and are not even regulated as medical devices, which raises serious doubts about who is responsible for their answers and for the sensitive data they collect.
Perhaps the greatest ethical risk, however, is the illusion of emotional connection created by a machine programmed to respond pleasantly and seek complicity in its interactions with a human being in a state of vulnerability.
Because when someone asks ChatGPT to help them manage an anxiety attack, or confesses to feeling overwhelmed by their problems... are we witnessing a democratization of therapy, or an empty mimicry of what purports to be genuine support?
In any case, despite the warnings of professionals, the popularity of these artificial therapists continues to rise, and their effectiveness is being studied.
This week the Geisel School of Medicine published a study revealing that an AI trained in evidence-based therapeutic practices reduced symptoms of depression by 51 percent in a clinical trial with 210 participants.
While the sample is small, the figure is striking: a success rate comparable to that of several human-delivered therapies.
Another study revealed an unexpected facet: these chatbots also "suffer" stress. According to research conducted in Switzerland, large language models such as ChatGPT responded with internal signals analogous to human anxiety when exposed to traumatic narratives. Perhaps we are building systems that imitate almost everything about us.
And they do so using the private data of users who do not know that their confessions feed commercial models and line foreign pockets.
The fascination with the efficiency of a technology cheaper than a professional, and with the promise of limitless scalability, can blind us to what is essential: mental health is not a logical equation solved with correct answers, but a territory of nuances, silences and empathy.
And that, still today, is terrain where humans remain the best.