
A 14-year-old child in Mumbai was taken to the emergency department of Apollo Hospital, Mumbai, after complaining of stomach ache. Several medical tests were done there, but no physical disease was found. The child's mother then revealed that the child had first consulted an AI chatbot about the symptoms. The chatbot linked the described symptoms to a gastro problem and advised an immediate hospital visit. When experts examined the case, they found that the child was actually suffering from severe anxiety attacks, not a stomach ailment. Constant bullying by seniors at school was putting the child under mental pressure, and this stress was producing the physical symptoms.
When AI gets the diagnosis wrong
Experts say they have recently seen many such cases, in which patients took mental health advice from AI chatbots and their condition worsened. They explain that AI listens to your words and responds, but it cannot see you and cannot feel. A machine also cannot understand the importance of real human connection in life. As a result, AI often gives advice about health that is simply wrong.
Advice from AI chatbots, but whose responsibility?
Nowadays, many people turn to AI chatbots for mental health problems, because chatbots offer a private and non-judgmental space. But experts believe that using AI chatbots this way is not without danger. According to a report by Stanford University, AI tools are not an effective substitute for mental health care: they can overlook serious mental conditions and can give wrong advice to people struggling with mental health.
Seek help from AI, but understand its limits
A survey conducted in America found that about 50 percent of people using AI tools actually want a therapy-like conversation. AI can serve as a limited option in areas lacking resources, but it is not reliable or safe, and it cannot carry moral responsibility the way a human being can. For that reason, it can also pose a threat to the people who rely on it.
Technology is useful, but it cannot replace human beings
AI-based tools can sometimes provide positive support, mood tracking, or general information, but they are not a substitute for treatment. A human therapist can perceive and respond with sensitivity in a way a machine cannot. That is why AI should be used only in a limited role, and for serious mental health problems, only experts should be consulted.