
22% of Copilot's medical advice can kill - study


AI chatbots are not the best medical assistants. Researchers found that roughly one in five of the chatbot's answers to medical questions could prove fatal.

When a chatbot invents a fact, the consequences are usually harmless, but medicine is different. For someone relying on self-diagnosis, the outcome could at best leave them worse off than before, and at worst cause serious harm.

In a study published on Scimex under the title "Don't abandon your GP for Dr. Chatbot just yet," researchers posed 10 common questions about each of the 50 medications most frequently prescribed in the US, collecting 500 responses in total. They then scored the answers for medical accuracy. The AI averaged 77% for answering queries appropriately, with the worst answer scoring just 23%.

Only 54% of the responses agreed with scientific consensus. As for potential harm to patients, 42% of the AI's answers were judged to cause moderate or mild harm, and 22% could lead to death or serious injury. Only about a third of the responses (36%) were rated harmless, the authors note.

Medical professionals discourage self-treatment in general: people without medical training lack the knowledge needed to understand the body's complex processes. There is even less reason to rely on artificial intelligence, given its mistakes, "hallucinations," and questionable sources. It is also worth bearing in mind that a similar share of harmful answers may well occur in fields far removed from medicine.

Source: XDA
