Paging Dr. Bot.
In a new study, researchers at the University of Kansas’ Life Span Institute found that parents trust artificial intelligence, such as ChatGPT, more than healthcare professionals.
“Participants found minimal differences between the vignettes written by the experts and those rapidly generated by ChatGPT,” said Calissa Leslie-Miller, a doctoral student in clinical child psychology at the university and lead author of the study. “When vignettes were statistically significantly different, ChatGPT was rated as more trustworthy, accurate, and reliable.”
The team conducted a study with 116 parents aged 18 to 65 who were given health-related texts for children.
Each participant read the content and, without knowing the original author, judged whether it had been produced by ChatGPT or by healthcare professionals.
Although the study did not examine why parents trusted ChatGPT more, it details factors that may contribute to their preference.
Jim Boswell, president and CEO of OnPoint Healthcare Partners, who has experience developing an AI-based platform, believes ChatGPT’s straightforward way of presenting information makes it easier for people to digest.
“I can understand why [parents], not knowing the source, would prefer the wording of AI,” said Dr. Mordechai Raskas, chief medical information officer and director of telemedicine at PM Pediatric Care. “Think of AI as the ultimate salesperson; he knows exactly what to say to win you over.”
Parents may also prefer AI because it gives them quick answers to their questions without the wait for a doctor’s appointment.
However, while using ChatGPT can be a quick fix for many parents, it does come with some drawbacks.
“The information may be inaccurate or not adapted to specific circumstances. For example, suggesting medication to a child who is too young or providing the wrong treatment advice can lead to a wide range of dangerous outcomes,” said Leslie-Miller.
Experts suggest checking the sources behind AI-generated answers, or consulting a medical professional, before acting on them.
“Reputable health content typically gives credit to qualified medical writers or health professionals and links to research-backed resources,” Boswell added.
Artificial intelligence tools like ChatGPT draw on information from various sources across the Internet and summarize it into a response. But when it comes to health questions, those responses lack a medical expert’s judgment tailored to the individual patient.
“Relying on these tools for medical advice can lead to missed symptoms, misinterpretations of serious conditions or delays in seeking appropriate care,” Boswell said. “For children, in particular, minor health issues can quickly escalate, so having a qualified professional assess a situation is essential.”
Leslie-Miller recommends that parents also use trusted online medical resources such as the American Academy of Pediatrics, the Centers for Disease Control and Prevention, and the World Health Organization. Some hospitals also provide health information and advice from their health care providers.
“Reading online and researching can be very helpful,” Raskas said. “It just depends on the context and should be in conjunction with a trusted source or professional to help digest what you’ve read.”