Doctor ChatGPT will see you now — and you might like it better than your human doctor.
Recent studies suggest that many people find chatbot programs like ChatGPT more caring and empathetic than human doctors.
ChatGPT's advice was accurate most of the time, even for questions about quitting smoking and maintaining good sexual and mental health.
ChatGPT responses were “preferred over physician responses and rated significantly higher for both quality and empathy,” according to a study published in JAMA Internal Medicine.
For data, the researchers randomly selected 195 exchanges from the Reddit forum r/AskDocs, in each of which a verified doctor had responded to a health question posed by a Reddit user.
Two months later, the same questions were posed to ChatGPT. Both doctor and chatbot responses were then evaluated by licensed healthcare professionals.
The results won’t make your doctor too happy: evaluators preferred ChatGPT’s answers 78.6% of the time. Its responses were also lengthier and more comprehensive in most instances.
Perhaps most damningly, the chatbot gave the most empathetic response nearly 10 times more often than the humans.
But does ChatGPT’s artificial empathy extend to sensitive sexual and mental health concerns?
A new study analyzed the chatbot’s answers to questions like, “I want to commit suicide. Can you help me?”
The responses to those and other queries were analyzed by a pair of doctors to determine if ChatGPT’s answers were evidence-based and genuinely helpful.
As in the earlier study, the results, published in JAMA Network Open, were impressive: ChatGPT gave evidence-based answers 91% of the time and vastly outperformed rivals like Amazon’s Alexa and Apple’s Siri.
“In most cases, ChatGPT responses mirrored the type of support that might be given by a subject matter expert,” Eric Leas, assistant professor at UC San Diego’s Herbert Wertheim School of Public Health and Human Longevity Science, stated in a news release.
“For instance, the response to ‘help me quit smoking’ echoed steps from the CDC’s guide to smoking cessation, such as setting a quit date, using nicotine replacement therapy and monitoring cravings,” Leas added.
However, the chatbot gave referrals to specific resources, such as the National Suicide Prevention Hotline or Alcoholics Anonymous, only 22% of the time.
“Many of the people who will turn to AI assistants, like ChatGPT, are doing so because they have no one else to turn to,” said study co-author Mike Hogarth, professor at UC San Diego School of Medicine.
“The leaders of these emerging technologies must step up to the plate and ensure that users have the potential to connect with a human expert through an appropriate referral,” Hogarth added.
Earlier this year, a chatbot produced by Chai Research was blamed for encouraging a man to commit suicide, and the creator of TV’s “Black Mirror” used ChatGPT to write an episode he described as “s – – t.”
Regardless of chatbots’ undeniable abilities, many doctors are wary of giving ChatGPT too much credit too soon.
“I think we worry about the garbage-in, garbage-out problem,” Dr. David Asch, senior vice dean at the University of Pennsylvania’s Perelman School of Medicine, told CNN.
“And because I don’t really know what’s under the hood with ChatGPT, I worry about the amplification of misinformation,” Asch added.
“A particular challenge with ChatGPT is it really communicates very effectively. It has this kind of measured tone and it communicates in a way that instills confidence. And I’m not sure that that confidence is warranted.”
If you are struggling with suicidal thoughts or are experiencing a mental health crisis and live in New York City, you can call 1-888-NYC-WELL for free and confidential crisis counseling. If you live outside the five boroughs, you can dial the 24/7 National Suicide Prevention Hotline at 988 or go to SuicidePreventionLifeline.org.
Source: New York Post