There is growing interest in utilizing ChatGPT, a generative AI, but medical experts remain cautious.

There is growing public interest in the use of the generative AI ChatGPT in medicine.

Still, the medical community remains cautious, with experts saying that applying it to medical education is premature and that its clinical applications should also be limited.

At a symposium titled “The Future of Healthcare with Generative AI,” held at the Yonsei Biomedical Research Center for Convergent Medical Technology last Friday, some participants argued that the medical use of generative AI is bound to remain limited, citing the problems of “accuracy” and “fact boundary.”

Asked whether ChatGPT can be used for medical education right now, Professor Kim Jun-hyuk of Yonsei University School of Dentistry said, “No.”

“That’s because it's inaccurate and doesn't understand the facts," he said. "The language model doesn't distinguish between what's true and what's not. If this problem is not solved, it should not be used for medical education."


Professor Kim went on to say, "ChatGPT is not as knowledgeable as it should be. There are various areas of medical work, so there are ways to utilize it. However, ChatGPT does not give better answers than experts. It is also problematic that inaccurate information is supplied to students.”

To make the most of ChatGPT in medical education, he also emphasized that guidelines should first be set.

"In both the educational and medical fields, we need at least guidelines to utilize ChatGPT well," Professor Kim said. "It is obvious that ChatGPT will create problems. It can confuse or deceive people with fake information and lead them in the wrong direction, causing strange reactions and results. There needs to be governance."

Some experts suggested that, in clinical settings facing a shortage of medical personnel, it could be used within a limited scope as a supplementary tool for patient education to reduce the workload of medical staff.

"It's exhausting to lecture caregivers armed with (unverified) information about rare diseases and surgeries gleaned from internet cafes. Even after 20 minutes of explanation, I get tired," said Dr. Shim Kyu-won, a professor of neurosurgery at Yonsei University College of Medicine. “We could use interactive AI in such cases."

Professor Shim went on to say, “Even in simple situations that require only basic medical common sense or an MRI explanation, the burnout of medical staff is too high. Suppose AI takes over the education of patients and guardians, and medical staff handle only what it cannot replace. In that case, medical staff burnout will decrease and work efficiency will improve."

 

Copyright © KBR. Unauthorized reproduction and redistribution prohibited.