Emergency medicine doctors might someday rely on consultation from artificial intelligence (AI) programs like ChatGPT to help them quickly and accurately diagnose patients’ ailments.
A new study found that ChatGPT performed about as well as human doctors in diagnosing patients when both were given the same set of clinical information.
“In the end, they were pretty comparable,” said senior researcher Steef Kurstjens, a clinical chemist with Jeroen Bosch Hospital in ’s-Hertogenbosch, the Netherlands. “And as they're pretty comparable, [AI] might be helpful to speed up the process or enhance the amount of diagnoses at the emergency department.”
For the study, two-doctor teams and the artificial intelligence program each reviewed physicians’ notes and lab tests for 30 patients treated in March 2022 at the emergency department of Jeroen Bosch Hospital.
The researchers used a free version of ChatGPT and a subscriber version. The AI tools and the medical teams then compiled lists of the top five potential diagnoses for each patient, based on the information at hand. Because these were past cases, the researchers already knew what the exact diagnosis had been.
Doctors had the correct diagnosis within their top-five list 87% of the time. By comparison, the free version of ChatGPT had the correct diagnosis listed 97% of the time, and the subscriber version of the AI program 87% of the time.
“It's a nice proof of concept and a nice way to show that the potential is there for using it like this,” Kurstjens said.
Based on these results, a follow-up clinical trial is warranted to evaluate the potential real-world impact of adding an AI consult to emergency department treatment, Kurstjens said.
“There will need to be someone that will actually start comparing physicians using [AI] in their daily practice versus doctors not using it, and seeing if this affects the time the patient spends in the ED, the correct diagnosis, how long it takes to make the correct diagnosis,” Kurstjens said.
Kurstjens came up with the idea for this study after hearing doctors discuss the difficulty of coming up with a diagnosis for a patient with a complex health problem.
“I took some of the general complaints and the medication of the patient and I put them in ChatGPT, and it came up immediately with the correct diagnosis,” Kurstjens said. “It even listed the cause of the diagnosis as an interaction between two of the medications that she was taking.”
Doctors already receive support from many different forms of computer software — statistical programs, scoring systems for screening lab results — and Kurstjens envisions AI as serving a similar role.
“I just see this as another potential tool that could support the physician in their decision making and improve and quicken the diagnosis,” Kurstjens said.
The ability of an AI program to assess a patient could be even greater if it’s specifically designed for health care, Kurstjens added.
“If you look at the medical information at this time, there's so much literature out there. There's more information out there than any physician could ever take in. I think this is a tool that could learn all the information that is out there,” he said.
“When we look at future medically focused language models trained on all these medical articles, I think the future looks very bright,” Kurstjens said.
Dr. Jessica Adkins Murphy, president of the Emergency Medicine Residents’ Association, agreed this shows the potential of AI to improve emergency department treatment.
“There's a lot of potential and possibility for its ability to basically streamline a lot of our processes and pick out the obvious top few diagnoses that might explain what a patient has,” Murphy said. “I think that absolutely has been validated by this study. It creates a good starting point.”
However, Murphy said that patients often have overlapping health problems and complex medical histories that AI programs like ChatGPT might struggle to analyze.
In addition, patients sometimes present with atypical symptoms of serious health threats — for example, abdominal pain that’s actually being caused by a heart attack, Murphy said.
“I just don't quite think the study fully reflects the complexity of the patients that we have,” Murphy said. “And so though I'm excited to see what AI does, I think medicine is a lot more than choosing the most obvious five choices for a patient.”
Both Kurstjens and Murphy said the best way AI could be helpful in the near term is to help doctors deal with their paperwork burden.
“Right now, the most immediate implementation is the ability of AI to make more efficient some of the really tedious parts of doctoring — the chart work, the note writing,” Murphy said. “For example, there are AI apps now that can listen to you talk to a patient and get your notes started, basically completing the patient's history and physical exam.”
These AI apps also can generate a potential diagnosis, but Murphy’s colleagues have told her they always have to revise or remove that AI-generated diagnosis in the finished chart notes.
“I'm very optimistic about its ability to make us more efficient and be able to see more patients because we're spending less time on those kinds of tasks, but it doesn't really replace the physician's judgment for complex cases yet,” Murphy said.
Kurstjens envisions a future where an AI program is integrated into a health care system’s electronic patient records, giving it access to doctors’ notes, lab data, imaging results and patient history.
“Based on this information, (AI could) write the letters or make the notes or advise which follow-up tests should be performed or which diagnosis is most likely,” Kurstjens said.
Dr. Steven Brooks, chair of emergency medicine at Cleveland Clinic Akron General, agreed.
“The use of electronic medical records has provided a path for standard information gathering. [Electronic medical records] lend themselves to incorporate AI to help emergency department physicians quickly and efficiently arrive at the differential diagnosis and then order appropriate testing,” Brooks said.
“With the aging population of the United States, increased demand for health care services and the need for more physicians, AI has the ability with continued development to help physicians deliver efficient and effective medical care,” Brooks added.
However, patient privacy remains a concern.
Kurstjens said health care systems should protect patient data by running AI programs off their own computers, rather than using cloud-based AI.
Another concern is whether the AI programs that health care workers use have been approved by regulatory agencies like the U.S. Food and Drug Administration.
“You still should take into account that if you use this kind of software to actually affect patient care, then you're using a software program as a medical device, while it's not designed as a medical device,” Kurstjens said.
Kurstjens reported the study results recently in the Annals of Emergency Medicine, and at the European Society for Emergency Medicine’s annual meeting in Barcelona, Spain.