ChatGPT may diagnose emergency room patients as well as doctors
A new study suggests that ChatGPT can propose diagnoses for patients about as well as a trained doctor.
ChatGPT could help to diagnose patients in future emergency rooms, according to a new pilot study that looked at how the large language model could be used to support doctors.
The research published in the Annals of Emergency Medicine journal found that the artificial intelligence (AI) chatbot diagnosed patients as well as trained doctors. It is set to be presented at the European Emergency Medicine Congress starting this weekend.
The researchers at Jeroen Bosch Hospital in The Netherlands entered physicians’ notes and anonymised information about 30 patients, including exams, symptoms and lab results, into two versions of ChatGPT.
They found an overlap of around 60 per cent between the emergency doctors’ shortlists of possible diagnoses and the chatbot’s.
“We found that ChatGPT performed well in generating a list of likely diagnoses and suggesting the most likely option,” said study author Dr Hidde ten Berg, from the emergency medicine department at Jeroen Bosch Hospital, in a statement.
“We also found a lot of overlap with the doctors’ lists of likely diagnoses. Simply put, this indicates that ChatGPT was able to suggest medical diagnoses much like a human doctor would”.
The emergency doctors had the correct diagnosis in their top five lists 87 per cent of the time, while ChatGPT version 3.5 had the correct diagnosis in its shortlist 97 per cent of the time, compared with 87 per cent for ChatGPT version 4.0.
As a proof of concept, the research was not used to influence patient care but to test the feasibility of using generative AI for diagnosis.
But it’s not something that will be available to use clinically just yet.
“One of the problems, at least in Europe…is that legislation is very tough,” study author Steef Kurstjens, from the department of clinical chemistry and haematology at Jeroen Bosch Hospital, told Euronews Next.
“So these kinds of tools are not medical devices. So if you use them to affect patient care, you’re using a tool that is not a medical device as a medical device, and that is not allowed. So, I think new legislation needs to [be passed] if you want to use this,” he added.
Patient data privacy is another big concern around using generative AI in healthcare, with some experts urging policymakers to try to reduce any potential risks through regulation.
One of the more exciting uses of AI in healthcare could be in saving time for doctors, helping them to make diagnoses or alleviating some of the administrative burden in the health system, experts say.
“As a supportive tool, it could help physicians create [a diagnosis] list or get ideas that they wouldn’t have thought of themselves or for less experienced physicians that are still in training, this could really be a tool to support their daily care,” Kurstjens told Euronews Next.
“I think that the future of more medically related large language models that are trained on medical data, like Med-PaLM, is really interesting. It will be really interesting to see how they would perform and whether they would outperform ChatGPT,” he added.
The researchers also suggested that there was potential for saving time and reducing wait times in emergency departments.
Youri Yordanov from the St. Antoine Hospital emergency department in Paris, who is also chair of the emergency medicine congress’ abstract committee this year, said in a statement that doctors are a long way from using ChatGPT clinically.
Yordanov, who was not involved in the research, added that it’s important to study the technology and see how it could help doctors and patients.
“People who need to go to the emergency department want to be seen as quickly as possible and to have their problem correctly diagnosed and treated,” he said.
“I look forward to more research in this area and hope that it might ultimately support the work of busy health professionals”.