Study says AI chatbots churn out 'racist' medical information – Fox News

This material may not be published, broadcast, rewritten, or redistributed. ©2023 FOX News Network, LLC. All rights reserved. Quotes displayed in real-time or delayed by at least 15 minutes. Market data provided by Factset. Powered and implemented by FactSet Digital Solutions. Legal Statement. Mutual Fund and ETF data provided by Refinitiv Lipper.
Fox News contributor Dr. Marc Siegel weighs in on how artificial intelligence can change the patient-doctor relationship on "America’s Newsroom."
A study found that artificial intelligence chatbots such as the popular ChatGPT return common debunked medical stereotypes about Black people.
Researchers at Stanford University ran nine medical questions through AI chatbots and found that they returned responses that contained debunked medical claims about Black people, including incorrect responses about kidney function and lung capacity, as well as the notion that Black people have different muscle mass than White people, according to a report from Axios.
The team of researchers ran the nine questions through four chatbots, including OpenAI’s ChatGPT and Google’s Bard, that are trained to scour large amounts of internet text, the report noted, but the responses raised concerns about the growing use of AI in the medical field.
A study found that artificial intelligence chatbots such as the popular ChatGPT return common debunked medical stereotypes about Black people. (Gabby Jones / Bloomberg via Getty Images / File)
“There are very real-world consequences to getting this wrong that can impact health disparities,” Stanford University assistant professor Roxana Daneshjou, who served as an adviser on the paper, told the Associated Press. “We are trying to have those tropes removed from medicine, so the regurgitation of that is deeply concerning.”
William Jacobson, a Cornell University law professor and the founder of the Equal Protection Project, told Fox News Digital that the injection of immaterial racial factors into medical decision-making has long been a concern, one that could worsen with the spread of AI.
“We have seen DEI and critical race ideology inject negative stereotypes into medical education and care based on ideological activism,” Jacobson said. “AI holds out the potential of assisting in medical education and care that is focused on the individual. AI should never be the only source of information, and we would not want to see AI politicized by manipulating the inputs.”
ChatGPT is shown on a computer. (Frank Rumpenhorst / picture alliance via Getty Images / File)
Phil Siegel, the founder of the Center for Advanced Preparedness and Threat Response Simulation, told Fox News Digital that AI systems are not "racist" by design, but that they can return biased information depending on the datasets they draw on.
Google’s Bard was included in the study. (Marlena Sloss / Bloomberg via Getty Images / File)
“This is a perfect example of ‘Pillar 3’ of regulation that has to be managed for AI,” Siegel said. “Pillar 3 is ‘ensuring fairness’ – to not allow current biases [to] get hard-coded in the datasets and models that would cause unfair prejudice in areas such as health care, hiring, financial services, commerce and services. Obviously, some of that is occurring today.”
Neither Google nor OpenAI immediately responded to a Fox News request for comment.
