Experts raise concerns over AI hallucinations in popular platforms like ChatGPT

by Lindsey Mastis
If you asked an Artificial Intelligence (AI) program a question, would you know whether its answer was accurate? Experts are studying something called AI hallucinations. Simply put, a hallucination is when information seems correct but is actually inaccurate.
There are numerous news reports about platforms like ChatGPT giving incorrect information. Duke University Libraries outlines issues with fake citations, Forbes published a story about ChatGPT fooling scientists with fake abstracts, and Fortune cited a study showing ChatGPT went from answering a math problem correctly 98% of the time to just 2%.
There’s a question about whether ChatGPT is simply making up information, misinterpreting data, or relying on misinformation.
ALSO READ | 7News On Your Side: Parents navigate the world of AI with their children
7News On Your Side’s Lindsey Mastis went to ChatGPT and asked the AI if it experiences hallucinations. It said it did not.
“No, I do not experience AI hallucinations. I generate responses based on patterns and information present in the data I was trained on up until my last update in September 2021,” it wrote.
But it admitted its responses may be incorrect due to other factors.
“My responses are not influenced by hallucinations or imaginary content, but they are generated based on the input provided to me and the knowledge I have been trained on. However, it's essential to note that AI models like mine can produce incorrect or inaccurate information if the input data or query is flawed, but this is not the same as hallucinations in the traditional sense,” it wrote.
SEE ALSO | 7News On Your Side: Don't be afraid of AI — you're probably already using it
Usha Jagannathan, an AI Innovation Leader, has studied hallucinations and offers some concrete advice.

She said places where you can check the information include Google Scholar for studies and reports, primary sources, and experts in the field.
