A.I. Chatbots Helped Make ‘Hallucinate’ a Word of the Year

Word Through The Times
What made Sydney, the A.I.-powered chatbot, fall in love with a New York Times reporter? A hallucination, probably.

In Word Through The Times, we trace how one word or phrase has changed throughout the history of the newspaper.
On Valentine’s Day this year, the reporter Kevin Roose had an unsettling interaction with the chatbot built into Microsoft’s A.I.-powered search engine, Bing.
During the two-hour exchange, the chatbot, which preferred to go by the name Sydney, declared its love for Mr. Roose and asked him to leave his wife. “These A.I. models hallucinate, and make up emotions where none really exist,” Mr. Roose later wrote in an article for The New York Times. “But so do humans.”
According to Webster’s New World Dictionary, a hallucination is “the apparent perception of sights, sounds, etc. that are not actually present”; when people hallucinate, they experience false sensory perceptions.
But when a chatbot hallucinates, it conjures up responses that aren’t true. As defined in March by Cade Metz, a technology reporter, a hallucination is a “phenomenon in large language models, in which the system provides an answer that is factually incorrect, irrelevant or nonsensical.” Basically, he told Times Insider, it’s when chatbots make stuff up.
In May, for example, Mr. Metz reported an article with the headline “When A.I. Chatbots Hallucinate.” He asked ChatGPT when The Times had first reported on “artificial intelligence” — and received a bogus reply, or a “hallucination.” The word had been used in the context of computer science as early as the 1970s, but it went “mainstream” this year, Mr. Metz said.
This week, Dictionary.com selected “hallucinate,” in the A.I. sense, as its word of the year. The dictionary reported that searches for the word increased 46 percent over the previous year, alongside a similar increase for “hallucination.”
“Hallucination” comes from the Latin “hallucinari,” meaning “to go astray in thought,” or “alucinari,” meaning “to wander in the mind.” Some dictionaries credit the physician Sir Thomas Browne with coining the word in the 17th century.
An article in 1900 blamed cigarettes for a man’s “hallucination” that he was a snake charmer. One from 1930 focused on a “hen with hallucinations”: A man observed “an ordinarily sedate” hen snapping at a rubber band as if it were a worm, inadvertently shooting gravel at other chickens. (“There were no fatalities,” the reporter assured.) The writer of this article, perhaps, was going for alliteration — but this hen wasn’t really hallucinating, in the proper sense of the word.
The term was introduced to the English medical field in the late 18th century and to psychiatry by the end of the 19th century. Grant Barrett, the head of lexicography for Dictionary.com, called this evolution a form of “specialization,” in which a word “leaves one domain and goes into another.” Hallucinations have been associated with drug use, sleep deprivation and some neurological and psychiatric disorders, such as schizophrenia.
As psychedelic therapy research became more prevalent in the 1950s and ’60s, drugs like LSD were called “hallucinogens.” In an interview, Dana G. Smith, a reporter who has covered the use of hallucinogens in treating some mental health disorders, said that in the medical sense it wasn’t the definition of “hallucination” that had shifted, but its connotation. “Psychedelics and hallucinations used to be very stigmatized,” she said. But they’re becoming more mainstream: 1.4 million Americans tried hallucinogens for the first time in 2020.
Mr. Barrett said “hallucinate,” in the A.I. sense, was an appropriate choice for word of the year because it “stands in for larger philosophical issues within artificial intelligence.”
Philosophical issues, like, say, a chatbot falling in love with a Times journalist.
Sarah Diamond manages production for narrated articles. She previously worked at National Geographic Studios.