'Hallucinate' Chosen As Cambridge Dictionary's Word of the Year – Slashdot

With a word like that, at least we have confirmation as to how clickbait became the source of ‘news’ revenue.
Shocker. /s
Furthermore, kids nowadays!
The word “hallucinate” already became the word of the year several decades ago when LSD began to rise in popularity.
LLMs are neat, but they’re probably not going to transform the way we live and work. As the Cambridge Dictionary’s word of the year reminds us, they’re far too unreliable to be more than a novelty for most applications.
It seems to be a problem of incompleteness. The LLM is an interesting pile of math that can produce output in response to prompts. But even though the human brain does something similar, there are far more elements to human cognitive processing than are at work in LLMs, at least in the current generation of them.
Maybe someday we will be able to introduce a hallucination-prevention mechanism, though I suspect that simply building bigger LLMs is not going to be the way that problem is solved.
This

even though the human brain does something similar
It is extremely unlikely that there are structures in the brain that are similar to transformer networks. You can say that NNs in general are ‘inspired’ by brains, but there is no reason to believe they share anything other than the most trivial similarities. In fact, we can even abandon the analogy completely if we want, and often do for performance reasons.
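For what it’s worth, the core of a transformer layer boils down to a couple of matrix multiplications and a softmax. A minimal NumPy sketch of scaled dot-product attention, with toy dimensions and no claim to match any real model:

    import numpy as np

    def attention(q, k, v):
        # Scaled dot-product attention: a matrix product, a softmax,
        # and another matrix product. Nothing here resembles a neuron.
        scores = q @ k.T / np.sqrt(k.shape[-1])
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)   # softmax over the keys
        return w @ v                         # weighted average of the values

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))              # 4 toy "tokens", 8 dimensions each
    print(attention(x, x, x).shape)          # (4, 8)

It’s linear algebra all the way down; the ‘neural’ vocabulary is historical, not anatomical.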

Maybe someday we will be able to introduce a hallucination-prevention mechanism
The problem is that so-called ‘hallucinations’ are exactly the kind of behavior we should expect, given how models of this kind function internally.
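To make that concrete: an autoregressive model’s decoding loop just keeps emitting whichever token looks probable next, and there is no step where the claim gets checked against reality. A toy Python sketch of that loop (the ‘model’ here is a random stand-in, purely illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    vocab = ["Paris", "Atlantis", "is", "the", "capital", "of", "."]

    def next_token_probs(context):
        # Stand-in for a trained model: it always returns *some* distribution
        # over the vocabulary, whatever the context. A real LLM does the same,
        # just with billions of parameters shaping the numbers.
        logits = rng.normal(size=len(vocab))
        return np.exp(logits) / np.exp(logits).sum()

    context = ["The", "capital", "of", "Freedonia", "is"]
    for _ in range(4):
        p = next_token_probs(context)
        context.append(vocab[rng.choice(len(vocab), p=p)])

    print(" ".join(context))   # fluent continuation; truth was never consulted

Confident, fluent, and unmoored from facts isn’t a malfunction of the sampling loop; it’s the sampling loop doing exactly what it was built to do.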
If you intend to convince Greed that good-enough machines operating 24/7 aren’t worth the replacement investment for those good-enough meatsacks always bitching about more time off, more money, and more benefits... being arrogant enough to demand sleep every 18 hours or so... you’re gonna have to speak a lot LOUDER than that.
I’d suggest you speak in money, with a metric fuckton of brogue. It’s the only recognized language and dialect.
Looking at the source website, they seem to be quite worried that people will just ask an AI to define words and give example sentences in the future, taking business away from dictionaries.
https://dictionary.cambridge.o… [cambridge.org]
I suspect, though, that much of that business has already gone away because you can just google a word to get a definition. The only dictionary I ever use now is a Japanese-to-English one, all the data for which is free (I pay for the app because it’s good).
Cambridge being worried about business revenue from dictionary lookups? Ranks right up there with Harvard poor-mouthing.
There’s an entire university wrapped around that dictionary, with 500M+ in cash on hand and a few billion in assets.
You forgot to say that LLMs are nevertheless “powerful.”
From this February:
“In the United States, there is the state of New Guinea. This state is located in the southeastern corner of the country and is bordered by Georgia, South Carolina, and North Carolina. New Guinea is known for its beautiful beaches, mountains, and forests, and is home to the Appalachian Trail.”
But nobody is advertising 7th graders as having the answers to all your questions.

But nobody is advertising 7th graders as having the answers to all your questions.
Ironically enough, humans won’t have the answer when the machine eventually does have all the answers.
The correct answers.
Machines get access to learning. Children get access to indoctrination.
Hallucination refers to a sensory effect. The psychological term that matches this well-known phenomenon best is actually âoeconfabulationâ which refers to making up stuff while believing it.
âoeHallucinationâ is the word that caught, though.

Hallucination refers to a sensory effect. The psychological term that matches this well-known phenomenon best is actually âoeconfabulationâ which refers to making up stuff while believing it.

âoeHallucinationâ is the word that caught, though.

LOL – Am I hallucinating, or did slashdot mangle your word?
No, it mangled his quote marks. Welcome to Slashdot and its complete lack of support for Unicode. Whenever somebody types in text that includes “smart quotes” (as some apps helpfully insert automatically), we get this.
“Or delusion. (Though, AFAIK, it doesn’t have a verb form.)”
To delude. Although if you want to indicate someone having the delusion, you want the passive voice: to be deluded.
“you can’t say “Joe is deluding” and have it be in the same sense.”
As I said, you have to use the passive voice: “Joe is deluded.” The verb form of hallucination, hallucinate, means “having a hallucination,” while the verb form of delusion, delude, means “*inflicting* a delusion.” They both have verb forms; the verb forms simply have different uses.
Word of the Decade, and not just for AI.