Cambridge Dictionary has a new Word of the Year for 2023 - 'hallucinate' - and while you may recognize the term instantly, the boom in generative artificial intelligence (AI) has given it a new meaning.
Traditionally the verb, according to one of the world's most popular dictionaries, is defined as "to seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug." However, thanks to the surge in AI tools in the last year, Cambridge has now given it an added meaning: "when an AI hallucinates, it produces false information."
The Cambridge Dictionary team said it chose the new definition because it "gets to the heart of why people are talking about AI," explaining that while generative AI remains a "powerful tool", it is one we are all still learning how to interact with safely and effectively. "This means being aware of both its potential strengths and its current weaknesses," it said, highlighting the fact that it is capable of producing false information – so-called 'hallucinations' – and presenting this information as fact.
Also known as confabulations, AI hallucinations are often nonsensical, but they can also seem entirely plausible, despite being factually inaccurate or ultimately illogical. One example cited by the U.S. tech firm IBM is Google's Bard chatbot incorrectly claiming that the James Webb Space Telescope had captured the world's first images of a planet outside our solar system. Another was Microsoft's chat AI, Sydney, admitting to falling in love with users.
Some hallucinations can prove to be even more problematic, with Meta removing its Galactica LLM demo in 2022 after it provided users with inaccurate information, some of it rooted in prejudice. Echoing such issues, the dictionary said its choice underlined the importance of keeping a check on how much we rely on AI tools while they remain in their infancy.
"The fact that AIs can 'hallucinate' reminds us that humans still need to bring their critical thinking skills to the use of these tools," said Wendalyn Nichols, Cambridge's publishing manager. "AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray. At their best, large language models can only be as reliable as their training data."
Another point of interest in choosing the word, according to experts, was how we have started to attribute more and more human qualities to AI in the language we use to describe it. "The widespread use of the term 'hallucinate' to refer to mistakes by systems like ChatGPT provides a fascinating snapshot of how we're thinking about and anthropomorphizing AI," said Dr Henry Shevlin, an AI ethicist at the University of Cambridge.
"Inaccurate or misleading information has long been with us, of course, whether in the form of rumors, propaganda, or 'fake news'. Whereas these are normally thought of as human products, 'hallucinate' is an evocative verb implying an agent experiencing a disconnect from reality. This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one 'hallucinating'," he added.
While the explosion in the term's use didn't suggest a widespread belief in AI sentience, according to Shevlin, it underscored our willingness to give AI "human-like attributes." He added: "As this decade progresses, I expect our psychological vocabulary will be further extended to encompass the strange abilities of the new intelligences we're creating."
Alongside 'hallucinate', Cambridge has added several other new words in 2023 related to the rapid developments in AI and computing. They include 'GenAI', an abbreviation for generative AI, and 'large language model', a mathematical representation of language based on very large amounts of data that allows computers to produce language similar to what a human might say.
Outside of technology, several other words also saw spikes in public interest and searches on the Cambridge Dictionary website. Choice examples included 'implosion', thanks to the Titan submersible's implosion, 'grifter', someone who gets money dishonestly by tricking people, and 'GOAT', an abbreviation for Greatest Of All Time. The latter term jumped in popularity thanks to the Qatar World Cup, which provoked new debates about who was the GOAT in football: Lionel Messi, Cristiano Ronaldo, or one of the late greats like Pele or Diego Maradona.
Animation: James Sandifer