I think ‘hallucinating’ is indeed not the correct term for LLMs delivering incorrect content.
Hallucinating: ‘to seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug’ (source: Cambridge Dictionary)
An LLM does not really perceive anything. Rather, it derives wrong information from the information that it has been exposed to.
Confabulation: ‘filling in of gaps in memory by fabrication’ (source: Merriam-Webster)
Hi @Frank-Rene_Schafer ,
The way I see it, these terms are usually defined without strict regard to the meanings they carry in everyday conversation. You may have a point, but in the context of large language models, the term ‘hallucination’ is the one accepted by the community.
Hi @lukmanaj ,
Respectfully, no. A careful, precise, and accurate nomenclature requires taking the linguistic context into account. The term ‘hallucination’ leads people down a wrong path of thinking. Since the term ‘confabulation’ covers the phenomenon very well, no new terminology is required. The usage of ‘hallucination’ in this context should be subject to terminological critique.
I understand the point you’re trying to make, but I would ask you to remember that language itself is fluid, and semantics do change through popular usage.
‘Confabulation’ is a good fit, but if somebody says ‘confabulating’ instead of ‘hallucinating’, the intended meaning may be lost, because the primary dictionary sense of ‘confabulate’ is ‘engage in conversation; talk’.