Hallucination
/həˌluːsɪˈneɪʃən/
Definition
When an LLM generates plausible-sounding but incorrect or fabricated information that is not grounded in its training data or the provided context.
Example in context
"The model hallucinated a non-existent citation — always validate LLM outputs against authoritative sources."
Practice this term
Master Hallucination in context by working through exercises in the Data Science & ML module. You'll see the term used in real engineering scenarios with multiple-choice, fill-in-the-blank, and matching drills.