📚 Step 9: AI/LLM
Hallucination
📌 One-line summary
When a model fabricates plausible-sounding but false information.
💡 Easy explanation
When an AI confidently invents things that aren't true. It sounds plausible but is wrong, so always verify.
✨ Example
Q. What city was King Sejong born in?
A. Busan (stated confidently) ⚠️
✅ Actual answer: Hanseong (present-day Seoul)
A plausible-sounding but false answer
⚡ Vibe coding prompt examples
>_
Design an automated eval pipeline that measures hallucination rate, covering both ground-truth and open-ended questions.
>_
Beyond "say I don't know," list system-prompt patterns that reduce hallucinations.
>_
Lay out a procedure for reproducing and debugging a hallucination once it's found, including what to log.
Try these prompts in your AI coding assistant!
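To make the first prompt concrete, here is a minimal sketch of a ground-truth hallucination eval. The `ask_model` function is a hypothetical stand-in for a real LLM API call (it returns a canned answer for demonstration); the core idea is that abstentions ("I don't know") are excluded, and the hallucination rate is the fraction of confident answers that match no acceptable ground-truth answer.

```python
def ask_model(question: str) -> str:
    # Placeholder: in practice, replace this with a call to your LLM API.
    canned = {"What city was King Sejong born in?": "Busan"}
    return canned.get(question, "I don't know")

def hallucination_rate(eval_set) -> float:
    """eval_set: list of (question, set_of_acceptable_answers)."""
    wrong = 0
    answered = 0
    for question, acceptable in eval_set:
        answer = ask_model(question)
        if "don't know" in answer.lower():
            continue  # abstentions are not hallucinations
        answered += 1
        # A confident answer matching no ground-truth string is a hallucination.
        if not any(a.lower() in answer.lower() for a in acceptable):
            wrong += 1
    return wrong / answered if answered else 0.0

eval_set = [
    ("What city was King Sejong born in?", {"Hanseong", "Seoul"}),
]
print(hallucination_rate(eval_set))  # the canned "Busan" counts as a hallucination
```

String matching is the simplest scoring choice; open-ended questions (as the prompt notes) need a stronger judge, such as an LLM grader or entailment check.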