πŸ“– Step 9: AI/LLM#276 / 291

Hallucination

πŸ“–One-line summary

When a model fabricates plausible-sounding but false information.

πŸ’‘Easy explanation

When an AI confidently makes up things that aren't true. The answer sounds plausible but is wrong, so always verify.

✨Example

Q. What city was King Sejong born in?

A. Busan (stated confidently) ⚠️

βœ• Actual answer: Hanseong (today's Seoul)

A plausible-sounding but false answer.

⚑Vibe coding prompt examples

>_

Design an automated eval pipeline that measures hallucination rate, covering both ground-truth and open-ended questions.
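As a starting point for this prompt, here is a minimal sketch of such a pipeline. Everything here is an assumption for illustration: the `model` callable, the toy QA data, and the pluggable `judge` for open-ended questions are all hypothetical.

```python
# Minimal hallucination-rate eval sketch (hypothetical data and model).
# Ground-truth questions are scored by normalized substring match;
# open-ended questions are scored by a pluggable judge function.

def normalize(text: str) -> str:
    return " ".join(text.lower().split())

def eval_ground_truth(model, qa_pairs):
    """Return hallucination rate over (question, expected_answer) pairs."""
    wrong = 0
    for question, expected in qa_pairs:
        answer = model(question)
        if normalize(expected) not in normalize(answer):
            wrong += 1
    return wrong / len(qa_pairs)

def eval_open_ended(model, questions, judge):
    """judge(question, answer) -> True if the answer is unsupported."""
    flagged = sum(judge(q, model(q)) for q in questions)
    return flagged / len(questions)

if __name__ == "__main__":
    # Toy "model" that gets one of two facts wrong.
    fake_model = {"What city was King Sejong born in?": "Busan",
                  "2 + 2 = ?": "4"}.get
    qa = [("What city was King Sejong born in?", "Hanseong"),
          ("2 + 2 = ?", "4")]
    print(eval_ground_truth(fake_model, qa))  # 0.5
```

In a real pipeline the substring check would be replaced by a stricter matcher, and `judge` would typically be a second model grading whether each claim is supported.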

>_

Beyond "say I don't know," list system-prompt patterns that reduce hallucinations.

>_

Lay out a procedure for reproducing and debugging a hallucination once it's found, including what to log.
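For this prompt, a minimal sketch of what to capture per request so a hallucination can be replayed later. The field names and the `log_generation` helper are assumptions for illustration.

```python
# Sketch: log everything needed to replay a hallucinated response.
import datetime
import hashlib
import json

def log_generation(prompt, response, *, model_id, temperature, seed=None,
                   system_prompt="", retrieved_docs=()):
    """Build a JSON line sufficient to reproduce a generation."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "model_id": model_id,        # exact model/version string
        "temperature": temperature,  # sampling params drive variance
        "seed": seed,                # needed for exact replay, if supported
        "system_prompt": system_prompt,
        "prompt": prompt,
        "prompt_hash": hashlib.sha256(prompt.encode()).hexdigest()[:12],
        "retrieved_docs": list(retrieved_docs),  # RAG context, if any
        "response": response,
    }
    return json.dumps(record, ensure_ascii=False)

line = log_generation("Where was King Sejong born?", "Busan",
                      model_id="example-model-v1", temperature=0.7)
print(line)
```

With records like this, debugging becomes: replay the prompt with the same model version and sampling settings, then vary one field at a time (temperature, retrieved context, system prompt) to find what triggers the fabrication.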

Try these prompts in your AI coding assistant!