Shameless Guesses, Not Hallucinations

Source: Astral Codex Ten
by Scott Alexander

“I hate the term ‘hallucinations’ for when AIs say false things. It’s perfectly calculated to mislead the reader — to make them think AIs are crazy, or maybe just have incomprehensible failure modes. AIs say false things for the same reason you do. At least, I did. In school, I would take multiple choice tests. When I didn’t know the answer to a question, I would guess. Schoolchild urban legend said that ‘C’ was the best bet, so I would fill in bubble C. … So the interesting question isn’t why AIs hallucinate: during training, guessing correctly is rewarded, guessing incorrectly isn’t punished, so the rational strategy is to always guess (and increase your chance of being right from 0 to 0.001%). Since AIs in normal consumer use follow the strategies they learned during training, they guess there too.” (03/16/26)

https://www.astralcodexten.com/p/shameless-guesses-not-hallucinations
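The incentive argument in the quote can be made concrete with a toy expected-value sketch (not from the article; the function and parameter names here are illustrative assumptions): under a grading rule that rewards correct answers and never penalizes wrong ones, guessing weakly dominates abstaining no matter how unlikely the guess is to be right.

```python
# Toy illustration (not from the article): expected score for guessing vs.
# abstaining under a grading rule that rewards correct answers and, by
# default, gives zero for both wrong answers and abstentions.

def expected_score(p_correct: float, guess: bool,
                   reward: float = 1.0, penalty: float = 0.0) -> float:
    """Expected score on one question.

    Abstaining always scores 0. Guessing scores `reward` with probability
    p_correct and -`penalty` otherwise.
    """
    if not guess:
        return 0.0
    return p_correct * reward - (1.0 - p_correct) * penalty

# With no penalty for wrong answers, even a near-hopeless guess beats silence.
assert expected_score(0.00001, guess=True) > expected_score(0.00001, guess=False)

# Adding a penalty for wrong answers flips the incentive at low confidence,
# which is the training-time change the argument points toward.
assert expected_score(0.00001, guess=True, penalty=0.25) < 0.0
```

Under the default (penalty-free) rule the rational strategy is exactly the one the quote describes: always fill in a bubble.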