The Fundamental Limits

7 December 2025 · 1755 words · 9 mins

Tags: Hallucinations · Indeterminacy · Grounding · RAG · Citation-Verification · Knowledge-Graphs · Legal-AI-Accuracy · Shepards · KeyCite · ABA-Opinion-512 · Sanctions · Stanford-RegLab

Why hallucination is an architectural feature of LLMs, not a bug — and what that means for legal AI