Tying an AI system's output to verified external sources (documents, databases, search results) so its claims can be checked against them, reducing hallucination. Retrieval-augmented generation (RAG) is the most common technique.
"The agent is grounding every claim with a citation now. Nice UX improvement."
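The idea can be sketched in a few lines: retrieve a supporting passage, then attach its identifier to the answer as a citation. This is a minimal illustration, not a real RAG pipeline; the corpus, the word-overlap scoring, and the `retrieve`/`grounded_answer` names are all hypothetical stand-ins for an actual retriever and language model.

```python
# Toy corpus standing in for an external document store (hypothetical).
CORPUS = {
    "doc1": "The Eiffel Tower is located in Paris, France.",
    "doc2": "Mount Everest is the highest mountain above sea level.",
}

def retrieve(query: str) -> str:
    """Return the id of the document sharing the most words with the query.

    A real system would use embeddings or a search index; plain word
    overlap keeps the sketch self-contained.
    """
    query_words = set(query.lower().split())
    def overlap(doc_id: str) -> int:
        return len(query_words & set(CORPUS[doc_id].lower().split()))
    return max(CORPUS, key=overlap)

def grounded_answer(query: str) -> str:
    """Answer by quoting the retrieved passage and citing its source id."""
    doc_id = retrieve(query)
    return f"{CORPUS[doc_id]} [{doc_id}]"

print(grounded_answer("Where is the Eiffel Tower?"))
```

The point is the shape of the output: every claim carries a citation back to a verifiable source, which is what the quote above calls out as a UX win.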