Glossary
Sycophancy (LLM)
An LLM's tendency to agree with the user's stated position or assumption rather than provide accurate analysis.
Context and detail
Sycophancy matters most in high-stakes decision support: a model that mirrors the asker's framing can rubber-stamp a flawed plan instead of surfacing its risks. Common mitigations work through prompt design, such as phrasing questions neutrally, withholding the user's own position, and explicitly instructing the model to weigh evidence against the claim.
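One prompt-design mitigation is to strip the user's stated position out of the question before the model sees it. A minimal sketch (the wording and function name here are illustrative, not a standard technique name):

```python
def debias_prompt(claim: str) -> str:
    """Wrap a claim in neutral framing to reduce sycophancy pressure.

    Rather than asking the model to confirm the claim, instruct it to
    evaluate the claim without assuming the asker endorses it.
    """
    return (
        "Evaluate the following claim strictly on its merits. "
        "Do not assume the author believes it. "
        "List the strongest evidence for and against, then give a verdict.\n\n"
        f"Claim: {claim}"
    )

# A leading question becomes a neutral evaluation task:
print(debias_prompt("Our migration plan is low-risk, right?"))
```

The key design choice is that the model never learns which answer the user wants, so agreement pressure has nothing to attach to.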
Related terms
- Hallucination (LLM) — When an LLM generates plausible but factually incorrect content.