LiteLLM — CVE-2026-42208 SQL Injection Exploited Within 36 Hours
AI relevance: LiteLLM is the most widely deployed open-source AI gateway, centralizing API keys and routing across OpenAI, Anthropic, AWS Bedrock, Azure OpenAI, and dozens of other model providers — making it a single point of failure for AI infrastructure credentials.
- CVE-2026-42208 is a SQL injection in the authentication path of BerriAI's LiteLLM proxy; depending on deployment configuration, it allows database reads with or without valid authentication.
- Attackers exploited the flaw in the wild within roughly 36 hours of the advisory, before many teams completed triage.
- The attack targeted LiteLLM's credential tables, which typically store OpenAI organization keys, Anthropic workspace credentials, AWS Bedrock IAM roles, and virtual API keys.
- Because LiteLLM acts as a centralized routing layer, a single compromised proxy can expose credentials for every model provider an organization uses.
- The blast radius extends beyond data leakage: stolen provider keys enable unmetered model usage billed to the victim, access to fine-tuned models, and potential lateral movement into cloud consoles.
- LiteLLM's ubiquity (thousands of production deployments) made it a high-value, low-friction target for credential-hunting operations.
Why it matters
AI gateways are becoming the identity and access management layer for LLM usage. When a proxy layer stores every provider key in a single database, a SQL injection in that layer is equivalent to a credential store breach. The 36-hour exploitation window demonstrates that attackers are actively scanning for vulnerable AI infrastructure, not waiting for public PoCs.
What to do
- Update LiteLLM to the patched version immediately (BerriAI/litellm).
- Rotate all LLM provider API keys stored in or accessible from the LiteLLM proxy database.
- Audit LiteLLM database access logs for suspicious queries targeting credential tables.
- Consider deploying LiteLLM behind a WAF with SQL injection rules as defense in depth.
- Separate provider credential storage from the proxy database using a secrets manager (Vault, AWS Secrets Manager) with short-lived tokens.
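For the log-audit step above, a quick first pass can flag query text containing common injection markers. The patterns below are illustrative, not exhaustive, and should be tuned to your actual log format:

```python
import re

# Common SQL injection indicators; illustrative, not exhaustive.
SQLI_PATTERNS = re.compile(
    r"('\s*OR\s*'1'\s*=\s*'1)|(UNION\s+SELECT)|(;\s*DROP\s+TABLE)",
    re.IGNORECASE,
)


def flag_suspicious(log_lines):
    """Return log lines whose query text matches a known injection marker."""
    return [line for line in log_lines if SQLI_PATTERNS.search(line)]
```

Treat this as triage, not detection: confirmed hits against credential tables should escalate straight to the key-rotation step.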
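The last step, moving credentials out of the proxy database, usually pairs a secrets manager with short-lived tokens so a leaked copy expires quickly. A minimal sketch of that caching pattern, where the `fetch` callable stands in for a real secrets-manager client (Vault, AWS Secrets Manager) and the names are hypothetical:

```python
import time
from typing import Callable, Optional


class ShortLivedCredential:
    """Cache a provider key fetched from a secrets manager and re-fetch
    it after a short TTL, so any leaked copy goes stale quickly.
    `fetch` is a stand-in for a real client call (e.g. a Vault read)."""

    def __init__(self, fetch: Callable[[], str], ttl_seconds: float = 300.0):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._value: Optional[str] = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.monotonic()
        if self._value is None or now >= self._expires_at:
            self._value = self._fetch()        # hit the secrets manager
            self._expires_at = now + self._ttl
        return self._value
```

The proxy then calls `get()` at request time instead of reading a key column from its own database, so a SQL injection against the proxy's tables no longer yields long-lived provider credentials.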