ProjectDiscovery — AI Code Deluge Overwhelms Security Teams

AI relevance: AI coding assistants are shipping code faster than security teams can audit it, introducing secrets leakage, supply-chain risks, and business logic vulnerabilities at a rate organizations aren't equipped to handle.

  • ProjectDiscovery surveyed 200 cybersecurity practitioners at mid-size to large enterprises in North America and Western Europe. Only 38% said they are keeping up with the volume of AI-generated code they must review; nearly 60% said the task is getting harder.
  • Top concerns: exposure of corporate secrets via AI tools (78%), supply-chain risks from unreliable AI-selected dependencies (73%), and business logic vulnerabilities — design flaws that let attackers abuse legitimate application functions (72%).
  • European respondents were significantly more concerned about secrets leakage than Americans (87% vs. 72%), likely reflecting GDPR-driven risk awareness.
  • Security staff at mid-sized companies reported more pressure than their peers at large firms, pointing to a resource gap in AI-code review capacity.
  • A cited 2025 National Cybersecurity Alliance report found 43% of employees admitted to entering sensitive company data into AI tools — directly feeding the secrets-leakage risk.
  • ProjectDiscovery's conclusion: defenders are "not opposed to the technology, but they need it to earn its place" through audit trails and access limitations.

Why it matters

The gap between engineering velocity (accelerated by AI coding tools) and security review capacity is widening. This isn't a hypothetical risk — it's the mechanism by which AI-assisted development introduces new attack surfaces: leaked credentials in repos, vulnerable dependencies pulled in by AI suggestions, and architectural flaws that pass code review because they look syntactically correct.

What to do

  • Implement pre-commit scanning for secrets in all repositories, regardless of whether code is human- or AI-written.
  • Require dependency review gates for AI-suggested package additions — don't let AI silently add new transitive dependencies.
  • Build audit trails for AI coding tool usage: which prompts, which files changed, which model versions.
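The first recommendation can be sketched as a minimal secrets scanner suitable for a pre-commit hook. The patterns below are illustrative only, not a production rule set; `scan_text` and its labels are hypothetical names, and a dedicated tool such as gitleaks or detect-secrets is preferable in practice:

```python
import re

# Illustrative rules only; real scanners ship far larger, tuned rule sets.
SECRET_PATTERNS = [
    (re.compile(r"AKIA[0-9A-Z]{16}"), "AWS access key ID"),
    (re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"), "private key material"),
    (re.compile(r"(?i)(api[_-]?key|secret|token)\s*[:=]\s*['\"][^'\"]{16,}['\"]"),
     "hard-coded credential assignment"),
]

def scan_text(text: str, origin: str = "<staged>") -> list[str]:
    """Return a human-readable finding for every pattern hit in `text`."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern, label in SECRET_PATTERNS:
            if pattern.search(line):
                findings.append(f"{origin}:{lineno}: possible {label}")
    return findings
```

Wired into a pre-commit hook (e.g. fed the output of `git diff --cached` and exiting non-zero on any finding), this blocks the commit regardless of whether the code was human- or AI-written.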
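The dependency-gate recommendation amounts to diffing the manifest in a proposed change against the base branch and surfacing any new package names for human review. A minimal sketch, assuming simple `name==version` pins in a requirements-style file (`new_dependencies` is a hypothetical helper; real manifests need a proper parser):

```python
import re

def new_dependencies(base: str, proposed: str) -> list[str]:
    """Package names present in `proposed` requirements text but not in `base`."""
    def names(text: str) -> set[str]:
        result = set()
        for line in text.splitlines():
            line = line.split("#", 1)[0].strip()  # drop comments and blanks
            if line:
                # Cut the name off at the first version/extras specifier.
                result.add(re.split(r"[=<>~!\[]", line, maxsplit=1)[0].strip().lower())
        return result
    return sorted(names(proposed) - names(base))
```

A CI job can fail the build whenever this list is non-empty until a reviewer explicitly approves the additions, so AI-suggested packages never land silently.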
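The audit-trail recommendation can be sketched as one append-only JSON line per AI-assisted change. This is a design sketch, not a vendor API: `audit_record` and its fields are assumptions, and the prompt is stored only as a hash so the trail itself cannot leak secrets pasted into a prompt:

```python
import hashlib
import json
from datetime import datetime, timezone

def audit_record(tool: str, model: str, prompt: str, files_changed: list[str]) -> str:
    """One JSON line recording which tool and model touched which files.

    The prompt is hashed, not stored verbatim, so the audit log cannot
    become a secondary channel for secrets leakage.
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool": tool,
        "model": model,
        "prompt_sha256": hashlib.sha256(prompt.encode()).hexdigest(),
        "files_changed": sorted(files_changed),
    }
    return json.dumps(record, sort_keys=True)
```

Hashing the prompt still lets auditors correlate repeated prompts and tie a change back to a session without retaining the sensitive text itself.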
