DARPA and ARPA-H unveil AI Cyber Challenge results; Team Atlanta wins $4 million

DARPA / ARPA-H presentation at DEF CON 33 · September 26, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

At DEF CON 33, DARPA and ARPA-H said finalist systems in the AI Cyber Challenge autonomously found and patched hundreds of vulnerabilities in large open-source codebases; the agencies will release the finalist tools as open source and awarded top prizes, including $4 million to Team Atlanta.

At DEF CON 33, Andrew Carney, a program manager at the Defense Advanced Research Projects Agency (DARPA) and the Advanced Research Projects Agency for Health (ARPA-H), announced results from the AI Cyber Challenge (AIxCC), saying finalist teams autonomously discovered and patched hundreds of vulnerabilities in real open-source code and that core finalist technologies will be released as open source.

Carney said the competition used more than 54,000,000 lines of real open-source code across the selected repositories, that teams were given 70 known challenge vulnerabilities, and that they discovered additional previously unknown vulnerabilities. "The average time to patch is under an hour," Carney said, describing final-round performance and faster remediation than in earlier rounds.

Jim O'Neill, deputy secretary of the Department of Health and Human Services, framed the competition in health-sector terms, saying escalating cyberattacks have harmed care and citing 2018–2024 figures of more than 650 ransomware incidents, nearly 89,000,000 patient records affected, and about $22,000,000,000 in downtime costs. O'Neill said ARPA-H has committed $20,000,000 to help translate winning solutions into real-world health uses and urged continued industry and community engagement. "Our AIxCC will accelerate discovery, unlock innovation, and most important, deliver real solutions to patients and providers faster," O'Neill said.

O'Neill and Carney provided slightly different tallies of challenge findings from the stage: Carney said teams found 18 zero-days in the finals, while O'Neill summarized AIxCC results as identifying 52 synthetic zero-days in Java and C and patching 47, and identifying 18 real zero-days and patching 11. The organizations said they are still validating results and completing disclosure to maintainers before releasing full data and artifacts to the public.
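
For scale, the patch rates implied by the tallies O'Neill cited work out roughly as follows; this is an illustrative back-of-the-envelope sketch based on the figures quoted onstage, not data released by the organizers.

```python
# Illustrative arithmetic on the tallies quoted onstage (not official AIxCC data).
synthetic_found, synthetic_patched = 52, 47   # synthetic zero-days in Java and C
real_found, real_patched = 18, 11             # previously unknown (real) zero-days

print(f"Synthetic zero-day patch rate: {synthetic_patched / synthetic_found:.0%}")  # ~90%
print(f"Real zero-day patch rate: {real_patched / real_found:.0%}")                 # ~61%
```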

Stephen Winchell, director of DARPA, said the agency is running 14 programs and investing more than $650,000,000 in computer-security efforts. Winchell and Carney praised the finalist teams for combining large language models with traditional program-analysis techniques, and said four of the competition's cyber reasoning systems (CRSs) are available as open source immediately, with the remaining finalist systems to be released in the coming weeks.

The session also covered resource use and costs: Carney said semifinal teams operated under strict resource limits (for example, a $100 LLM-credit cap per project in early rounds), while the finals removed most constraints; combined finalist spend in the final round was nearly $360,000, roughly 82% of it on LLM credits, producing nearly 2,000,000 model queries.
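
As a rough illustration of what those spend figures imply, assuming the roughly 82% LLM-credit share applies to the nearly $360,000 total (both figures as quoted onstage), the average cost per model query comes out to about 15 cents.

```python
# Rough breakdown of the final-round spend figures quoted onstage (illustrative only).
total_spend_usd = 360_000      # combined finalist spend, "nearly $360,000"
llm_share = 0.82               # roughly 82% reportedly spent on LLM credits
model_queries = 2_000_000      # "nearly 2,000,000 model queries"

llm_spend_usd = total_spend_usd * llm_share
print(f"Estimated LLM-credit spend: ~${llm_spend_usd:,.0f}")                       # ~$295,200
print(f"Estimated average cost per query: ~${llm_spend_usd / model_queries:.2f}")  # ~$0.15
```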

DARPA and ARPA-H said they had added $1,400,000 in additional prize funding to support adoption of the competition technologies. Onstage prize awards named Theori as third place, Trail of Bits as second place (a $3,000,000 prize), and Team Atlanta as first place (a $4,000,000 prize). Carney invited maintainers and project owners to engage with AIxCC via aicaixcc@darpa.mil and said the competition data archive and infrastructure will be released over the coming months.

The presentations emphasized that the finalist systems are intended as tools maintainers and infrastructure owners can use to speed vulnerability discovery and remediation; organizers said they will continue coordinated disclosure and engagement with maintainers before further public releases. Attendees were invited to the AIxCC Experience to inspect generated patches, proofs of vulnerability, and visualizations of system interactions.