Seven Teams Hack Their Way to the 2016 DARPA Cyber Grand Challenge Final Competition

OUTREACH@DARPA.MIL
7/8/2015

Seven teams from around the country have earned the right to play in the final competition of DARPA’s Cyber Grand Challenge (CGC), a first-of-its-kind tournament designed to speed the development of automated security systems able to defend against cyberattacks as fast as they are launched. The winners successfully squared off against dozens of other teams for the opportunity to compete head to head next year for nearly $4 million in prizes—and the chance to help revolutionize cybersecurity going forward.

Computers are important for detecting known network vulnerabilities and the swarms of malicious programs that are constantly seeking to take advantage of those weaknesses, but cyber defense today still ultimately depends on experts to patch those weaknesses and stymie new attacks—a process that can take months or longer, by which time critical systems may have been breached. CGC aims to automate the cyber defense process to identify weaknesses instantly and counter attacks in real time.

Out of 104 teams that had originally registered in 2014, 28 teams made it through two DARPA-sponsored dry runs and into last month’s CGC Qualifying Event. In that contest, teams tested the high-performance computers they had built and programmed to play a round of “capture the flag” (CTF)—a game that experts use to test their cyber defense skills. CTF games require competitors to reverse engineer software created by contest organizers and locate and heal its hidden weaknesses in networked competition. The CGC final event will take place in Las Vegas in August 2016, in conjunction with DEF CON, home of the longest-running annual CTF competition for experts.
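To make the "locate and heal its hidden weaknesses" task concrete, here is a deliberately tiny sketch of the find-then-fix loop the competing machines automate. None of this code comes from the competition: the buggy parse_header, the naive random fuzzer, and the patched variant are all hypothetical, simplified stand-ins for the far more sophisticated program-analysis systems the teams actually built.

```python
import random

def parse_header(data: bytes) -> int:
    """Toy 'challenge software': hides a crash when the length byte
    claims more payload than the buffer actually holds."""
    length = data[0]
    checksum = 0
    for i in range(1, 1 + length):
        checksum ^= data[i]  # out-of-bounds read if length >= len(data)
    return checksum

def fuzz(target, tries=10_000, seed=0):
    """Naive random fuzzer: the 'locate the weakness' half of the loop.
    Throws random byte strings at the target until one crashes it."""
    rng = random.Random(seed)
    for _ in range(tries):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 8)))
        try:
            target(data)
        except IndexError:
            return data  # crashing input found
    return None

def parse_header_patched(data: bytes) -> int:
    """The 'heal' half: clamp the length field so the loop can never
    run past the end of the buffer."""
    length = min(data[0], len(data) - 1)
    checksum = 0
    for i in range(1, 1 + length):
        checksum ^= data[i]
    return checksum
```

A CGC system does all three steps, and far more rigorously, without a human in the loop: discover the crashing input, prove it reaches a real flaw, and generate a patch that preserves the program's normal behavior.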

“After two years of asking ‘What if?’ and challenging teams around the world with a very difficult series of preliminary events, we’ve shown that there is a place for computers in an adversarial contest of the mind that until now has belonged solely to human experts,” said Mike Walker, DARPA program manager. “As we had hoped when we launched this competition, the winning teams reflect a broad array of communities—academic pioneers of the field, security industry powerhouses, and veterans of the CTF circuit, each of which brings to CGC its own strengths.”

Each team designed an innovative system that achieves, to varying degrees, the difficult task of finding and fixing software safety problems in the kind of code used everywhere every day. “The results bode well for an exciting competition next year and confirm the value of using a grand challenge format,” Walker said. “With no clear best approach going in, we can explore multiple approaches and improve the chances of producing groundbreaking improvements in cybersecurity technology.”

The CGC Qualifying Event from which the seven winning teams emerged was the first CTF played solely by machines, and it operated at a speed and scale at which only machines can compete. Most CTF events challenge experts to analyze and secure about 10 pieces of software over 48 hours; the CGC Qualifying Event demanded that teams’ machines work on 131 pieces of software—more than any previous CTF event—over just 24 hours. Some teams’ systems secured single pieces of software in less than an hour. Together, the participating teams fixed all 590 flaws in the competition software of which the contest developers were aware.

Most CGC competitors entered on an open track available to self-funded teams, while seven teams participated on a funded track with DARPA support. The three funded-track teams heading to the CGC finals are: 
CodeJitsu (Berkeley, Calif.): A team affiliated with the University of California, Berkeley 
ForAllSecure (Pittsburgh, Pa.): A startup founded by a team of computer security researchers from Carnegie Mellon University 
TECHx (Charlottesville, Va.): Software analysis experts from GrammaTech, Inc., a developer of software assurance tools and advanced cybersecurity solutions, and the University of Virginia 

The four winning open-track teams are: 
CSDS (Moscow, Idaho): A professor and post-doctoral researcher from the University of Idaho 
DeepRed (Arlington, Va.): A team of engineers from the Raytheon Company 
disekt (Athens, Ga.): Four people, working out of a technology incubator, who participate in CTF competitions around the world 
Shellphish (Santa Barbara, Calif.): A group of computer science graduate students at the University of California, Santa Barbara 

Each qualifying team will receive $750,000 to help it prepare over the next 13 months for the CGC final competition. The teams will have the opportunity to access a specialized IT infrastructure, a “digital arena” in which they can practice and refine their systems against dummy opponents that DARPA is providing. For its part, DARPA is developing custom data visualization technology to make it easy for spectators—both a live audience and anyone watching the event’s video stream worldwide—to follow the action in real time during the final contest.

The winning team from the CGC final competition will receive $2 million. Second place will earn $1 million and third place $750,000. More important to Walker than the prize money, however, is igniting the cybersecurity community’s belief that automated cybersecurity analysis and remediation are finally within reach.

“We want an automation revolution in computer security so machines can discover, confirm and fix software flaws within seconds, instead of waiting up to a year under the current human-centric system,” Walker said. “These capabilities are essential for protecting data and processes as more and more devices, including vehicles and homes, get networked in the ‘Internet of things.’”

Additional details about the Cyber Grand Challenge and photos of the finalist teams can be found at www.cybergrandchallenge.com.

Associated images posted on www.darpa.mil and video posted at www.youtube.com/darpatv may be reused according to the terms of the DARPA User Agreement, available here: http://www.darpa.mil/policy/usage-policy.