Learning Hardware Security Via Capture-The-Flag Competitions

By Jason M. Fung, Offensive Security Research Manager at Intel

Software security has been studied by many for decades. As attackers find new ways to break through protections, defenders learn and harden their designs accordingly. As it becomes increasingly challenging to find low-hanging fruit in the software layer, attackers naturally move down the stack to look for ways to compromise systems at the hardware layer. It is paramount for system designers to gain proficiency in securing hardware designs and to step up hardware security assurance efforts.

The good news is that through initiatives driven by the industry and collaboration with academia, we now have more resources available to educate hardware designers about secure design and assurance practices. The community-driven Hardware Common Weakness Enumeration (CWE) is an excellent example of this kind of industry effort. The latest CWE 4.2 release offers a catalog of 75 commonly overlooked mistakes that undermine the security robustness of a hardware design. Each entry includes illustrative examples along with guidance for identifying and mitigating the concerns. This valuable primer enables designers to methodically learn from the weakness patterns and address relevant gaps in their products.

People acquire and master skills in different ways. Security education through an industry primer works well for some, while others may find it easier to harness critical skills through a hands-on, collaborative effort.

Capture the Flag (CTF) competitions have always been an engaging tool to help participants learn, practice, and share hacking skills with one another. Organizers hide secrets, or “flags,” in a target system protected by layers of security controls and challenges, while participants compete to find as many flags as fast as they can. While traditional CTFs do cover a broad set of targets and skills, hardware design is an area that had long been overlooked.

Solid Collaboration Between Industry and Academia

Hack@DAC and Hack@Sec are hardware-specific CTF competitions that offer fun and educational ways to learn about the security mistakes commonly made by hardware designers as they develop complex products such as systems-on-chip (SoCs). The first of their kind in the industry, these CTFs are the result of strong industry-academia partnerships, fostered through a long history of successful collaborations.

A co-organizer of the hardware CTFs, Professor Ahmad-Reza Sadeghi leads the System Security Lab at Technische Universität Darmstadt in Germany and has collaborated on security research projects with Intel for more than a decade. Most recently, he has played an influential role as the Director of the Intel Collaborative Research Institute, leading a group of international researchers working on resilient autonomous systems.

Professor Jeyavijayan Rajendran runs the Secure and Trustworthy Hardware Lab at Texas A&M University. His long-standing collaboration with Intel began with a summer research visit in 2012 and led to his partnership with Intel in launching the inaugural Hack@DAC CTF at the Design Automation Conference (DAC) in 2018.

With a shared vision and passion for raising the security capability of the hardware design community, security experts from Intel and these partners from academia collaborate to design a hands-on hacking and learning experience that effectively enables participants to gain a deeper appreciation for the challenges involved in designing security-robust hardware. To date, more than 150 teams have participated in these hardware CTF events. Participants come from diverse backgrounds and areas of domain expertise, from security researchers and university students to hardware designers and EDA tool experts in industry. Many who have taken part are convinced that more work needs to be done as an industry, and some have even been inspired to take on personal missions to lead research and initiatives that make building secure hardware easier.

How Hardware CTF Competitions Work

Organizers start by taking a sophisticated open-source SoC and hardening it with industry-like security protections, before carefully introducing a series of security vulnerabilities representing various Hardware CWEs for participants to find. There are multiple instances of each weakness type throughout the design, across a broad range of difficulty levels, to mimic the realistic challenges faced by SoC verification teams and to appeal to participants with varying expertise.
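To give a concrete, if simplified, flavor of the kind of bug organizers might plant (the real challenges live in the SoC's RTL, and the register and method names below are purely illustrative), consider a lock-protected configuration register where one alternate write path forgets to honor the lock bit:

```python
# Toy model of a lock-protected configuration register.
# Illustrative only: actual Hack@DAC bugs are planted in the SoC's RTL, not Python.

class ConfigRegister:
    """A register whose value should be frozen once its lock bit is set."""

    def __init__(self, value: int = 0):
        self.value = value
        self.locked = False

    def lock(self) -> None:
        # Firmware sets the lock bit early in boot to freeze the setting.
        self.locked = True

    def write(self, value: int) -> None:
        # Correct behavior: reject writes once the register is locked.
        if self.locked:
            raise PermissionError("register is locked")
        self.value = value

    def debug_write(self, value: int) -> None:
        # Planted weakness: this alternate write path never checks the lock
        # bit, so anyone who can reach the debug interface can overwrite a
        # "locked" security setting.
        self.value = value


if __name__ == "__main__":
    reg = ConfigRegister(value=0x1)   # e.g., "secure boot enforced"
    reg.lock()
    try:
        reg.write(0x0)                # blocked, as intended
    except PermissionError:
        pass
    reg.debug_write(0x0)              # bypasses the lock: the bug to find
    assert reg.value == 0x0
```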

The first stage of these competitions is a warmup in which teams have three months to review the SoC design and compete to find as many bugs as they can. For each finding, participants submit a description of the issue, its root cause, the security impact, a valid test case or exploit, and a proposed mitigation. Judges score submissions on quality and completeness, and award bonus points to teams that create and use automated tools to speed up the process. Teams with the highest scores move on to the second round, a live competition during which they use their experience and any tools or techniques developed in the first stage to analyze the same buggy SoC design. This time, however, the design includes new security protections and a new set of security vulnerabilities, and teams have only 48 hours to hack.
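As a hypothetical sketch of what such a submission captures (the competitions do not publish a fixed schema, so the field names and point values below are assumptions for illustration only), a finding and its scoring might be modeled like this:

```python
# Hypothetical representation of a bug-report submission; the field names
# and weights are illustrative assumptions, not the official rubric.

from dataclasses import dataclass

@dataclass
class BugSubmission:
    description: str        # what the issue is
    root_cause: str         # where and why it occurs in the design
    security_impact: str    # what an attacker gains
    test_case: str          # a valid test case or exploit demonstrating it
    mitigation: str         # proposed fix
    used_automation: bool   # whether automated tooling helped find it

def score(sub: BugSubmission) -> int:
    """Toy scoring: completeness of the five fields plus an automation bonus."""
    fields = [sub.description, sub.root_cause, sub.security_impact,
              sub.test_case, sub.mitigation]
    points = sum(2 for f in fields if f.strip())
    if sub.used_automation:
        points += 3  # assumed bonus for building or using automated tools
    return points
```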

Key Takeaways

Academic researchers have historically focused on a niche set of hardware security problems such as supply chain risks, physical attacks, and cryptographic primitives. While these efforts remain significant, the industry can also benefit from research that helps address mainstream challenges, including systemic mitigations of common hardware weaknesses, automated detection techniques, and secure hardware design patterns. Analyzing a buggy SoC forces participants to uncover and learn about a wide range of often-overlooked hardware security issues, such as misconfigured security settings in embedded firmware and faulty access controls enforced by hardware. Throughout the process, CTF participants learn how logic- and design-related weaknesses can be carelessly introduced by hardware designers, as well as the security impact those vulnerabilities can have if left unchecked.

Hardware CTFs offer environments that mirror the pressure and constraints security assurance teams often experience in the real world, helping participants appreciate practical challenges that might not otherwise be obvious to them. Because more vulnerabilities are inserted into the design than participants can find manually in the allotted time, they come to understand how powerful automated solutions can be in helping organizations become more proactive and productive in secure hardware development. The scarcity of commercial and open-source automation solutions also prompts participants to appreciate the critical gaps faced by practitioners who do this work every day.
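As a deliberately naive illustration of why participants reach for automation (real hardware security scanners analyze the design semantically rather than matching text, and the snippet below is a toy, not a representation of any actual tool), even a crude script can triage register writes that never consult a lock signal:

```python
# Deliberately naive "scanner": flags assignments to *_reg signals in blocks
# that never reference a lock signal. Real tools reason about the design far
# more deeply; this toy only shows why automated triage is attractive.

import re

RTL_SNIPPET = """
always @(posedge clk) begin
  if (wen && !cfg_lock) cfg_reg <= wdata;   // lock honored
end

always @(posedge clk) begin
  if (dbg_wen) cfg_reg <= dbg_wdata;        // missing guard: suspicious
end
"""

def find_unguarded_writes(rtl: str) -> list[str]:
    findings = []
    for block in re.split(r"\n\s*\n", rtl):
        writes = re.findall(r"(\w+_reg)\s*<=", block)
        if writes and "lock" not in block:
            findings.append(f"possible unguarded write to {', '.join(set(writes))}")
    return findings

if __name__ == "__main__":
    for finding in find_unguarded_writes(RTL_SNIPPET):
        print(finding)
```

Even a heuristic this crude hints at the appeal of automation: it never tires, and it can sweep a design far larger than any team can review by hand in 48 hours.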

Building a Foundation for Better Hardware Security

By open-sourcing the SoC framework and bug list to the entire industry, we can extend the value of the CTF competitions beyond the events themselves. The publicly available infrastructure allows researchers to test and benchmark new hardware security scanning tools, develop and demonstrate the value of novel systemic mitigations, experiment with secure design patterns, and continue learning about hardware security weaknesses.

As attackers extend their focus to the hardware layer, improved hardware security practices and capabilities are imperative. Building robust, secure hardware requires more focus and stronger collaboration among industry and academia stakeholders. Hardware CTF competitions offer a fun and educational medium through which participants gain firsthand experience of the challenges hardware designers face every day. In addition to building critical security skills, participants are often inspired to take part in efforts that help the broader community produce safe, secure hardware that can enrich the life of every person on Earth.

About the Author

Jason M. Fung is the Director of Academic Research Engagement and Offensive Security Research at Intel. He has over two decades of experience in product architecture, penetration testing, path-finding research, risk management, and security assurance consultation.

December 11, 2020
