Engineering Trustworthy Systems: A Principled Approach to Cybersecurity


Communications of the ACM
Contributed Articles
By O. Sami Saydjari

“Students of cybersecurity must be students of cyberattacks and adversarial behavior.”

 Cyberattacks are increasing in frequency, severity, and sophistication. Target systems are becoming increasingly complex with a multitude of subtle dependencies. Designs and implementations continue to exhibit flaws that could be avoided with well-known computer-science and engineering techniques. Cybersecurity technology is advancing, but too slowly to keep pace with the threat. In short, cybersecurity is losing the escalation battle with cyberattack. The results include mounting damages in the hundreds of billions of dollars, erosion of trust in conducting business and collaboration in cyberspace, and risk of a series of catastrophic events that could cause crippling damage to companies and even entire countries. Cyberspace is unsafe and is becoming less safe every day.


The cybersecurity discipline has created useful technology against aspects of the expansive space of possible cyberattacks. Through many real-life engagements between cyber-attackers and defenders, both sides have learned a great deal about how to design attacks and defenses. It is now time to begin abstracting and codifying this knowledge into principles of cybersecurity engineering. Such principles offer an opportunity to multiply the effectiveness of existing technology and mature the discipline so that new knowledge has a solid foundation on which to build.

“Engineering Trustworthy Systems” contains 223 principles organized into 25 chapters. This article addresses 10 of the most fundamental principles, which span several important categories, and offers rationale and some guidance on applying them to design. Under each primary principle, related principles are also included as part of the discussion.


For those inclined to read more in “Engineering Trustworthy Systems,” each stated principle is followed by a reference of the form “{x.y}”, where x is the chapter in which the principle appears and y indicates that it is the y-th principle listed in that chapter (the principles are not explicitly numbered in the book).

Motivation

Society has reached a point where it is inexorably dependent on trustworthy systems. Just-in-time manufacturing, while achieving great efficiencies, creates great fragility to cyberattack, amplifying risk by allowing effects to propagate across multiple systems {01.06}. This means the potential harm from a cyberattack is increasing and now poses an existential threat to institutions. Cybersecurity is no longer the exclusive realm of geeks and nerds; it must now be managed as an essential risk alongside the other major risks to the existence of those institutions.


The need for trustworthy systems extends well beyond pure technology. Virtually everything is a system from some perspective. In particular, essential societal functions such as the military, law enforcement, courts, societal safety nets, and the election process are all systems. People and their beliefs are systems and form a component of larger societal systems, such as voting. In 2016, the world saw cyberattacks transcend technology targets to reach wetware: human beliefs and propensity to action. The notion of hacking democracy itself came to light, posing an existential threat to entire governments and ways of life through what is sometimes known by the military as influence operations {24.09}.


Before launching into the principles, one more important point needs to be made: Engineers are responsible for the safety and security of the systems they build {19.13}. In a conversation with my mentor’s mentor, I once made the mistake of using the word “customer” to refer to those using the cybersecurity systems we were designing. I will always remember him sharply cutting me off and telling me that they were “clients, not customers.” He said, “Used-car salesmen have customers; we have clients.” Like doctors and lawyers, engineers have a solemn and high moral responsibility to do the right thing and to keep those who use our systems safe from harm to the maximum extent possible, while informing them of the risks they take when using our systems.


In “The Thin Book of Naming Elephants,” the authors describe how the National Aeronautics and Space Administration (NASA) shuttle-engineering culture slowly and unintentionally transmogrified from adherence to a policy of “safety first” into one of “better, faster, cheaper.” This change discouraged engineers from speaking truth to power, including honestly estimating the actual probability of shuttle-launch failure. Management needed the probability of launch failure to be less than 1 in 100,000 to allow a launch. Any other answer was an annoyance that interfered with keeping launches on schedule. In an independent assessment, Richard Feynman found that when engineers were allowed to speak freely, they calculated the actual failure probability to be 1 in 100. This engineering cultural failure killed many great and brave souls in two separate shuttle accidents.


I wrote “Engineering Trustworthy Systems” and this article to help enable and encourage engineers to take full charge of explicitly and intentionally managing system risk, from the ground up, in partnership with management and other key stakeholders.


About the Author:

O. Sami Saydjari is Founder and President of the Cyber Defense Agency, Inc., Clarksville, MD, USA.

See also: “Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time”