The Big Picture
Communications of the ACM, November 2018, Vol. 61 No. 11, Pages 24-26
Inside Risks
By Steven M. Bellovin, Peter G. Neumann

“Cryptography is an enormously useful concept for achieving trustworthy systems and networks; unfortunately, its effectiveness can be severely limited if it is not implemented in systems with sufficient trustworthiness.

Trustworthiness is a total-system problem. That is, trustworthiness must consider not just attributes of individual elements, but also how they compose and interact. It is not uncommon for systems to fail even when every individual component is correct and seems locally secure.

In operation, user wisdom and sensible behavior are often assumed (instead of building people-tolerant systems), and the creativity and power of malicious misuse and malware are inadequately considered. Thus, trustworthiness must anticipate all sorts of human behavior, as well as environmental disruptions.

It is time to get serious about the dearth of trustworthy systems and the lack of deeper understanding of the risks that result from continuing on a business-as-usual course.”
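
The opening point of the quoted passage, that sound cryptography cannot rescue an untrustworthy surrounding system, can be made concrete. The following minimal sketch is ours rather than the column's, and all names in it are illustrative. It uses a one-time pad, which is information-theoretically secure for a single message under a fresh random key, yet the surrounding system gives the key away through an ordinary debug log:

    import logging
    import secrets

    logging.basicConfig(level=logging.DEBUG)

    def encrypt_otp(key: bytes, plaintext: bytes) -> bytes:
        # One-time pad: provably secure for a single message when the
        # key is random, at least as long as the plaintext, kept
        # secret, and never reused.
        assert len(key) >= len(plaintext)
        return bytes(p ^ k for p, k in zip(plaintext, key))

    key = secrets.token_bytes(32)
    # The cipher is sound; the system is not: a stray diagnostic
    # writes the key where any reader of the logs can recover it.
    logging.debug("session key = %s", key.hex())
    ciphertext = encrypt_otp(key, b"attack at dawn")

No amount of cryptographic strength in encrypt_otp compensates for the leaked key; trustworthiness is a property of the whole system.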

Previous Communications Inside Risks columns have discussed specific types of risks (to safety, security, reliability, and so on) and specific application areas (for example, critical national infrastructures, election systems, autonomous systems, the Internet of Things, artificial intelligence, machine learning, cybercurrencies and blockchains, all of which are riddled with security problems). We have also considered the deleterious misuse of social media, malware, malicious drones, threats to privacy, fake news, and the meaning of “truth.” All of these issues and many more must be considered proactively as part of the development and operation of systems with trustworthiness requirements.

We consider here certain overarching and underlying concepts that must be better understood and more systematically confronted, sooner rather than later. Some are more or less self-evident, some may be debatable, and others may be highly controversial.

  • A preponderance of flawed hardware-software systems, which limits the development of trustworthy applications and impedes both accountability and the forensics-worthy rapid identification of culprits and causes of failures.
  • A lack of understanding of the properties of composed systems: components that seem secure locally may, when combined, yield an insecure whole (a concrete sketch follows this list).
  • A lack of discipline in the constructive use of computer science, physical science, technology, and engineering, which hinders progress in trustworthiness, even as new applications, widgets, and snake-oil-like hype continue apace without much concern for sound usability.
  • A lack of appreciation for the wisdom that can be gained from science, engineering, and scientific methods, which impedes progress, especially where that wisdom is clearly relevant.
  • A lack of understanding of short-term and long-term risks among leaders in government and business, which is becoming critical, compounded by their willingness to believe that today’s sloppy systems are good enough for critical uses.
  • A widespread failure to understand these risks, which is ominous, as history suggests they will continue to recur pervasively.
  • A general lack of awareness and education relating to all of these issues, which demands considerable rethinking.
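
The composition problem noted in the list above has a classic concrete instance: a time-of-check-to-time-of-use (TOCTOU) race. In this sketch (ours, not the column's), each call behaves exactly as its specification promises, yet the combination is exploitable:

    import os

    def read_if_allowed(path: str) -> str:
        # os.access() correctly reports whether the real user may read
        # `path`, and open() correctly opens whatever `path` names at
        # the instant it runs. Each component is "locally secure."
        if os.access(path, os.R_OK):        # check
            # Race window: between the check and the open, an attacker
            # who controls the enclosing directory can replace `path`
            # with a symlink to a file the caller must not read.
            with open(path) as f:           # use
                return f.read()
        raise PermissionError(path)

The usual repair is to eliminate the separate check and let open() itself fail under the caller's effective credentials; the broader lesson is that security must be argued about the composition, not just the parts.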

About the Authors:

Steven M. Bellovin is a professor of computer science at Columbia University and affiliate faculty at its law school.

Peter G. Neumann is Chief Scientist of the SRI International Computer Science Lab and moderator of the ACM Risks Forum.

Both authors have been co-authors of several of the cited NRC study reports, as well as of Keys Under Doormats.