A Holistic View of Future Risks

Communications of the ACM, October 2020, Vol. 63 No. 10, Pages 23-27
Inside Risks
By Peter G. Neumann

“Pandora’s Cat Is Out of the Barn, and the Genie Won’t Go Back in the Closet.” – Willingness to accept and respond to reality is fundamental to avoiding risks.

 

This column considers some challenges for the future, reflecting on what we might have learned by now—and what we systemically might need to do differently. Previous Inside Risks columns have suggested that some fundamental changes are urgently needed relating to computer system trustworthiness.a Similar conclusions would also seem to apply to natural and human issues (for example, biological pandemics, climate change, decaying infrastructures, social inequality), and—more generally—being respectful of science and evident realities. To a first approximation here, I suggest almost everything is potentially interconnected with almost everything else. Thus, we need moral, ethical, and science-based approaches that respect the interrelations.

 

Some commonalities across different disciplines, consequent risks, and what might need improvement are considered here. In particular, the novel coronavirus (COVID-19) has given us an opportunity to reconsider many issues relating to human health, economic well-being (of individuals, academia, and businesses), domestic and international travel, all group activities (cultural, athletic, and so forth), and long-term survival of our planet in the face of natural and technological crises. However, there are also some useful lessons that might be learned from computer viruses, malware, and inadequate system integrity, some of which are relevant to the other problems—such as computer modeling and retrospective analysis of disasters, supply-chain integrity, and protecting whistle-blowers.
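One lesson mentioned above, supply-chain integrity, has a direct computing analogue: do not trust a received artifact unless it can be checked against an independently published digest. A minimal sketch in Python follows (the file name and expected digest are hypothetical, chosen only for illustration):

import hashlib
import hmac
import sys

def sha256_of(path: str) -> str:
    # Compute the SHA-256 digest of a file, reading it in chunks.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, expected_digest: str) -> bool:
    # Refuse to trust an artifact whose digest differs from the published value.
    return hmac.compare_digest(sha256_of(path), expected_digest.lower())

if __name__ == "__main__":
    # Hypothetical artifact and digest, for illustration only.
    ok = verify("firmware-update.bin",
                "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855")
    sys.exit(0 if ok else 1)

The point is not the particular hash function but the habit: integrity must be checked end to end, whether the artifact is software, a vote tally, or a laboratory result.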

 

A quote from Jane Goodall in an interview in April 2016 seems more broadly relevant here than in its original context: “If we carry on with business as usual, we’re going to destroy ourselves.” The same is true of my quote from the early crypto wars regarding export controls: “Pandora’s Cat Is Out of the Barn, and the Genie Won’t Go Back in the Closet.” We are apparently reaching a crossroads at which we must reconsider potentially everything, and especially how it affects the future.

Priorities Among Competing Goals

Human civilization does not tend to agree on issues such as fairness, equality, safety, security, privacy, and self-determination (for example). With COVID-19, economic well-being, health care, climate change, and other issues (some of which are considered here), if we cannot agree on the basic goals, we will never reach whatever they might have been—especially if the goals appear to compete with each other.

The Importance of Fundamental Principles

Numerous principles for computer system security and integrity have been known for many years, and occasionally practiced seriously. Corresponding principles might be considered more broadly, in the combined context of risks not only in engineering computer-related systems but also in natural systems.

 

Albert Einstein wrote “It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.”b This is often paraphrased as “Everything should be made as simple as possible, but no simpler.” Although the longer statement could be thought of as applicable to trying to explain things as they are (for example, the universe), the simplified version (“should be made”) is also fundamental to the development of new computer systems, as well as in planning proactively for potential catastrophes and collapsing infrastructures.

 

This principle, together with principles relating to transparency, accountability, and scientific integrity, suggests dealing openly and appropriately with risks, while being respectful of science and reality throughout. For example, we tend to make huge mistakes by stressing short-term gains (particularly financial) while ignoring long-term risks (including everything else). Unfortunately, the gains are unevenly distributed, as the rich get richer, and the poor tend to get poorer and suffer much more.

 

The principles relating to completeness are particularly critical to computer system design, implementation, applications, and human interfaces, but also regarding responses to pandemics, climate change, and the planet’s environment—along with their implications for human health and well-being, and resource exhaustion of rare elements.

 

A closely related principle of pervasive holism invokes a big-picture view of the Einstein principle, in which everything is potentially in scope unless explicitly ruled out—for example, for reasons of impossibility or infeasibility, or perhaps because of mistaken decisions about costs, made even when the long-term overall benefits would dramatically outweigh the short-term savings. Pervasive holism represents the ability to consider all relevant factors, and the ensuing risks. It is relevant broadly across many disciplines. For example, it is essential in the design of computer-communication systems. It encourages systems to be designed to compensate for a wide range of threats and adversities, including some that might not be anticipated a priori. Similarly, climate change is causatively linked with extreme weather conditions, melting glaciers, more disastrous fires, human activities, fossil fuels, changes in agriculture, and—with nasty feedback—greater demands for air conditioning and refrigerants such as hydrofluorocarbons that are making the problems worse. On the positive side, atmospheric and sea changes have been observed during the pandemic shutdown (with reduced fuel consumption and much less travel), reinforcing arguments that alternatives to fossil fuels are urgently needed (especially as they are becoming increasingly economical and competitive).
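One small way this principle shows up in computer-communication systems is the fail-safe-default pattern: when a condition arises that the designers did not anticipate, the system should resolve to the safe outcome rather than assume every case was foreseen. A minimal sketch in Python (the users, actions, and policy table are hypothetical):

from dataclasses import dataclass

@dataclass
class Request:
    user: str
    action: str
    resource: str

# Hypothetical, explicitly enumerated permissions; anything not listed is denied.
ALLOWED = {
    ("alice", "read", "telemetry"),
    ("bob", "read", "telemetry"),
}

def authorize(req: Request) -> bool:
    # Fail-safe default: unknown users, actions, resources, or even an
    # unexpected internal error all resolve to denial, the safe outcome.
    try:
        return (req.user, req.action, req.resource) in ALLOWED
    except Exception:
        return False

print(authorize(Request("alice", "read", "telemetry")))    # True
print(authorize(Request("mallory", "write", "telemetry"))) # False: never enumerated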

 

Many nations have clearly realized that careful application of scientific analysis is always desirable, but it can be misused or misapplied. In confronting pandemics, massive immunization programs must be preceded by extensive testing, without which they can have serious consequences (including organ failures, deaths, iatrogenic effects, and in some cases allergic reactions such as anaphylaxis). In pharmaceuticals, some effects are disingenuously called ‘side-effects’—whereas in many cases these effects are well known to have occurred (and are often extensively enumerated in the labeling). However, the effects of deforestation, pesticides, toxic environments (water, air, polluted oceans), non-recyclable garbage, overuse of antibiotics, and so on should by now all be well recognized as long-term risks.

 

With today’s novel coronavirus and its ongoing mutations, a holistic approach requires anticipating human physical and mental health factors, and their interactions with economic factors and social equality (all persons are supposedly created equal, but are usually not treated accordingly—and what about other creatures?), along with future implications, globally rather than just locally. It also requires understanding potential long-term damage—for example, the effects on the heart, brain, and other organs are still unknown. Fully anticipating the consequences of insurance policies that would not cover preexisting conditions is also a major issue, in light of the huge numbers of COVID-19 infections worldwide. Equality in almost everything is desirable, especially in education when home schooling is impossible, broadband access is spotty or nonexistent, and the lack of ubiquitous Internet-accessible devices is a show-stopper for many children. Equal opportunity to vote is also critical, but is being badly abused. Furthermore, spreading disinformation and other forms of disruption can be especially damaging in all of the preceding cases. So, many of these issues are actually interrelated. As one further example of the extent of interrelationships and interlocking dependencies, the realization that Arctic glacial melting is releasing methane and possibly ancient viruses from earlier pandemics is also relevant.

 

Principles involving controllability, adaptability, and predictability require a better understanding of the importance of a priori requirements, as well as of the vagaries of models, designs, development, implementation, and situational awareness in real time. These principles are vital in computer system development. In pandemics, they should help reduce the uncertainties of different approaches to limiting the propagation of contagion, the severity of cases, the duration of disruption, and the extent of acquired immunity; above all, they demand a willingness to accept reality and scientific knowledge.
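To make the point about models and uncertainty concrete, the following minimal sketch of a discrete-time SIR epidemic model (all parameter values are hypothetical, chosen only for illustration) shows how a modest change in the assumed transmission rate produces a very different projected peak:

def sir(beta: float, gamma: float, s0: float, i0: float, days: int):
    # Discrete-time SIR model over population fractions.
    # beta: transmission rate per day; gamma: recovery rate per day.
    s, i, r = s0, i0, 1.0 - s0 - i0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return peak, r  # peak infected fraction, cumulative infected fraction

# Hypothetical parameters: reducing the assumed transmission rate by 25%
# (a stand-in for mitigation, or simply for estimation error) changes the
# projected peak substantially, illustrating sensitivity to uncertain inputs.
for beta in (0.30, 0.225):
    peak, total = sir(beta=beta, gamma=0.10, s0=0.999, i0=0.001, days=300)
    print(f"beta={beta:.3f}: peak infected {peak:.1%}, eventually infected {total:.1%}")

Such toy models cannot predict a real epidemic, but they do show why small errors in estimated inputs can dominate the conclusions, and why situational awareness in real time matters.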

A caveat is needed here: The preceding principles can be used effectively by people who deeply understand the fields in which they are working—and who are also willing to work well with colleagues who have a better understanding of other areas. In the absence of such knowledge and willingness, the principles are likely to be badly misapplied. Humility is a virtue in this regard.


About the Author:

Peter G. Neumann is Chief Scientist of the SRI International Computer Science Lab, and has moderated the ACM Risks Forum since its beginning in 1985. He is grateful to Prashanth Mundkur and Tom Van Vleck for helping considerably enrich the holistic perspective in this column.
