However, my real commitment to, and enthusiasm for, the field came when I met Professor Bent Natvig from the University of Oslo. He was passionately interested in how to analyze the reliability and risk associated with technological systems. It was not as easy to calculate probabilities for these systems as for games. Nevertheless, one could assess reliability and risk. The key was to create mathematical models of the systems and their components. By specifying the failure probabilities of the individual components, one could compute the reliability of the entire system. Natvig led me into the research world. He was concerned with developing new theory and new methods that could improve the understanding and calculation of the reliability of these systems. Previously, the models had been based on the assumption that the system and its components either worked or did not work. Natvig's research involved expanding the analyses to also cover situations where the system and its components could have more than two states, for example reflecting partial functioning. It was extremely exciting to participate in this work.
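To make the idea concrete, here is a minimal sketch of how system reliability can be computed from component failure probabilities in the classical binary setting the paragraph describes. The system layout and all numerical values are hypothetical, chosen only for illustration, and the components are assumed to fail independently; this is not Natvig's method, just the textbook series/parallel calculation.

```python
# Illustrative sketch (hypothetical system and values): computing the
# reliability of a simple series-parallel system from component
# reliabilities, assuming independent binary (work/fail) components.

def series(reliabilities):
    """A series system works only if every component works."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(reliabilities):
    """A parallel system works if at least one component works."""
    q = 1.0
    for ri in reliabilities:
        q *= (1.0 - ri)  # probability that all components fail
    return 1.0 - q

# Hypothetical example: two redundant pumps in parallel,
# feeding a single valve in series.
pump_reliability = 0.90
valve_reliability = 0.95

system = series([parallel([pump_reliability, pump_reliability]),
                 valve_reliability])
print(round(system, 4))  # 0.9405
```

Note how redundancy raises reliability: the parallel pump pair is more reliable (0.99) than either pump alone. The multi-state models mentioned above generalize this by letting each component take intermediate states between "working" and "failed".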

I was barely 25 years old, and I did not ask questions about the assumptions for the calculations and analyses. I was not concerned that the models used could be based on incorrect assumptions or that the component probabilities were based on more or less relevant data. Although the analyses dealt with reliability and risk, the subject area of the work was in reality strictly within mathematical statistics.

It was not until the 1990s that I became seriously concerned with questions that challenged the assumptions and frameworks of the analyses. By then, I had worked for a few years with practical problems in industry. I discovered that, in the real world, mathematics and the calculus of probability did not quite suffice. The calculations could be comprehensive and elegant, but what if they overlooked important risk sources?

In a way, this realization – that it was necessary to look beyond the numbers – was sad; my subject and discipline were not sufficient to answer the central questions related to the understanding, communication, and handling of risk. On the other hand, it opened up new and interesting perspectives. If I was to say something wise and useful about understanding, communicating, and managing risk related to practical issues, I had to expand my thinking and approach to risk.

I became interested in finding out what other professionals had thought and done. I quickly realized that my thoughts and concerns were not unique and original. Many other professionals and experts had pointed out the limitations of mathematics in this context. In particular, I noticed that there were many professionals with social science and psychological backgrounds who were critical of the technical-mathematical approach to risk. They pointed to the models’ weaknesses and the need to more strongly emphasize human and organizational aspects. Research was conducted which showed that people perceived many risks quite differently than the experts did.

I understood much of the criticism leveled against the technical-mathematical approach, but should we just throw this approach overboard? Was it not better to try to develop it further, improve it, and face the criticism? Yes, and many professionals and researchers have been concerned with just that: to combine technical-mathematical methods with ideas and perspectives from social science studies and psychology.

Central to this work is so-called integrative thinking. A good example of such thinking can be found in the understanding of what risk really is. The technical-mathematical approach gave the concept of risk a numerical interpretation that was unable to capture important risk aspects. The perspectives from the social sciences and psychology, on the other hand, emphasized uncertainty and the perception of risk. The differences in thinking between these areas gave rise to tensions and discussion. However, unifying, integrative solutions have been found: risk should not, in general, be seen as a number. Accident statistics and probabilities provide measures of risk, but these measures can be better or worse at describing it. Uncertainty about what will happen is a more fundamental aspect of risk than probability. This recognition opens the way to a new understanding of risk, one that unites the different perspectives. People's perceptions of risk may have their origin in this uncertainty, which may explain why, in many cases, they do not share the experts' understanding of risk, based as it is on the probabilities derived in the risk assessments.

With such an expanded understanding of risk, where uncertainty is a central aspect, surprises and the unforeseen will also find their place. This was a subject area that I never addressed during my studies. Nor was it an issue we discussed when I worked with risk assessments in the 1990s in industry. Within safety research and the quality profession, however, there were many who were concerned with the challenges that surprises and the unforeseen represented when it came to avoiding accidents and losses. I became familiar with this literature only after Nassim Taleb published his book on black swans in 2007.2 This book was very inspiring. The metaphor ‘black swan’ explained in an easy-to-understand way what the topic of surprises and the unforeseen means.

Risk analysts and researchers have provided some answers on how to meet this challenge. One is to emphasize robustness and resilience. Events happen that surprise us. We can plan on the basis of what has happened in the past and do our analyses, but they are not perfect. We must acknowledge this and develop ourselves, our systems, and our organizations, so that we are able to continue to function well when such events occur.

2 Taleb, N.N. (2007) The Black Swan. Penguin.

Another answer to the challenge posed by possible surprises is to improve the risk assessments themselves. This is both necessary and achievable. It is not enough to focus on probabilities. The knowledge on which these analyses are based can be more or less strong, even wrong, and surprises arise relative to this knowledge. This aspect of risk has not been given enough weight in risk assessments. It is now a central topic in risk research, and new and improved approaches and methods are being developed.
