
Rethinking testing and risk within complex systems – the Swiss Cheese Model

Andrew Brown

Testing a simple system is relatively straightforward. We can usually identify a list of requirements or expected behaviours, then develop test scenarios to verify the system behaves as expected. Similarly, addressing failure within a simple system is often also straightforward. We can usually trace the failure back to a single primary cause, which we label as a defect. We can then fix the defect and hence address the failure.

However, testing a complex system, or addressing failure within it, is a fundamentally different problem. There are usually too many possible combinations to test every pathway through the system. In addition, failure cannot usually be traced back to a single primary cause; instead, it arises from the interaction of several contributing factors. Moreover, none of these factors can properly be labelled as a defect that we can fix, because each forms an integral part of normal operations within the system.

Therefore, we need to rethink how to avoid failure within complex systems, and how to test those systems to find, comprehend and communicate issues, so that we can keep these complex systems both functioning and safe.

In this session, we use an example from commercial aviation to explore how complex systems work and, more importantly, the conditions under which they are prone to failure. We learn 10 principles of how complex systems fail, and we learn ways in which our current approach to testing complex systems is fundamentally flawed.

Finally, we explore an alternative model for managing the testing and operation of complex systems – the Swiss Cheese Model. We show how we can use this model to change the way we think about complex systems and their primary causes of error. We then show how you can use this model to map out a different approach to understanding and managing the risks inherent within complex systems.
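The core intuition of the Swiss Cheese Model is that each defensive layer (design reviews, automated tests, monitoring, operator checks) has holes, and harm reaches the user only when holes in every layer line up. As an illustrative sketch only – not material from the session – the Monte Carlo simulation below treats each layer's holes as an independent probability (the names and figures are assumptions chosen purely for illustration):

```python
import random

def reaches_user(hole_prob: float, n_layers: int, rng: random.Random) -> bool:
    """A hazard causes harm only if it finds a hole in every defensive layer.

    hole_prob: assumed probability that a given layer fails to stop the hazard.
    """
    return all(rng.random() < hole_prob for _ in range(n_layers))

def failure_rate(hole_prob: float, n_layers: int,
                 trials: int = 100_000, seed: int = 42) -> float:
    """Estimate, by simulation, how often a hazard penetrates all layers."""
    rng = random.Random(seed)
    failures = sum(reaches_user(hole_prob, n_layers, rng) for _ in range(trials))
    return failures / trials

# Each layer is individually leaky (10% chance of missing a hazard),
# yet stacking imperfect layers drives the end-to-end failure rate down:
single = failure_rate(hole_prob=0.1, n_layers=1)  # roughly 0.1
triple = failure_rate(hole_prob=0.1, n_layers=3)  # roughly 0.001
```

The sketch also hints at why post-incident analysis in complex systems resists single-cause explanations: when a hazard does get through three layers, all three "holes" contributed, and no one of them alone is the defect.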


  • Preventing failure in complex systems is fundamentally different from preventing failure in simple systems
  • Complex systems are hazardous by nature and cannot be protected in the same way we protect simple systems
  • Complex systems fail according to 10 identifiable principles