Artifacts of Nuclear Safety Culture – Design

In my previous article on nuclear safety culture, I gave a few examples of artifacts of nuclear safety culture. Let’s take a closer look at these artifacts and at their relation to the basic underlying assumptions. But before that, let’s recall the definition of artifacts, as presented by Professor Edgar Schein in his book Organizational Culture and Leadership: “At the surface is the level of artifacts, which includes all the phenomena that you would see, hear, and feel when you encounter a new group with an unfamiliar culture. Artifacts include the visible products of the group, such as the architecture of its physical environment; its language; its technology and products; its artistic creations; its style, as embodied in clothing, manners of address, and emotional displays; its myths and stories told about the organization; its published lists of values; and its observable rituals and ceremonies.”

What would be the specific artifacts of nuclear safety culture in design? First of all, the basic design of a nuclear power plant does not necessarily reflect the safety culture of the operating organization; it is more indicative of the safety culture of the designer (vendor) organization and of the regulatory authority of the country of origin of the design. Of course, nuclear power plants are designed in accordance with technical standards and requirements, most of which are very prescriptive and leave little room for interpretation. However, these technical standards and requirements, which are themselves an example of artifacts, are established by committees of specialists and evolve over time, drawing on construction and operating experience, the results of relevant research activities and technological developments; on this basis, design upgrades for improved safety and reliability are continuously identified.

An operating organization’s openness to seek and implement design improvements is a strong indication of its safety culture, as well as of its risk aversion. New design features and upgrades installed proactively to improve nuclear safety, on the initiative of the operating organization, demonstrate a healthy commitment to the continuous improvement of nuclear safety. Such improvements may also be implemented as a result of external pressure, imposed by the regulatory authorities or strongly recommended by peer reviews, but in that case they do not carry the added significance for safety culture.

An operating organization’s participation in owner group initiatives and research programs for improving nuclear safety, its effective use of operational experience feedback, its financial budget for investments in safety upgrades, the completeness and accuracy of its design basis documentation, as well as the competence of its own engineering and technical support personnel are all manifestations of the traits of a healthy nuclear safety culture and leading indicators of its safety performance in the area of design. In particular, a preoccupation with improving human factors engineering in design is also a positive indication of the operating organization’s safety culture.

To illustrate how design artifacts reflect basic underlying assumptions, we can use a famous example of a design feature that proved its value in reducing the consequences of a nuclear accident: “Cockcroft’s Folly”. Sir John Douglas Cockcroft, the director of the Atomic Energy Research Establishment of the UK in the 1950s, insisted that the chimney stacks of the Windscale plutonium production reactors be fitted, at great expense, with high performance filters. Because the improvement was requested after the stacks of the Windscale reactors had been designed, the retrofit gave the chimneys a distinctive lumpy shape. The filters were called “Cockcroft’s folly” because other engineers involved in the project thought they were not necessary. But what at that time seemed like an excess of caution ultimately demonstrated its utility when Windscale Pile No. 1 caught fire in 1957: the filters greatly limited the releases of radioactive material (by approximately 95%), thus significantly reducing the radiological consequences for the population and the environment.

With his conservative attitude and his determination in promoting an important design safety feature, Sir John Douglas Cockcroft represents a notable example of leadership in the area of accident prevention and mitigation. His basic underlying assumptions about the necessity and utility of the design improvement came from his scientific and technical background, from his understanding of the reactor design, of its possible modes of failure and of their consequences, as well as from his preoccupation with safety.

The basic underlying assumptions of those who get to decide on the design of a nuclear power plant are shaped by their scientific and engineering knowledge and their own experience; by their use and understanding of technical standards, of failure modes and effects analyses for systems and structures, of deterministic and probabilistic nuclear safety analyses, hazard analyses and operating experience; by their own risk aversion; and by their perception of what is credible and what is acceptable. In order to standardize the design process and assumptions, we use technical standards, but these are not fully harmonized at the international level and differences still exist.

Dr. Schein’s model of organizational culture is dynamic: if we try to understand it, we will see not only how basic assumptions can lead to espoused values and generate artifacts, but also how the artifacts can influence the basic assumptions.

In order to understand and promote nuclear safety culture in all nuclear power plant activities, including those pertaining to design, we need to understand “nuclear”, we need to understand “safety”, we need to understand “culture” and the connections between these concepts. Using Dr. Schein’s definition of culture (“a pattern of shared basic assumptions learned by a group as it solved its problems of external adaptation and internal integration, which has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems”) we can try and think of the shared basic assumptions of reactor safety design, of how they evolved over time and of how they are reflected in the documentation, the systems and the organizations we work with. In order to make sure these basic assumptions are correct, known, understood and used by all relevant personnel to support and improve nuclear safety, we need to clearly identify the design standards used, the assumptions used in the safety analyses and the bases for these assumptions, and also to rely on competent engineering personnel with suitable knowledge, qualifications, experience and attitudes, including in the application of human performance tools for engineers and other knowledge workers (practices for anticipating, preventing, and catching in-process errors). Only in this way can we demonstrate, as nuclear professionals, that we recognize and treat nuclear power as special and unique.

To conclude, if we want to better understand the nuclear safety culture of an organization that operates nuclear power plants, we can look into many areas – design, operation, maintenance, training – and try and see how the artifacts that we see correspond to the espoused values and what the alignment between the two can tell us about the basic underlying assumptions.

In the area of design, we may look at the following aspects (the list is not exhaustive):

  • Design upgrades implemented and/or initiated in the last 10 years to improve nuclear safety, including those needed for increased protection against severe accidents; use of deterministic and probabilistic safety analyses to identify and design safety improvements;
  • Completeness of the design basis documentation; design basis reconstitution activities and reverse engineering activities;
  • Effectiveness of the design configuration management process;
  • Availability and use of current technical standards, access to operators and owner groups’ platforms for operational experience sharing, access to relevant research programs and their results, use of innovation;
  • Use of human factors engineering to improve safety;
  • Application of the redundancy, independence, diversity, fail-safe principles in the design of the safety systems; the systematic evaluation and implementation of the defence-in-depth concept;
  • Selection, training and qualification of engineering and technical support personnel; the availability, in-house, of sufficient numbers of personnel that have in-depth understanding of the design bases;
  • Cultivation of the “Intelligent customer capability”;
  • Operating experience events having design deficiencies as causes and the corrective actions implemented.
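To see why the list pairs redundancy with independence, a simple piece of reliability arithmetic helps. The sketch below is purely illustrative, not an actual probabilistic safety analysis (which would use fault trees, event trees and plant-specific data); the numbers and the simple beta-factor common-cause model are hypothetical assumptions chosen for the example.

```python
# Illustrative only: hypothetical numbers, simplified beta-factor model.
def system_failure_probability(p_train: float, n_trains: int, beta: float) -> float:
    """Failure probability on demand of an n-train redundant system.

    A fraction `beta` of each train's failure probability is assumed to be
    a common cause shared by all trains; the rest fails independently.
    """
    independent = (p_train * (1.0 - beta)) ** n_trains  # all trains fail independently
    common_cause = p_train * beta                       # one shared cause defeats them all
    return independent + common_cause

p = 1e-2  # hypothetical failure probability of a single train on demand

# Perfect independence: two redundant trains look extremely effective.
print(system_failure_probability(p, n_trains=2, beta=0.0))   # 1e-4
# With only 5% common-cause coupling, the shared-cause term dominates.
print(system_failure_probability(p, n_trains=2, beta=0.05))
```

The point of the exercise: adding identical trains drives down only the independent term, while the common-cause term is untouched, which is why independence and diversity matter alongside redundancy in the design of safety systems.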

And remember, whenever we find discrepancies between relevant artifacts (the concrete things that we see) and espoused values (the declared principles and values), we should question the basic assumptions and try and understand where these come from, how valid they are and how they can impact nuclear safety.
