Nuclear Safety - Safety Culture and Human Errors

One relatively prevalent notion in discussions of nuclear safety is that of safety culture. The International Nuclear Safety Advisory Group defines the term as “the personal dedication and accountability of all individuals engaged in any activity which has a bearing on the safety of nuclear power plants”. The goal is “to design systems that use human capabilities in appropriate ways, that protect systems from human frailties, and that protect humans from hazards associated with the system”.

At the same time, there is some evidence that operational practices are not easy to change. Operators almost never follow instructions and written procedures exactly, and “the violation of rules appears to be quite rational, given the actual workload and timing constraints under which the operators must do their job”. Many attempts to improve nuclear safety culture “were compensated by people adapting to the change in an unpredicted way”.

According to Areva's Southeast Asia and Oceania director, Selena Ng, Japan's Fukushima nuclear disaster is "a huge wake-up call for a nuclear industry that hasn't always been sufficiently transparent about safety issues". She said "There was a sort of complacency before Fukushima and I don't think we can afford to have that complacency now".

An assessment conducted by the Commissariat à l’Énergie Atomique (CEA) in France concluded that no amount of technical innovation can eliminate the risk of human-induced errors associated with the operation of nuclear power plants. Two types of mistakes were deemed most serious: errors committed during field operations, such as maintenance and testing, that can cause an accident; and human errors made during small accidents that cascade to complete failure.

According to Mycle Schneider, reactor safety depends above all on a 'culture of security', including the quality of maintenance and training, the competence of the operator and the workforce, and the rigour of regulatory oversight. So a better-designed, newer reactor is not always a safer one, and older reactors are not necessarily more dangerous than newer ones. The 1979 Three Mile Island accident in the United States occurred in a reactor that had started operation only three months earlier, and the Chernobyl disaster occurred after only two years of operation. A serious loss of coolant occurred at the French Civaux-1 reactor in 1998, less than five months after start-up.

However safe a plant is designed to be, it is operated by humans, who are prone to errors. Laurent Stricker, a nuclear engineer and chairman of the World Association of Nuclear Operators, says that operators must guard against complacency and avoid overconfidence. Experts say that the "largest single internal factor determining the safety of a plant is the culture of security among regulators, operators and the workforce — and creating such a culture is not easy".
