The Naval Nuclear Power Program (NNPP) has an enviable safety record (including statistics on miles safely steamed). The Safety Management System (SMS) appears sound and serves as a model for all communities. The major safety-program values of the NNPP are captured in its list of watchstanding principles: integrity, ownership, formality, level of knowledge, questioning attitude, procedural compliance, and forceful backup. These strong values form a solid foundation for safety and are easy to understand and to translate into a practical context.
The program emphasizes competence and character, insists on developing high levels of both, and values the credibility that results from those traits. This is paramount to maintaining the trust (and regulatory independence) required to operate nuclear power plants in home waters and abroad. Over the past 65 years, advances in technology, core design, and materials have made reactor-protection systems and nuclear-power operations increasingly safe, and continuing technological advances further improve safety. In an organization with so much rigor, the record truly speaks for itself. But can the same rigor that produces an outstanding safety record also create cultural “threats” to that record and weaken components of its SMS?
Recent incidents of cheating in the NNPP have been answered with a determination to improve exam control and refocus on integrity within the program. Similar integrity “lapses” involving Continuous Training Examination cheating on submarines and training documentation for nuclear shipyard workers have also occurred recently.1 (The U.S. Air Force has had its own challenges.2) These were serious failures indicating conscious disregard for program standards and information assurance. The accountability measures were strong and appropriate; however, an examination of the organizational and human decision-making factors contributing to these “personnel failures” is also necessary. Although the incidents may not surprise many current and former nukes, plenty are still asking, “What would motivate these professionals to make the decisions they did?” To answer that question, we need to understand “normalized deviance,” risk tolerance, and the impact of NNPP program requirements on decision making.
Influence of Normalized Deviance
In her 1997 book The Challenger Launch Decision, Diane Vaughan identified three key faults that led to the Challenger disaster: competing projects and resource scarcity, regulatory ineffectiveness, and organizational factors (specifically, failures in the organizational and professional culture).3 These faults yielded normalized deviance, instances of groupthink, and compliance problems. According to Vaughan, “Social normalization of deviance means that people within the organization become so much accustomed to a deviant behavior that they don’t consider it as deviant, despite the fact that they far exceed their own rules for elementary safety.” Research on normalized deviance in healthcare and other industries reveals that “these deviations or rule violations are rarely motivated by malice or greed, but often result from personnel feeling intense performance pressures.”4 This happens for several reasons:
• The rules are stupid and inefficient. Workers develop shortcuts and workarounds when a rule, regulation, or standard seems irrational or inefficient.
• Knowledge is imperfect and uneven. Workers know the rules exist but fail to appreciate their purpose.
• The work itself, along with new technology, can disrupt work behaviors and rule compliance. New technologies and new personnel can force workers to devise novel responses to new challenges.
• I’m breaking the rule for the good of others. Deviation is justified when the rule or standard is perceived to be counterproductive.
• The rules don’t apply to me/you can trust me. When system operators believe they are not tempted to engage in the behavior a rule is supposed to deter, the rule is perceived as superfluous.
• Workers are afraid to speak up. The likelihood that rule violations will become normalized increases if witnesses or those in the know refuse to intervene.
• Leadership withholds or dilutes findings on system problems. A supervisor may be abundantly aware of standards or rule violations but fear that if higher leadership knew about them, both the person and the unit would look bad. Furthermore, efforts to correct violations might be perceived as too time-consuming and as threatening short-term productivity losses.
The NNPP cheating cases are examples of normalized deviance. Although these failures were not cataclysmic, they demonstrate how broader program policy requirements can affect behavior, specifically by normalizing deviance and raising risk tolerance.
Hazards and Risks
Consider that in the course of a typical day, sailors and Marines make literally millions of risk decisions. From the time we wake up, we are engaged in activities that involve hazards and risks. The risk decision-making process often takes only seconds, but it can produce outcomes with significant cost to the individual and the organization. When hazards are identified and understood, individuals tend to accept or reject risk under the influence of one or more risk-tolerance factors, including the potential for fiscal, emotional, or physical profit or gain, in which perceived or actual gains (or the avoidance of punishment) increase the willingness to take risk.5 Also, when role models are observed accepting risk, especially when negative outcomes are avoided, tolerance increases in those who see it happen. Other factors such as pressure, stress, fatigue, and alcohol use can skew the decision-making process from the start; individually or in combination, they affect our ability to identify hazards and understand outcomes and can influence how much risk we are willing to take.
As mentioned previously, normalized deviance often arises in high-pressure environments. The program (and the Navy in general) has a mantra of “trust but verify.” However, a climate of continuous scrutiny, coupled with an embedded fear of punishment, may translate into an atmosphere that says “we don’t trust you.” From the time young nukes are tested and accepted into the program, they are bombarded with messages about integrity and threats of expulsion from the program if lapses occur.
In the training pipeline they endure intense pressure to perform academically in a rigorously controlled, zero-defect environment. Required study hours are assigned based on academic performance, and study is monitored and logged. This scrutiny and oversight (labeled “continuous improvement”) is consistently applied throughout a nuke’s career—in the form of recertification, level-of-knowledge checks, audits, and inspections—and reinforces a focus on finding what’s wrong.
Little time or focus is typically given to what the workforce does well. Corrective actions for failures are viewed as coercive or cumbersome, and opportunities to learn are viewed as “witch hunts.” The term critique (a detailed analysis and assessment of something) is taken in its most negative sense, and corrective actions mean additional work for an already overtasked force. Under all this pressure, the opportunity for normalized deviance to take hold clearly exists, even though the foundational value of integrity (absolute honesty, trustworthiness, and reliability in nuclear-power-plant training, qualification, operation, and maintenance) is continuously communicated. The exam-cheating incident is a case in point.
Factors to Ponder
First, consider the requirements of a typical day under way or in home port: watchstanding, preventive and corrective maintenance, drills and casualty response, additional qualifications, continuing training requirements, damage-control training, General Military Training, collateral duties, physical readiness, and perhaps a bit of time for personal pursuits and sleep.
Then consider the time required to study all the potential exam material that could be tested. In the context of normalized deviance, average nukes perceive this as unreasonable. Perceptions can develop that the exam is just a wicket, something to “get by” so one can study for the “real” examination contained in the oral-board process. (This becomes especially problematic when exam content focuses on irrelevant material.) As a result, they may decide to develop shortcuts to learning; “gouge” used to reduce the potential content to a “digestible” size is one example. Compounding this are the additional requirements that exam failures generate: the person failing the exam is assigned an upgrade that requires additional study, and the staff and leadership end up doing the documentation and counseling that failures require. So although the hazards and outcomes of cheating are understood, excessive competing demands, perceived pressure, and fear of punishment may drive nukes to increase their risk tolerance to ensure passing grades on exams.
Normalized deviance was identified as early as 2011, when former submarine nukes explained that the exams had become “so difficult that they have little to do with the skills sailors actually need” and that “It’s so common a practice that people don’t even realize that it’s wrong.”6 From discussions with current nukes, I have learned that the NNPP is taking an introspective look at some of the organizational factors that contributed to the cheating. Beyond the acknowledgment that exam security was weak, many other contributing factors went unmentioned in the media reports that highlighted the individual failures and accountability. The program does recognize a need to ensure “meaningful work” and provide “pathways to success” while evaluating “Building Up” and “Tearing Down” forces that can affect personal integrity. This is encouraging, but NNPP leadership would be wise to take inventory of the existing safety culture and identify where organizational pressures influence sailor risk tolerance. As covenant leaders, we should fully understand how program policies can lead to normalized deviance and ensure that communication and meaningful supervision are effective (but not overbearing) in helping to prevent poor risk decisions.
Operator Commitment to Operational Excellence
NNPP leaders could take several leadership and policy approaches to reestablish nuclear operators’ commitment to excellence. First, leverage the program’s strong learning and informed cultures and conduct fleetwide focus groups to solicit feedback on where normalized deviance occurs or is most likely to occur in the future. (Administrative processes and requirements appear to be the most likely areas.) In 2012, then-Director of Naval Nuclear Propulsion Admiral Kirkland H. Donald sent a memo to NNPP leadership with the subject “The Leadership Challenge of Fostering a Culture of Integrity.”7 In it he referenced an article titled “Why We Lie” and challenged those leaders to
Review the enclosed article and include your assessment of applicability to your command in your next periodic letter to me. I am particularly interested in your strategy and techniques for maintaining a climate that helps honest people behave honestly in the challenging business of naval nuclear propulsion operations.8
The techniques gathered in response should be resocialized across the community and into the education pipeline (especially for prospective reactor officers, chief engineers, and bull nukes), and another round of leadership assessment should occur.
Second, reevaluate the written-examination process and its intent. We should be confident that an ongoing critical review of exam controls will help mitigate increased risk tolerance. Given operators’ susceptibility to normalized deviance, leadership must better communicate that exam controls are not meant to signal a lack of trust. Rather, they protect the integrity of the individual and of the program, a key component of maintaining trust and credibility. In addition, consider incentives for positive exam performance. The negative outcomes are well understood, but how well is demonstrated competence on written examinations acknowledged? Exam policy should be reviewed and adjusted to ensure content is relevant to the knowledge, skills, and abilities required for the associated watch station.
Many nukes remain frustrated with theoretical knowledge requirements that have no bearing on their ability to maintain and operate propulsion plants. In many cases, enlisted nukes feel they are required to master theoretical concepts they will never use. They know what theory is applicable and relevant to their jobs and what is not. Exam questions should be reviewed for relevancy. Efforts to develop a “force” exam bank appear to be a good start. Also, with the wide range of level-of-knowledge assessment techniques used (observed drills and evolutions, audits, interviews, oral boards, run time, etc.), exams should be reviewed for their utility, and consideration should be given to shedding exams that do not serve a practical purpose.
Third, consider alternatives to reduce workload demands and competing motivators. For example, explore consolidating all nukes into a single Nuclear Operator rating with a maintenance or operator track. This approach could significantly reduce the number of requirements nukes must currently fulfill. The model already exists at Prototype and in the civilian nuclear-power industry.
Fourth, as ethics professor John Banja explains:
Remediating the normalization of deviancy begins with leadership’s requiring system operators to consistently renew their commitment. A powerful way of enabling that commitment is for leadership to model it and to foster an organizational environment that eradicates, as much as possible, factors that sustain rule and standards violations.9
NNPP leaders should consider how the program’s inherent bureaucratic nature influences behavior. Because the cost of the worst outcome (a reactor accident and the resultant release of fission products to the environment) is so high, program risk tolerance is understandably extremely low, resulting in a heavy reliance on controls and administrative “proof” of competence (objective quality evidence). But to what degree are potential policy changes being held hostage by the specter of a reactor accident? The program’s bureaucratic construct may inhibit its culture from moving toward more productive safety cultures.10 Accordingly, NNPP leaders should use inspirational appeals to shift community pride away from shared commiseration and toward pride in accomplishment and a sense of belonging to an elite cadre of the world’s best nuclear operators and maintainers. Program leaders should reflect on how effectively they communicate core values and beliefs in ways that develop buy-in and ownership of the safety culture at the deckplate level. This is especially relevant given our consistent struggle to retain nukes, even with the incentive of huge reenlistment bonuses.
Harking Back to Rickover
Finally, consider ways to incorporate human factors more deeply into root-cause analysis. In that regard, the “father of the nuclear Navy,” Admiral Hyman G. Rickover, stated:
It appears that the human factors “program” is another of the fruitless attempts to get things done by systems, organizations, and big words rather than by people. It contains the greatest quantity of nonsense I have ever seen assembled in one publication. It is replete with obtuse jargon and sham-scientific expression which, translated into English from its characteristic argot—where this is possible—turns out to be either meaningless or insignificant. It is about as useful as teaching your grandmother how to suck an egg.11
Framed in the context of the times, Rickover’s methods and philosophy can be excused (although those in the naval aviation enterprise may disagree). However, as technological improvements made systems and operations safer, and as the corporate knowledge of hazards and risks gained through lessons learned deepened, the program’s approach to human factors should have evolved as well. NNPP leadership should leverage the Human Factors Analysis and Classification System as both a reporting and a hazard-identification tool, giving leadership a more comprehensive approach to identifying and mitigating human-factors problems.12
While this discussion has framed normalized deviance and risk tolerance as factors in the episodes of nuclear-power cheating, the concepts and recommendations apply to many other communities and units that operate in similarly high-pressure, high-standards environments. The intent here is not to bring further scrutiny to already highly scrutinized nukes. Rather, it is to offer a different perspective on (not an excuse for) why it happened. Doing the right thing applies to leadership as well. We have a responsibility to reflect on how our leadership and broader organizational factors and processes affect our people, and then to act with their best interests in mind. A recent Facebook post captured an excerpt from an undisclosed book that read: “Unless you give motivated people something to believe in, something bigger than their job to work toward, they will motivate themselves to find a new job and you’ll be stuck with whoever’s left.” We have the brainpower to make things better, but can we overcome the bureaucratic inertia to do so?
1. “Navy Kicks Out 34 for Nuke Test Cheating,” Associated Press, 21 August 2014, http://foxnews.com/us/2014/08/21/navy-kicks-out-34-for-nuke-cheating/; “Navy Probes Cheating on Submarine Nuclear Exams,” Associated Press, 16 November 2011, http://cbsnews.com/news/navy-probes-cheating-on-submarine-nuclear-exams/; Mike Hixenbaugh, “300 Restricted for Nuke Training Lapses at Naval Shipyard,” Virginian-Pilot, 1 October 2014, http://hamptonroads.com/2014/09/300-restricted-nuke-training-lapses-naval-shipyard/.
2. ADM John C. Harvey, USN, The Independent Review of the Nuclear Enterprise, November 2014, http://blog.usni.org/2014/11/25/the-independent-review-of-the-nuclear-enterprise.
3. Diane Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA (Chicago: University of Chicago Press, 1997).
4. John Banja, “The Normalization of Deviance in Healthcare Delivery,” Business Horizons, via PubMed Central, National Library of Medicine, National Institutes of Health, www.ncbi.nlm.nih.gov/pmc/articles/PMC2821100/.
5. Dave Fennell, Strategies for Understanding and Addressing Risk Tolerance (ExxonMobil Human Factors Center of Excellence, January 2011), https://safety.cat.com/cda/layout?m=677315&x=7&f=880098.
6. “Navy Probes Cheating on Submarine Nuclear Exams,” Associated Press, 16 November 2011, http://cbsnews.com/news/navy-probes-cheating-on-submarine-nuclear-exams/.
7. ADM K. H. Donald, “The Leadership Challenge of Fostering a Culture of Integrity,” Memo, 2012.
8. Dan Ariely, “Why We Lie,” Wall Street Journal, 26 May 2012, www.wsj.com/articles/SB10001424052702304840904577422090013997320.
9. Banja, “The Normalization of Deviance in Healthcare Delivery.”
10. Patrick Hudson, Safety Culture—Theory and Practice (Centre for Safety Science, Universiteit Leiden, The Netherlands, 1 January 2001), www.dtic.mil/cgi-bin/GetTRDoc?AD=ADP010445.
11. VADM Jerry Miller, USN (Ret.), “Rickover in Print,” U.S. Naval Academy Alumni Association, www.usna.com/NC/History/SeaStories/1942/Rickover.htm.
12. Scott A. Shappell, The Human Factors Analysis and Classification System—HFACS (National Technical Information Service, February 2000), www.nifc.gov/fireInfo/fireInfo_documents/humanfactors_classAnly.pdf.
Command Master Chief Kingsbury is stationed at the Naval Safety Center in Norfolk, Virginia.