Throughout their most basic training, prospective division officers are repeatedly taught they must be persons of integrity. The lesson is driven home and rehashed to the point that it borders on cliché. Outside the training environment, however, integrity is discussed less often. Aside from the occasional indoctrination speech, it generally arises in the wake of something having gone wrong. It is almost never broached in a positive context. It would be unusual, for example, to cite someone’s integrity in an award write-up or a fitness report. It is simply assumed as a baseline qualification for the job.
Integrity is treated as a binary test that one either passes or fails by virtue of his or her intrinsic moral fiber. One is either trustworthy or not. The truth is that integrity is far more complex. The average sailor faces tests of integrity daily. Competing demands, strained resources, and a culture that disproportionately rewards mission accomplishment make it hard for good people to be consistently honest. To overcome this challenge, leaders at all levels have a responsibility to make integrity as easy as possible.
Flawed Two-Color Model
Before reporting to a ship, most sailors or junior officers experience some form of integrity training. This usually includes at least one historical case study where a plainly dishonest act (such as gundecking maintenance or falsifying test results) resulted in disaster, reinforcing the lesson that even trivial untruths can have grave consequences.
Another common scenario is the hypothetical moral dilemma, in which the service member faces a difficult decision. The trainee is asked to imagine that a good friend is cutting corners in maintenance. Should the transgression be reported even though the friend will be punished, or should the secret be kept to protect the friend? Scenarios like this are clear-cut. The correct answer is rarely difficult to identify. The problem is that these cherry-picked scenarios bear little resemblance to reality, where the correct answer is not just hard to choose, but difficult to recognize.
True tests of integrity reside in the gray areas between right and wrong, on a “slippery slope” where prudent management can devolve into dishonesty.1 We get the ship under way by stressing the definition of compliance with myriad requirements . . . which can devolve into gundecking. We provide focused study topics before inconvenient written exams . . . which can devolve into cheating. We handle problems at the lowest level possible . . . which can devolve into cover-ups. In every case, the right answer is not as clear-cut as we would like to imagine. Two service members, both in full possession of integrity, may choose different “hard lines” as to what is acceptable versus unacceptable behavior.
We can strive to stamp out gray areas through more specific regulations. While this approach is appropriate to a point, it is impossible to define every conceivable dilemma in a manual. In a complex environment, gray areas always will exist. The U.S. Navy requires thinking leaders who can make decisions where the regulations are insufficient or unclear.
Even when the rules are defined, conflicting priorities and limited resources often require us to stress the rules to their limits. Regulations and procedures sometimes do not change fast enough to keep up with the needs of the fleet. Just as complex machines require lubricants to compensate for imperfections in design, ventures onto integrity’s slippery slope are a lubricant that keeps the gears of our organization turning in spite of bureaucratic flaws. Rather than avoiding the slippery slope or ignoring its existence, we must carefully navigate it to succeed.
These complexities ensure that black-and-white views of integrity do not last long on the deckplates. For an individual who has been taught that integrity is a simple matter that one either possesses or lacks, it is dangerously easy to dismiss that teaching as overly simplistic and out of touch, and to replace it with a cynical and relativistic ethical system. For someone ill-equipped to discriminate between various shades of gray, a two-color worldview can become supplanted by a one-color one, where right and wrong become meaningless and everything is gray.2 Complicating the matter is the phenomenon of “ethical numbing,” where repeated legitimate or perceived compromises of integrity can lead an individual to slowly adjust his or her benchmark for acceptable behavior.3
Integrity Is a Standard
A more accurate and durable model of integrity is to consider it a standard. Formality, cleanliness, administrative accuracy, and physical fitness are all examples of standards that must be defined, adopted, and maintained. A ship’s standards are largely a function of what her leaders are willing to communicate and enforce. Individual crew members have their own standards, which naturally adjust to approximate those of their leaders. Integrity is no different. Every ship has cultural norms for acceptability in stretching the truth, and what may be a common practice on one ship would be unacceptable on another.
The challenge in adopting a nuanced view of integrity over the simplistic, black-and-white worldview is accepting that some degree of imperfection exists and is tolerable. No individual or ship has perfect integrity. This is a difficult thing to admit in an honor-centric organization; it is tantamount to an admission of compromise where we want to believe that we are uncompromising. It is, however, the reality and should be understood and managed. Rather than agonizing over the possibility that our systems and our people are flawed, we should be asking what we must do to improve.
At the same time, we cannot “downgrade” integrity to the same level of importance as the myriad other standards. Integrity is the foundation on which the whole organization is built, and where integrity is deficient the other standards become irrelevant. By treating integrity as a standard, we can grant it the degrees of nuance and complexity it deserves, in the interest of preserving it against the harsh reality of the deckplates. A binary view will invariably fail. The great thing about standards, by contrast, is that we can improve them.
Raising the Bar
Without deliberate and consistent attention, the natural trend of any standard is to decline. When we correct a problem that was previously ignored, the standard improves. When we choose not to address a problem, the standard falls. We should never willfully allow standards to decline, and we should consistently strive for improvement. While we may not be able to fix the whole Navy, we can improve the standards of integrity within our immediate vicinity.
Inspect Thoroughly and Often. The purpose of inspection is not just to find problems, but also to communicate priorities. Every time a sailor performs a time-consuming task that nobody seems to notice or care about, a little voice says, “That was a waste of time.” Over time these voices can accumulate into a maddening chorus, urging even the most honest sailor to cut corners. When we take the time to inspect our sailors’ work, we provide an opposing view: “This is important, and it needs to be done right.” In the January 1981 Proceedings, then-Lieutenant Jim Stavridis explained, “A man will do surprising amounts of work based on the perception that he is ‘someone special’ doing a tremendously difficult job—but he will not do it or continue to do it unless he knows someone is aware of the job and appreciates it.”
To be effective, inspections should exceed the minimum requirements. If we look at only those areas mandated by some administrative program, we are not communicating a genuine personal interest. Implicit in the duties of a division officer is a regular presence in the spaces, observing maintenance, attending training, scrutinizing paperwork, and checking cleanliness and stowage. Inspections are more important in the long run than any administrative tasking, and inspections cannot be done from a stateroom.
When conducting inspections, it is important to enter with the assumption that our people are good and want to do what is right. This is the “trust” part of the classic adage “trust but verify.” Being realistic does not mean being a cynic, and an assumption of guilt is crushingly demoralizing to the honest sailor. Inspections should not communicate “gotcha.” They should communicate “this is what ‘right’ looks like.” Performance trends toward expectations, so we should expect integrity and then validate that assumption.
Learn How to Communicate “Can’t Do.” In an organization built on an ethos of mission accomplishment, the words “can’t do” are among the most difficult to say. They are also among the most important. They signal to higher echelons that we have not been adequately resourced—in terms of manning, training, parts, maintenance, or time—to accomplish the job we are assigned to do.
The difficulty with communicating “can’t do” in a mission-focused organization is the fear that an inability to achieve equates to a personal leadership failure. Leaders worry that if some contemporaries manage to “get the job done” under similar circumstances, their own inability to achieve will reflect poorly on them. This creates a powerful incentive to bend rules, hide problems, or otherwise extend one’s position further down the slippery slope in the name of mission accomplishment. But the job can either be done safely and correctly without cutting corners, or it cannot. It is the leader’s moral imperative to communicate truth, regardless of the consequences.
For naval leaders, infused with the drive to achieve, “can’t do” may be even harder to hear than it is to say. The trouble is that sailors tend to take great pride in their work ethic and may be unwilling to state that a job cannot be done safely or correctly within the expected timeframe. The signal is likely to be buried under the common noise of complaining or pessimism, and an insufficiently discerning leader may dismiss it as only noise. Such a leader may believe he is motivating his people to achieve, when in fact he is indirectly asking them to compromise their own integrity to achieve the desired results.
Decline Plausible Deniability. Many a naïve ensign is led astray by the notion that there are some rocks best left unturned. Perhaps a job is not being done to a written standard, or a problem is being quietly addressed at an inappropriately low level. When the inquisitive division officer is told “you don’t want to know,” the implication is that some transgression has occurred, and if the division officer were to become aware of it, he would be obligated to intervene and correct it. By declining knowledge of the problem, he can decline the responsibility to act.
But responsibility cannot be declined or delegated. By allowing his subordinates to assume the moral burden of a secret, the willfully ignorant leader is effectively asking his subordinates not only to break the rules, but also to assume all the risks in doing so. If the secret were later revealed, the ignorant leader would be free of culpability in whatever transgression occurred, while his subordinates would be held responsible.
To be clear, it is necessary and expected that problems are handled at the lowest level possible, provided the chain of command remains informed as appropriate. The important thing for division officers to understand is that the words “you don’t want to know” are an alarm signal that merits investigation. While it is morally questionable at best to decide not to communicate a problem to higher echelons, it is morally repugnant to saddle one’s subordinates with the responsibility for that decision.
Integrity is not a simple matter of black and white. It is an enormously complex and dynamic human quality, shared across individuals and organizations, whose landscape is fraught with perilous gray areas and slippery slopes. These are the consequence of imperfect systems with limited resources and high demands. As leaders, it is our responsibility to navigate integrity’s gray areas and slippery slopes in the interest of our people as well as the mission we ask them to accomplish.
1. Ann E. Tenbrunsel and David M. Messick, “Ethical Fading: The Role of Self-Deception in Unethical Behavior,” Social Justice Research, vol. 17, no. 2 (June 2004), 223–36.
2. Eliezer Yudkowsky, “The Fallacy of the Gray,” LessWrong Blog, 7 January 2008, http://lesswrong.com/lw/mm/the_fallacy_of_gray/.
3. Albert Bandura, “Moral Disengagement in the Perpetuation of Inhumanities,” Personality and Social Psychology Review no. 3 (1999), 193–209.