A common trait among surface warfare officers (SWOs) is that we tend to learn (or stumble into) the same lessons over and over again. Occasionally the Naval Safety Center releases a sanitized report on an actual collision or grounding, and these reports are useful to be sure, but it is no secret that for every Class A mishap there are several smaller close calls that, if captured, could provide lessons to help the rest of us prevent the next mishap.
According to a Government Accountability Office study—referenced in a message from the Commander, U.S. Fleet Forces Command—more than 95 percent of lower-level mishaps go unreported.1 Why is that? I submit that the surface warfare community—unlike the aviation and submarine communities—has a culture in which lessons are kept within the lifelines whenever possible. Surface warriors do not like to admit mistakes or shortcomings, much less make formal reports of mishaps or damage that fall below the reportable threshold.
Signs of Progress
While some communities are better than others, the U.S. Navy often unwittingly promotes a lack of openness when it comes to lessons learned. Years ago while serving on a major exercise staff, I observed the aftermath of two ships colliding along their sterns during a small-boat transfer. One ship was British, the other U.S. Navy. The next morning, when I briefed the Royal Navy admiral in charge of the task force, he was shocked to learn that the U.S. Navy ship was headed into port for an investigation, while the British vessel was on station ready for the day’s training. I explained that there had been a minor collision, and that in any such matter it was normal that the training be stopped and an investigation held. His reply was, “It’s a bump. If you drive warships, you bump into things occasionally—you learn from it and move on.” He then confided that he had suffered a collision as a first-tour commanding officer, resulting in his receipt of a letter from the First Sea Lord that began, “You became a better naval officer today.”
In the U.S. Navy, we usually learn about incidents when a commanding officer is relieved and the news appears on Sailor Bob or in Navy Times, and beyond the perfunctory “loss of confidence” statement, the reasoning is rarely made public. It is unfortunate that the underlying issues often are lost to posterity. If we openly shared the lessons and root causes from smaller events, along with a redacted version of what led to the larger incidents, perhaps there would be fewer of them.
The surface warfare community has shown some encouraging signs of progress. The Surface Warfare Officer School (SWOS) seamanship team, for example, recreated the USS Porter (DDG-78) collision in the simulator so every prospective department head could watch and learn. Captain Steve Coughlin, in his excellent January 2015 Proceedings article “Overconfidence Can Be Crushing,” covered the leadership aspects of that important event.2 Lessons learned also are formally evaluated and shared within the community in the maintenance arena, where Carrier and Surface Team One hold formal lessons-learned conferences each quarter to discuss surface ship maintenance issues. The results are published and consolidated on a shared website. The Naval Surface and Mine Warfighting Development Center also captures and shares tactical lessons—the recent missile defense of the USS Mason (DDG-87) off Yemen comes to mind—with the community. Broader application of ComNavSurfLant Instruction 3040.1—which directs the reporting of near misses—provides a good foundation on which to build.3
These are bright spots, but culture change takes a generation. Now would be a great time to start.
Identify the Root Cause
Learning from a close call or mishap begins and ends with a genuine desire to identify and correct the root cause. For a skill so fundamental, there are few analytical tools available to lead one to this “holy grail.” Ask any senior naval officer if he knows how to determine the root cause, and he will say yes. Ask him how he does it, and he will say something like, “It’s too obvious to explain,” which means “I don’t know.” The method I have found most helpful is called the “Five Why” analysis. You start with the question “Why did this happen?” and for each answer to that question, you again ask, “Why did THAT happen?” A small boat striking the pier, for example, might trace from an approach made too fast, to a fatigued coxswain, to a watchbill stretched thin by an undermanned deck division. The theory is that when you have driven the “Why” analysis five deep, you have likely arrived at the true root cause. Only then can some corrective action be taken.
In the aviation community, the hazardous report (HazRep) process encourages squadrons to investigate possibly hazardous practices or conditions and share them with all concerned parties. In the nuclear community, Admiral Hyman G. Rickover established a system of incident reporting whereby events that could have resulted in damage or injury are investigated by the ship, a root cause analysis is performed, and the results are shared with the entire nuclear fleet. In both these communities, formal training is required on events that occur on other ships or squadrons, with the goal of learning from each other’s mistakes. The expectation is that aggressive action to address less significant issues can mitigate or eliminate the factors that could lead to larger ones. Perhaps there is a place for such a program in the surface Navy today.
During my commander command tour, following an injury to a sailor in a boat accident, a Safety Investigation Board was convened. The surface warfare officers (this one included) were a bit leery of this process, having connected the dots in the past between “Scrubbed Safety Mishap” reports—which are nonpunitive by nature and do not mention ships or names—and punitive action tied to the same event.
When I consulted the air detachment officer in charge (OIC), he informed me that he had been through several such events in his career and that the goal always was to get to the root cause and correct it in the interest of safety. In his eyes, there was only good to be had from the process. In this case, we discovered many deficiencies in both procedural compliance and human factors, such as fatigue and task saturation. By freely discussing the consequences of our mistakes, we were able to learn from them and prevent a recurrence.
Appropriate Accountability
Accountability is a basic tenet of our profession. Are we, as naval leaders, not to be held accountable for our decisions? Again, ask a junior naval officer what it means to be held accountable, and he quickly will answer, “To get fired.” When it comes to holding someone appropriately accountable, there must be a difference between the treatment of malfeasance and omission, between negligence and a well-intentioned mistake.
Actions speak louder than words. A review of existing processes yields several from other communities that could be leveraged. A few possibilities include:
Aviation. In the aviation community, junior officers are tasked with authoring at least one safety article per deployment for submission to Approach magazine. In this nonattribution format, “near-miss” events are described in a very personal manner—by the author—with lessons that he or she has learned.
Nuclear Power. Nuclear operators have a formal process for sharing near-miss information, with specific criteria, in Naval Reactors Technical Bulletins and incident reports. These are nonattribution and are required reading for every nuclear operator.
Naval Shipyards. Similar to the process in a nuclear ship or submarine, the naval shipyards conduct a formal critique after a significant event. These follow a script and produce an analysis of causal factors and corrective actions.
Naval Safety Center. There is a wealth of information on the Naval Safety Center’s web page, although it is difficult to find and not tailored toward operational root cause analysis.
Operational Risk Management (ORM). We include ORM in our briefs for major evolutions, but in some cases this becomes an administrative “check in the block” instead of a true discussion with real decisions being made based on the analysis. A repository of near misses would add a dose of reality to the ORM process.
Afloat Culture Workshop. A team of Navy Reserve O-6s visits ships and squadrons and conducts a detailed interview process to determine whether a command has a culture of safety and procedural compliance, among other things. This superb process can provide insights, although the results stay with the command.
Commercial Industry. Several commercial industries have processes in place to share information with professional peers and inside the organization. The Federal Aviation Administration, National Transportation Safety Board, and Nuclear Regulatory Commission all have experience that could be leveraged.
Surface Warfare. Surprise! There are several possibilities already resident in the existing surface warfare sphere. Surface Warfare magazine, generally a showcase for good-news stories and professional discussions about programs and policy, could be revamped to mirror Approach magazine. SWOS, Navy Leadership School, and Afloat Training Groups all offer superb venues to host discussions—and perhaps provide some level of consolidation and technical review—but they have not been centrally or formally tasked to do so. Each of these organizations has robust web pages with “toolboxes” that could be tailored for meaningful lessons-learned discussion.
Principled, Not Pejorative, Learning
The fact that these venues all exist, yet there is no robust discussion of near-miss data in surface wardrooms, speaks to our culture. It is second nature for an aviator or a nuke to share his or her latest mistakes with the community to prevent peers from making the same errors and perhaps causing damage or injury. As SWOs, the tendency is to debrief internally within our crews but never to let it slip out to our peers or bosses that we made a mistake. This lack of a lessons-learned culture has an important secondary effect on our community.
Our junior officers may think, “Wow, this community cannot tolerate a mistake.” Although my personal experience—unfortunately there is not enough space here to list all my mistakes—clearly would belie this conclusion, the perception has serious negative implications for retention and morale, as often is reflected in surveys. But it does not have to be this way. Accepting risks and seeking to learn from our mistakes would go a long way toward making the surface warfare community more attractive to young officers.
Creating a learning culture requires the commitment and involvement of all members of the surface warfare community, from the most senior commanding officers to the most junior watchstanders. Anything less will result in a paper program with no depth and no value. A 2014 Navy Retention Survey found that 68.7 percent of respondents believe positions of senior leadership, specifically operational command, are less desirable because of increasing risk aversion.4 How sad to see superb young officers “jumping ship” because they do not trust the community to stand behind them if they falter. Change can only occur when leaders at all levels not only talk about learning but also show by their actions that they are willing to share their own mistakes and learn from others—without attribution or pejorative behavior.
Fewer Mishaps, Stronger Professionals
In the end, none of the tools mentioned here will change a culture; change must come from within the community. When I began teaching young officers at SWOS after retirement, I quickly learned that key points are often best driven home by a good sea story, but these are anecdotes. What is required is a process—and someone to take ownership of it—to focus the talent in organizations that already exist. The format could be taken from the nuclear “Incident Report” form and posted under various operational categories for a designated set of experts to review. A strong process would combine formal training on root-cause analysis—a mix of science and art—and the ability to share lessons in some readable and accessible forum (magazine, webpage, etc.) that is incorporated in all levels of the SWO pipeline. Finally, critical to any chance of success is that whatever process evolves, it must be owned and executed by members of the surface warfare community. Manual of the Judge Advocate General (JAGMAN) investigations and mishap reports, generated by lawyers or safety experts, have an important place in the exploration of causes and corrective action, but these do not generate the necessary conversation or carry the gravitas that would accompany the product of credible operational experts.
During my last deployment, I shared the story of a close call during underway replenishment with my operational commander. His reply was encouraging: “This type of openness is of great benefit to making all us surface warriors better. I don’t know anyone who has not had similar experiences of one sort or another.” Fostering openness and a willingness to share mistakes, root causes, and corrective actions with less fear of backlash would be a big step toward creating a positive surface warfare culture to strengthen our profession, prevent serious mishaps, and save lives.