Mishap rates in the Navy’s surface fleet are trending downward, but unified data collection and analysis can help move safety to the next level.
Operating forward in dangerous waters has been the mission of the U.S. Navy since its founding in 1775. At sea, two types of dangers threaten naval forces: risks specific to the mission, including enemy action but also those incurred in the course of humanitarian operations; and risks inherent in operating at sea, even during peacetime. Just as accidents can happen during routine driving, ships can experience mishaps during routine at-sea operations. Mishaps are costly in terms of lives lost, injuries, and dollars expended in ship repairs, but they also often have far-reaching secondary effects on readiness across the fleet, deployment schedules, and attitudes toward risk.
The overall mishap trend within the U.S. Navy’s surface force actually has been downward since the mid-1980s, in large part because of the fleet’s efforts to improve operational readiness and safety. Nonetheless, 2017 was a bad year.
As the “Strategic Readiness Review” asserts, many of the lessons learned in the subsequent investigations already had been documented on numerous previous occasions, yet the problems continue to occur.1 Absent a unified data collection and analysis strategy, efforts to improve safety have reached their limit. The next steps toward a safer Navy will require the coupling of quantitative data with the qualitative experience of senior leaders. The surface force needs to adopt data collection and analysis procedures similar to, and in some cases surpassing, those employed by the aviation and submarine forces, as well as by industry.
The Mishap Dataset
Quantitative analysis requires data. The Naval Safety Center has a comprehensive data set of mishaps at sea involving the surface fleet, and public data from the Naval History and Heritage Command shows ship levels by year. There is no single, consolidated, publicly available source for the time ships spend sailing at sea, but some general information can be gleaned from the amounts of fuel purchased, as shown in the Navy’s budget documents.2 Having to reconcile these disparate data sets to reconstruct ships’ histories before mishaps is itself an area for improvement.
The mishap data from the Naval Safety Center consists of 1,373 records representing 1,072 mishaps spanning 21 January 1970 through 29 March 2016.3 Two notable mishaps from 2017—involving the USS John S. McCain (DDG-56) and USS Fitzgerald (DDG-62)—were added based on publicly available information.4
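The gap between record and mishap counts (both ships in a collision file a report) implies a deduplication step before any counting. A minimal sketch, using hypothetical record fields rather than the Safety Center's actual schema:

```python
# Hypothetical mishap records: two ships involved in the same collision
# share an event identifier, so both reports must collapse to one mishap.
records = [
    {"event_id": "1984-017", "ship": "USS Example A"},
    {"event_id": "1984-017", "ship": "USS Example B"},  # same collision
    {"event_id": "1991-003", "ship": "USS Example C"},
]

def count_mishaps(records):
    """Count unique mishap events, not individual ship reports."""
    return len({r["event_id"] for r in records})

print(len(records), "records ->", count_mishaps(records), "mishaps")
```

The same event-keyed approach would let analysts join both ships' reports into a single, richer picture of each collision.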
The first notable feature in this data set is not the number of mishaps in recent years, but rather the sharp downward trend in mishaps since their peak in the mid-1980s. (See Figure 1.) The Navy’s trend in serious incidents, as measured by these plots, is decreasing, although there are still outlier years.
The second thing of note, although it is harder to see, is that 2017 was indeed a bad year for the surface fleet. The question this raises—which is a leadership question as much as a statistical one—is: Was 2017 an isolated “blip” in an overall downward trend, or an inflection point at which mishaps begin to increase?
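One standard way to frame the “blip or inflection point” question is to ask how surprising the 2017 count would be if the underlying mishap rate had not changed. A minimal sketch, assuming an illustrative baseline rate and count (not the actual figures):

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), i.e., the upper-tail probability."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k))

# Hypothetical baseline: an average of 2 serious mishaps per year,
# against an observed count of 5. Numbers are illustrative only.
baseline_rate = 2.0
observed = 5

p = poisson_sf(observed, baseline_rate)
print(f"P(>= {observed} mishaps | rate {baseline_rate}) = {p:.3f}")
```

A small tail probability would suggest the year is genuinely anomalous; only subsequent years can distinguish a blip from a trend change.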
Fleet Size and Steaming Days
The next elements to consider are the surface fleet’s size and operating profiles. (See Figure 2.) Data on the operating profile of the U.S. Navy was compiled from budget documents. Fuel purchase (in barrels) is used as a proxy for steaming days. Fuel purchase data has the advantage of being useful without correcting for inflation or needing to determine the cost of fuel in a number of locations. It has the disadvantage of not recognizing the differences in fuel consumption across classes of ships and, by definition, does not include nuclear-powered aircraft carriers.
This section is equally notable for what information is not present: for example, the fleet’s operations are not broken out into training versus operational days. In addition, there is no way to know the career histories of key watchstanders, such as the officer of the deck, conning officer, and even the helmsman, prior to mishaps. While not part of the current surface warfare culture, this data is kept in detail by aviators, and it perhaps should be part of the requirement for the future of safety.
Mishaps and Steaming Days
Although the timelines for this analysis are admittedly short, there does not appear to be a strong relationship between the Navy’s fuel procurement (as a proxy for steaming days) and mishap rates. (See Figure 3.) Various possible causes for the recent mishaps have been offered—from the gutting of Surface Warfare Officers School, to sleep deprivation, to XO/CO Fleet-Up. The data, however, does not support the conclusion that any of these factors has increased the surface fleet’s mishap rate.
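The fuel-versus-mishap comparison comes down to a simple correlation check across years. A minimal sketch, with illustrative (not actual) yearly figures:

```python
# Hypothetical yearly data: fuel purchased (millions of barrels) as a
# steaming-day proxy, and serious mishap counts. Illustrative numbers only.
fuel = [30.0, 28.5, 27.0, 29.0, 26.0, 25.5]
mishaps = [4, 6, 3, 5, 2, 4]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"r = {pearson_r(fuel, mishaps):.2f}")
```

With only a handful of data points per series, even a moderate correlation would carry little statistical weight, which reinforces the article's caution about short timelines.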
So instead of looking backward at what the Navy has done, it may be more useful to take a forward-looking approach to see what can be done given the constraints of time, cost, and operational schedules to improve safety.
Making the Next Breakthrough
To drive down the mishap rate further, the surface fleet needs better data to gain insight into the factors that result in mishaps and near mishaps. While many top-level leaders in both the Navy and government are excited about the prospects of artificial intelligence and machine learning, these tools are only as good as the data that feeds them. No algorithm can make up for missing data. Simply put, if analysts knew more about the ships that had mishaps, they would be better able to prevent future occurrences.
As the “Strategic Readiness Review” recommends, the Navy must become a true learning organization:
Navy history is replete with reports and investigations that contain like findings regarding past collisions, groundings, and other operational incidents. The repeated recommendations and calls for change belie the belief that the Navy always learns from its mistakes. Navy leadership at all levels must foster a culture of learning and create the structures and processes that fully embrace this commitment.
The Navy has arrived at an inflection point where:
• The underlying relationships among causal factors in mishaps have reached a level of complexity that is not penetrated by the Navy’s current tools.
• Improving safety at sea is an ideal test bed for learning how to use and implement advanced analytics.
It is impossible to eliminate all risk of mishaps (short of mooring the fleet to the pier), but improved data collection and analysis is the next step toward improving safety. The surface fleet can take—and improve on—practices from the aviation and submarine communities, as well as leverage government, nongovernment, and commercial organizations that possess significant expertise in pre- and post-mishap factor analysis to define the data that would be most relevant to collect.
Data Collection Supports Safety at Sea
The data in the Safety Center’s database is focused on physical systems and the environment—e.g., ship class, date, weather, sea state. There is little information readily available on the human element—such as the qualifications of the watchstanders, their time on board, time in rank/rate, and their nutrition and sleep patterns over the 72 hours prior to a mishap.
Of course, each additional data requirement takes manpower. Collecting data is a “manpower tax” on an already taxed system, and the surest way to failure will be to levy another requirement on the surface force—particularly if the hours spent collecting data are taken from a sailor’s sleep time or if sailors do not see the collection effort as valuable. In short, the surface fleet should not just collect more data, but rather it should make a targeted effort to collect more useful data.
This collection effort might be a good place to begin implementing some of the recommendations from the Naval Research Advisory Committee (NRAC).5 The NRAC report cites the need for greater data, autonomy, machine learning, and deep learning. The focus of the report is combat efficiency, but the Navy could gain institutional experience and understanding of these techniques either before or in concert with implementing them on combat systems. There will be growing pains, and artificial intelligence and machine learning will not solve every problem.
By harnessing the experience and expertise that other organizations have developed, the Navy can begin to identify the factors/variables it should be observing and the data it needs to collect from warships and sailors. Analysts then can move forward to capture that data in a way that minimizes the manpower/time tax on crews while offering them the tools to better understand how a variety of factors increase or decrease risk.
Watch Bill Creation: An Example
Consider, for example, shipboard watch bill assignments. Currently, the senior watch officer and senior enlisted watch bill coordinator work to craft a basic underway watch bill, which the senior watch officer then (manually) evaluates based on his or her experience. The watch bill then is reviewed by the executive officer and commanding officer, who apply their professional experience. In practice, everyone is relying on their accumulated experience and memory. This does not provide the commanding officer and his or her subordinates with tools to fully understand the level of proficiency/experience/performance of those sailors on the watch bill and how that might impact risk.
Imagine if the surface navy took a note from the aviators and developed a user-friendly expert system that maintains the watchstander electronic record jacket for every sailor. The system would track the number of watches stood and the nature of those watches (daytime, nighttime, special evolution, etc.), as well as the watchstander’s level of proficiency/experience and how much scheduled rest the person had per cycle. When preparing a watch bill, the senior watch officer would use this system to develop an optimized solution, balancing risk and policy. The senior watch officer would then apply his or her experience and knowledge to evaluate the watch bill, and the commanding officer would use his or her accumulated operational experience to finalize it.
This watch bill would have the benefit of an expert system’s ability to rapidly process hundreds of watchstander records. The critical, irreplaceable human experience and judgment would be the final step, applying quality control for intangible factors. Done right, this watch bill would be superior to one developed using the current process. Moreover, it would enhance post-mishap (or post-near-mishap) analysis by providing a deeper and richer set of causal data to support comprehensive safety and lessons learned inquiries.
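At its core, such an expert system would reduce each candidate assignment to a risk score the senior watch officer can compare across sailors. A toy sketch of one such scoring function—the field names, weights, and thresholds are entirely illustrative assumptions, not any fielded Navy system:

```python
def watch_risk_score(watches_stood, night_watches, hours_rest_last_24):
    """Toy risk score for a proposed watch assignment; lower is better.

    Weights and saturation points below are illustrative assumptions.
    """
    experience = min(watches_stood / 50.0, 1.0)      # saturates at 50 watches
    night_exposure = min(night_watches / 20.0, 1.0)  # saturates at 20 nights
    rest = min(hours_rest_last_24 / 8.0, 1.0)        # 8 hours = fully rested
    return round(1.0 - (0.4 * experience + 0.2 * night_exposure + 0.4 * rest), 2)

# A seasoned, rested watchstander versus a tired newcomer:
print(watch_risk_score(watches_stood=60, night_watches=25, hours_rest_last_24=8))
print(watch_risk_score(watches_stood=5, night_watches=1, hours_rest_last_24=4))
```

Even a simple score like this makes the risk trade-offs explicit and auditable; the human review step described above would then catch the intangibles no score captures.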
While leadership has been instrumental in improving the surface fleet’s safety record, further improvements will require the addition of advanced techniques. The final question for operations research professionals is “How will we know we were successful?” We propose two metrics:
• That the mishap rate continues to decrease to the point where the Navy has periods spanning multiple years without a serious mishap.
• That data-driven analytics supports leaders making decisions to improve safety, and this mind-set spills over into other areas of the surface fleet, to include operations and planning.
The effectiveness of data-centric safety improvements should be evaluated against these metrics. The time to start moving in this direction is now.
Authors’ Note: All things in data analysis move quickly, and safety is no exception. As this article was going to press, the Naval Safety Center announced the stand-up of a Data Analytics Office (https://news.usni.org/2018/06/13/naval-safety-center-standing-data-analytics-office-amid-surface-aviation-mishap-increases). We are hopeful this will help the surface navy—and the entire force—reduce mishaps by squeezing as much learning as possible from events at sea, and we are eager to see the results.
1. ADM Gary Roughead, USN (Ret.), and Michael Bayer, “Strategic Readiness Review,” U.S. Navy (December 2017).
3. The mismatch in numbers is because when two ships collide, both report the mishap.
4. Department of the Navy, “Memorandum for Distribution,” 1 November 2017, https://news.usni.org/2017/11/01/uss-fitzgerald-uss-john-s-mccain-collision-report.
5. Naval Research Advisory Committee, “Autonomous and Unmanned Systems in the Department of the Navy,” September 2017.
⎯ Captain Calfee, a surface warfare officer, is currently the Federal Executive Fellowship program’s U.S. Navy Fellow at the Center for Strategic and Budgetary Assessments, a nonpartisan defense policy analysis think tank. He previously served as commanding officer of the USS McCampbell (DDG-85), homeported and forward deployed in Yokosuka, Japan.
⎯ Commander Schramm is a senior fellow at the Center for Strategic and Budgetary Assessments in Washington, DC. A former naval helicopter pilot, he now works as a statistician at the intersection of data, mathematics, and policy.