In an era of great power competition, the Navy must not fall prey to an “amiable illusion” that its superior technology, inherent courage, or proud traditions are sufficient to ensure success in the coming fight.1 Potential adversaries also have advanced technology, brave sailors, and lauded traditions. The Navy’s enduring competitive advantage is the quality and effectiveness of its officers and sailors—an advantage that must be maximized through consistent, systematic, and high-quality training. As technology improves to help leaders diagnose, recognize, and adapt training regimens to correct or improve behavior at ever-higher resolution, the Navy must leverage such tools or risk being left in the competitive dust of a more foresighted adversary.
An unusual pairing of competitive domains illuminates both the pressing need for the Navy to adopt advanced personnel performance tracking, documentation, and analytical tools and the feasibility of doing so. Technology and best practices used by world-class athletes, along with elite video game champions, chart the way ahead for an evolution in naval warfighting training—and increased fleet lethality.
On the Field
With the influx of “geeks” into major league baseball front offices following the success of general manager Billy Beane’s Oakland A’s (chronicled in the film Moneyball), advanced analytics of player performance have increasingly permeated all competitive sports leagues. Searching for even a minuscule potential competitive advantage, teams of analysts sift through terabytes of data collected by ubiquitous sensors. The analysis of this data influences roster selections, strength-training regimens, and game plans. Each team desperately seeks to put the odds, however slim, in their favor.
The principle is simple. Instead of relying solely on a human coach to observe, track, and analyze a player’s performance, teams are supplementing human observation with precise digital performance information. Working with commercial providers such as the Australian firm Catapult, they monitor players with a sophisticated, wearable device equipped with three-dimensional accelerometers, magnetometers, and gyroscopes. This provides each player’s precise position, velocity, acceleration, altitude, and other data, which are broadcast in real time to a computer with software to aggregate, analyze, and present useful information to the coaching staff.
For example, college football powerhouses routinely collect millions of lines of data during each practice that is then used to adjust game plans or starting lineups.2 Stopwatch times and the subjective personal assessment of a player’s readiness (“I’m still 90 percent, coach, put me in!”) are coupled with objective data on player performance to improve decision-making. An early proponent of this technology, Florida State University, secured a college football national championship in part due to the competitive advantage this information provided.
Consider a typical strength-training routine. A coach prescribes a series of exercises with weights, sets, and repetitions. Progress is tracked throughout the season, as the weight lifted hopefully increases as strength improves. However, except for Olympic or power lifters, raw lifting ability is not directly translatable to athletic performance. For many sports, power, defined as:
power = (force x distance)/time
is a more useful metric. Thus, the critical factor is not only the weight lifted, but also how quickly an athlete moves the weight across the distance traveled in an exercise such as the barbell squat. It is difficult, however, to calculate a barbell’s velocity with the human eye and a stopwatch—a contributing factor to the preference for maximum weight-lifting ability as the standard. But by using modern technology, elite programs now monitor barbell speed and hold their athletes to previously impossible-to-track standards in hopes of improving their on-field performance.3
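The arithmetic behind a velocity-based standard is straightforward. The sketch below uses hypothetical numbers (a 140-kg squat through 0.6 meters of bar travel) to show how two athletes lifting the same weight can produce very different power outputs depending on bar speed:

```python
# Sketch: estimating average concentric power for a barbell lift.
# All values are hypothetical; a bar-speed sensor would supply the
# displacement and time for each repetition.

G = 9.81  # gravitational acceleration, m/s^2

def average_power(mass_kg: float, distance_m: float, time_s: float) -> float:
    """Average mechanical power (watts) to move a load a given distance."""
    force_n = mass_kg * G          # force needed to support the load
    work_j = force_n * distance_m  # work done over the lift
    return work_j / time_s         # power = work / time

# Two athletes squat 140 kg through 0.6 m of bar travel:
fast = average_power(140, 0.6, 0.8)  # completes the rep in 0.8 s
slow = average_power(140, 0.6, 1.4)  # completes the rep in 1.4 s
print(f"fast rep: {fast:.0f} W, slow rep: {slow:.0f} W")
```

Same weight, same distance, but the faster athlete generates roughly 75 percent more power—a difference invisible to a coach with a clipboard but trivial for a sensor to capture.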
In the Basement
In a less brawny form of competition, elite video gamers use advanced training technology to improve their gameplay, seeking victory and exorbitant cash prizes. To gain an edge, they supplement their exhaustive training regimens with additional sensors and advanced analytical tools. Instead of improving power generation, running velocity, or lateral acceleration, competitive gamers seek to enhance their ability to direct their forces and focus their attention—skills similar to those needed in a modern naval combat information center (CIC).
Consider the game StarCraft II, a space-themed real-time strategy game in which players resource, build, and control a futuristic army with the objective of destroying a competitor’s base. The quantitative measure of a player’s physical ability to input directions to their computer is actions per minute (APM), which the game records based on detected mouse clicks and keyboard presses. The best players control their armies at blazing speeds of more than 600 APM, whereas a novice might play with a speed of 40 to 60 APM.4 Beyond simply playing faster, the best players must make the best use of their armies, combining split-second tactical and operational decisions to best their opponents.
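The APM metric itself is a simple rate computed from the input log. A minimal sketch, assuming a hypothetical list of timestamped input events such as a game client might record:

```python
# Sketch: computing actions per minute (APM) from a timestamped input log.
# The event list is hypothetical; a real game client logs each mouse
# click and key press with a timestamp.

def apm(event_times_s: list[float], duration_s: float) -> float:
    """Actions per minute over a session of the given duration."""
    return len(event_times_s) / duration_s * 60

# 50 inputs over a 10-second window, evenly spaced for illustration:
events = [i * 0.2 for i in range(50)]
print(f"{apm(events, 10):.0f} APM")
```

Computing the same rate over a sliding window rather than the whole session is what lets replay tools show where a player’s APM fell off mid-game.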
To support players improving their gameplay, most competitive games have post-game analytical replay tools. Players can view detailed graphs, charts, and analysis of what they did at each point in the game—where their APM fell off or improved, indicating where effort may be adjusted to potential advantage. Fan sites offer additional tools and methods to accompany in-game analysis, resulting in widely practiced strategies rapidly spreading throughout the game’s online community.
Another crucial aspect of gamers’ performance is where they focus their attention. Cameras designed to monitor eye movement and focus, such as the Tobii Eye Tracker, capture a player’s gaze on a semitransparent bubble overlaid on the normal game field. Post-game replays include the bubble overlay to help diagnose the effectiveness of the player’s attention and scan rate.
Both the athletic and video game tracking and analytical tools are mature, affordable technologies that have been used to great effect. The Navy must implement similar devices to better train its fighting forces.
Naval Training: Driving Toward Excellence
The revised Surface Force Readiness and Training Manual (SFRTM) improves on the previous iteration, but the certification requirements still include a battery of prescribed minimum standards. Although high-performing ships are able to complete the gauntlet of basic phase training ahead of schedule, simply meeting an unchanged checklist of minimum standards sooner in a training cycle does not equate to an absolute increase in warfighting readiness. Rigorous performance tracking and analysis technology would be a game changer.
For example, afloat damage control teams must apply shoring to simulated structural damage as part of the certification event for mobility damage control. The certifying grade sheet lists standards such as, “Was need for shoring identified?” “Was shoring constructed correctly?” “Was status of shoring reported?” “Was shoring watch set?”—along with an associated point value for accomplishing the task. As long as the damage control parties safely complete the shoring process as indicated on the grade sheet, they will meet certification standards.
The glaring omission in these standards is how long this process should take. Per current certification standards, a ship that can effectively apply shoring in half the time as another is awarded the same number of points.
An “excellence” standard should be based either on computer modeling of expected damage by a threat weapon or real-world casualties such as the bombing of the USS Cole (DDG-67). To continue the shoring example, assume that a Cole bulkhead experienced x lbs of pressure that increased with time and would have failed within y minutes had her watch teams not responded appropriately. Assume a properly applied “I-type” shoring provides z lbs of counterpressure, providing enough support to prevent the failure of the bulkhead. Thus, a realistic standard would require that the damage control team identify the need for and apply the requisite shoring in <y time. This time standard could then be trained to and improved on. Times could be broadcast throughout the fleet, with sailors able to share tactics with one another on how best to accomplish a task.
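The derivation of such a time standard is simple arithmetic once the damage model supplies the parameters. The sketch below uses entirely hypothetical placeholder numbers for the x, y, and z values above, assuming for illustration that flooding increases the load on the bulkhead at a constant rate:

```python
# Sketch: deriving a shoring time standard from a damage model.
# All numbers are hypothetical placeholders for the x, y, z values above.

initial_load_lbs = 2_000        # load on the bulkhead at time of damage
load_growth_lbs_per_min = 500   # assumed linear load increase from flooding
failure_load_lbs = 6_000        # load at which the unshored bulkhead fails
shoring_capacity_lbs = 3_000    # counterpressure from an I-type shore

# Minutes until the unshored bulkhead fails (the "y" in the example):
minutes_to_failure = (failure_load_lbs - initial_load_lbs) / load_growth_lbs_per_min

# Shoring applied in time raises the effective failure threshold:
shored_margin_min = (
    failure_load_lbs + shoring_capacity_lbs - initial_load_lbs
) / load_growth_lbs_per_min

print(f"Team must complete shoring in under {minutes_to_failure:.0f} minutes")
print(f"Properly shored, the bulkhead holds for {shored_margin_min:.0f} minutes")
```

With modeled rather than invented inputs, the same calculation yields a defensible, trainable time standard for each class of damage.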
A similar argument can be made for CIC watchstanders completing tactical certifications. Instead of simply correctly simulating an engagement of a hostile surface combatant with surface-to-surface missiles, analysis could determine how long in a Tactical Situation I environment a ship could expect an adversary to take to complete its F2T2EA (find, fix, track, target, engage, and assess) process, say x minutes.5 Fleet Tactics teaches that the Navy must “attack effectively first,” so watchstanders ought to train to the standard of <x minutes to ensure success in action. To reach that standard, individual watchstander inputs and gaze could be tracked to identify potential areas for improvement. Margins of victory will be razor thin—each second may count in battle. Automatic performance tracking tools can facilitate these precise measurements and make holding sailors and ships to such standards practicable.
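Scoring a watch team against such a standard amounts to comparing its measured detect-to-engage timeline with the adversary’s assumed kill-chain time. A minimal sketch, with both the adversary time and the phase breakdown as hypothetical values:

```python
# Sketch: scoring a watch team's detect-to-engage timeline against an
# assumed adversary F2T2EA completion time. All values are hypothetical.

ADVERSARY_F2T2EA_MIN = 6.0  # assumed adversary kill-chain time ("x")

def beats_adversary(own_timeline_min: dict[str, float]) -> bool:
    """True if the team's total detect-to-engage time beats the adversary's."""
    return sum(own_timeline_min.values()) < ADVERSARY_F2T2EA_MIN

# Phase times measured from a tracked training run:
timeline = {"detect": 1.0, "classify": 1.5, "target": 1.5, "engage": 1.0}
print(beats_adversary(timeline))
```

Because each phase is logged separately, the same data shows which step—detection, classification, targeting, or engagement—consumes the most margin and deserves focused training.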
Fielding Performance Tech Afloat
The individual movement sensors and network of data analysis computers could be implemented afloat at low cost. A typical contract for a major college football team costs in the five- to six-figure range, barely enough to register on a Pentagon financial analyst’s spreadsheet.6 Type commanders could solicit interest and fund a pilot program on a ship preparing for predeployment certification.
On the other hand, implementing enhanced user input analyzers and gaze trackers at tactical CIC watch stations could be troublesome because of integration issues and the need to maintain the integrity of the Aegis combat system. However, Aegis Virtual Twin potentially could provide this capability.7 The Virtual Twin replicates a ship’s combat system computer code in parallel with the real-world combat system, with significantly less hardware and space than required by the traditional architecture. Enhanced human input analyzers and gaze trackers could be implemented within this concept, allowing sailors to train on the Virtual Twin while maintaining the full operational uptime of the real-world Aegis system.
If this is impracticable, public naval warfare simulators, such as Matrix Games’ Command: Modern Air and Naval Operations running on a standalone laptop, could be coupled with the input and gaze analyzers to provide essentially the same training stimuli to most CIC watchstanders at marginal cost. A sufficiently powerful laptop, game license, and input and gaze analyzers could be fielded as a unit for less than $1,000.
Initially, the ship’s crew would be responsible for ensuring that watchstanders wear or use performance trackers during training evolutions; crews also should be able to view and manipulate the performance data. Available systems automatically broadcast and upload data to an online server for aggregation and analysis. A cross-functional team including human factors specialists, operations researchers, and data analysts from organizations such as the Naval Postgraduate School, OpNav N81, and the Center for Naval Analyses could analyze the data and provide feedback and insight to shipboard leaders.
Once predeployment requirements are complete, data from the training cycle could be used as a standard for subsequent repetitive exercises that ships must complete to maintain their training currency. A meaningful measure of performance improvement during the course of deployment would be available at the individual, team, and ship levels. Last, the ship could monitor its overall progress and objectively know that its warfighting readiness improved throughout the deployment—and celebrate that accomplishment.
The Navy is nearing a point where it cannot reliably depend on superior technology, increasing the quantity of training, or outbuilding adversaries for battle success.8 Its only enduring competitive advantage is the quality of its officers and sailors. Enhanced performance tracking tools can help maximize this edge.
The consequences of inaction will not be as trivial as the inability to move a leather ball down the field or to defeat a competitor’s virtual avatar in futuristic space combat—real people will suffer real harm because they were not held or trained to realistic, analytics-based standards of excellence. With modest investments and the cultural willingness to adapt training methods to the 21st century, the Navy can become an even more lethal, effective, and competitive fighting force.
1. Patrick O’Brian, The Ionian Mission (New York: W. W. Norton & Company, 1992), 78.
2. Marc Tracy, “Technology Used to Track Players’ Steps Now Charts Their Sleep, Too,” The New York Times, 22 September 2017.
3. Andy Staples, “Technology in College Football: Using Brains to Build More Brawn,” SI.com
4. Kevin Wong, “StarCraft 2 and the Quest for the Highest APM,” Engadget, 14 July 2016.
5. CAPT Wayne P. Hughes Jr. and RADM Robert P. Girrier, USN (Ret.), Fleet Tactics and Naval Operations, 3rd edition (Annapolis, MD: Naval Institute Press, 2018).
6. Rainer Sabin, “Inside the Technology Giving Alabama a Competitive Edge,” AL.com, 2 July 2017.
7. Megan Eckstein, “Artificial Intelligence Could Speed Up Navy Training as New Tech Is Rapidly Fielded,” USNI News, 21 February 2019.
8. David Axe, “The U.S. Navy Won’t Like China’s New Ship-Killer Hypersonic Missile,” The National Interest, 7 November 2018; David B. Larter, “U.S. Navy Worked around Its Own Standards to Keep Ships Underway: Sources,” Defense News, 7 September 2017; Dean Cheng, “China’s Pivot to the Sea: The Modernizing PLA Navy,” The Heritage Foundation.