Caught Off Guard
Following the July 2016 failed coup attempt in Turkey, the Sixth Fleet commander asked his staff several hard questions:
“Did we have any indications of a coup?”
“Did we see any anomalies that would have triggered watchstanders in our maritime operations center (MOC) to alert us of an impending crisis?”
“Did the naval attaché or any members of the country team in Turkey see indications of an impending coup?”
There were no clear answers. As in the 2008 war in Georgia, the 2011 Arab Spring, and the 2014 Russian annexation of Crimea, the U.S. Navy was caught flat-footed. This prompted an effort to determine how the Sixth Fleet staff could improve foresight of geopolitical events.
Navy education emphasizes the importance of incorporating political, military, economic, social, informational, and infrastructure (PMESII) data sets into our worldview. However, the Navy often encourages operations and intelligence personnel to think in reductionist terms, focusing on maritime activity and datasets. Integrating and synthesizing the broader facets of PMESII requires a mindset geared toward fusing disparate data streams into meaningful, actionable information for Navy commanders.
The tendency to think in reductionist terms (i.e., only from the maritime perspective) is shortsighted in today’s precarious world, and the Navy must explore alternative solutions. Creating a culture that embraces high-velocity learning requires rapid, critical thinking throughout operational and intelligence assessment methodologies.
The assessment approach used in Navy MOCs today is outdated and ineffective at providing fleet commanders with a comprehensive, holistic understanding of the operational battlespace. This deficiency degrades the staff’s ability to forecast events. Staffs at the operational level of war are overwhelmed by a vast volume of maritime-centric information focused primarily on the “here and now,” and they are unable to piece together disparate bits of information in a clear and concise manner. This erodes their ability to provide a fleet commander with valid, in-depth analysis of adversary intentions or predictive insight into impending crises.
One primary reason for this deficiency is the sharp separation between intelligence assessments (IAs) and operational assessments (OAs). This stovepiping undermines the commander’s ability to make wise, timely decisions because the operational-level analysis lacks depth and understanding of the other strategic events and factors affecting naval operations. The physical and intellectual gap between IA and OA reduces the staff’s appreciation of an adversary’s capabilities and intentions and denies a fleet commander the prognostic analysis and comprehensive assessment framework necessary to outpace adversaries.
With Chinese naval activity in the South China Sea; a resurgent Russian maritime presence throughout the eastern Mediterranean, Black, and Baltic seas; and the expansion of violent extremist organizations (VEOs) throughout the Middle East and Africa, too much is at stake. The Chairman of the Joint Chiefs of Staff, Marine General Joseph Dunford, recently noted “the high likelihood that any future conflict will be transregional, multidomain, and multifunctional,” emphasizing the importance of assessments to the warfighting commander.1
The Naval Special Warfare Model
The dynamic nature of warfare does not afford a decision maker the necessary time to assess potential crises. Since 11 September 2001, successes against asymmetric enemies such as al Qaeda and the Islamic State (ISIS/ISIL) have shown the utility of combining intelligence and operations planners in one team from the earliest stages of mission planning. The Naval Special Warfare (NSW) community is a good example of a symbiotic relationship between intelligence and operational analysts that provides leaders with holistic assessments. NSW prides itself on nurturing tight links between intelligence and operations personnel, with everyone intimately involved in all aspects of the planning process from inception (e.g., commander’s intent) to debrief and after-action report. The NSW construct provides the commander with one complete, comprehensive assessment that includes predictive analysis and foresight into enemy capabilities and intentions.
What makes the NSW assessment process more effective? From the outset, intelligence-operations planning teams are trained to analyze the entire battlespace beyond the maritime dimension. The mission is viewed from a broader, non-DOD and interagency perspective that accounts for a mission’s impact on secondary and tertiary geopolitical events. For example, PMESII data continuously feed into NSW planning, especially during high-visibility operations to kill or capture terrorist leaders. NSW incorporates and dynamically updates PMESII at all levels of intelligence preparation of the battlespace (IPB) and captures broad elements of PMESII in a manner that provides valuable insights and lessons learned.
The conventional Navy may be well served to incorporate the unique and innovative traits inherent in NSW’s application of PMESII. One option for Navy consideration is to build and expand upon the fused NSW intel-ops model and embrace operations research systems analysts (ORSA) from the OA community.
The Whole Picture
“If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle.”
– Sun Tzu
This age-old military maxim provides a keen insight into the benefits of an integrated assessment methodology that capitalizes on intelligence and operations processes. Understanding an enemy as well as oneself is achieved with timely, relevant, and collectible information derived from various PMESII-based sources to inform a commander’s decision matrix. All-source fused intelligence is foundational knowledge of the enemy. Integrating PMESII data from a multitude of traditional and nontraditional channels provides strategic- and operational-level context. Understanding the blue force weapons, tactics, strengths, logistics, weaknesses, and limitations adds the required comprehension of own force effectiveness and efficiency. Together all these elements ultimately provide a rich overall assessment.
Integrating IA and OA within the Navy’s MOCs is a major undertaking. Cultural and bureaucratic inertia hinders the application of new approaches. The Navy’s current assessment culture has difficulty integrating OA and IA at the operational level. Separate assessments focusing on specific aspects of a problem set are extensions of Western reductionist philosophy, which condenses and compartmentalizes a difficult situation or crisis into component elements. Intelligence analysts are trained to study an issue or a problem as a sum of individual parts. The integration of those parts—what Naval Intelligence refers to as all-source analysis and fusion—is completed at large intelligence centers (e.g., joint intelligence-operations centers). This usually occurs geographically separated from the numbered fleet commanders and their operations research systems analysts. OA teaches quantitative and qualitative synthesis, which not only draws on non-maritime features of PMESII but also takes dissimilar elements of a problem set and attempts to draw actionable, evidence-based conclusions within an all-inclusive context.
OA and IA provide inputs into the commander’s decision-making process, yet intelligence assessments often weigh more heavily in the decision cycle than operational assessments. Why? OA typically occurs after the fact. Data to support the measures and indicators within an OA framework are redirected to the operational assessors after completion of tasks and missions. This latency affects the timeliness and the overall value of the assessment. IA, which is based on various intelligence feeds, can happen in near-real time—depending on the means and method of collection. Information to support OA should be integrated into the collections process to ensure that robust assessment of both blue and red force activities is occurring, and that real-time data influences operational-level decisions toward overarching objectives and desired effects.
Successful assessments require filtering out irrelevant information and capturing key indicators that inform the right measures, to assess progress toward the desired end-state. The commander’s end-state is defined in terms of objectives and effects. Effects nomenclature focuses the application on key words such as destroy, deny, and neutralize—with the subject of the effect statement being the adversary—and cooperate, or cease support, with the subject being non-friendly or neutral organizations and populations.2 Effects are conditions that the joint force commander or joint force maritime component commander must create to achieve objectives. It is the OA cell’s job to leverage PMESII information from multiple sources (including intelligence) to draw conclusions regarding the extent to which those effects and stated objectives are being achieved—especially if the effects are intended to create behavioral change within the enemy.
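The objective–effect–measure–indicator hierarchy described above can be sketched as a simple data structure. This is an illustrative sketch only; the class names, example statements, and indicators are hypothetical and are not drawn from doctrine:

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    """A measure of effectiveness and the indicators that inform it."""
    name: str
    indicators: list  # observable data points that feed this measure

@dataclass
class Effect:
    """A condition the commander must create to achieve an objective."""
    statement: str  # e.g., a deny/neutralize statement about an adversary
    subject: str    # adversary, neutral, or friendly actor
    measures: list = field(default_factory=list)

@dataclass
class Objective:
    statement: str
    effects: list = field(default_factory=list)

# Hypothetical example: one objective decomposed for assessment.
obj = Objective(
    statement="Deny adversary freedom of maneuver in the littorals",
    effects=[
        Effect(
            statement="Adversary surface groups remain outside area Y",
            subject="adversary",
            measures=[Measure("Surface group proximity",
                              ["track positions", "port departures"])],
        )
    ],
)

# An OA cell would walk this tree, scoring each measure from
# collected indicator data to judge progress toward the objective.
for eff in obj.effects:
    for m in eff.measures:
        print(f"{obj.statement} -> {eff.statement} -> {m.name}")
```

The value of the structure is traceability: every indicator collected can be traced upward through a measure and an effect to the objective it supports, which is what lets an OA cell say how far the commander’s end-state has been achieved.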
OA is defined as, “a continuous process that supports decision making by measuring the progress toward accomplishing a task, creating a condition, or achieving an objective.”3 As practiced, the planning, conduct, and implementation of OA is largely left to the operational assessments team. The commander trusts the cell to forecast, anticipate, and support assessment requirements. This activity often takes place away from and independent of the intelligence cycle.
Combining operational assessment science, operational art, and intelligence assessments will provide commanders better situational awareness of the battlespace. They will be better positioned to direct or support the effects of subordinate units and, simultaneously, support the strategic objectives of the geographic combatant commander.
The current stovepipe approach to assessments—partly a byproduct of the antiquated Napoleonic staff-code structure—hinders accountability, constricts the assessment process, and degrades the commander’s ability to receive timely, proactive, and predictive analysis of the adversary. “Stovepipe” implies an organizational structure that inhibits cross-coordination amongst the various staff sections. Navy leaders are attempting to overcome this organizational barrier by creating cross-functional teams (CFTs) within the MOCs. This horizontal, collaborative structure should foster better assessments by enabling staff sections to overcome barriers, better understand other staff activities, and provide a forum to self-assess and report on their own actions.
Intelligence requires some separation between staff elements to protect highly classified sources and methods. Nonetheless, the cross-functional nature of an OA cell comprised of personnel with appropriate security clearances and “need-to-know” simplifies cross-pollination between intelligence and operational assessments. During exercise Juniper Cobra 2016 on board the Sixth Fleet flagship, USS Mount Whitney (LCC-20), the operational assessment data collection matrix was cross-referenced and underwent revision in coordination with the joint intelligence collection plan. The collaboration between intelligence analysts in the maritime intelligence center and the OA team resulted in an integrated collection plan, which reduced duplication of effort and led to a more effective application of scarce collection assets.
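The Juniper Cobra cross-referencing described above amounts to comparing two tasking lists and flagging overlap. A minimal sketch of that idea follows; the requirement names, field labels, and platforms are hypothetical placeholders, not actual collection-plan formats:

```python
# Hypothetical OA data-collection matrix: indicators the OA cell needs,
# with the asset currently tasked to collect each one.
oa_matrix = [
    {"indicator": "port activity at location A", "asset": "maritime patrol"},
    {"indicator": "surface group movement", "asset": "surface combatant"},
]

# Hypothetical intelligence collection plan covering some of the same needs.
intel_plan = [
    {"requirement": "port activity at location A", "asset": "national systems"},
    {"requirement": "coastal radar emissions", "asset": "airborne ISR"},
]

def find_overlap(oa, intel):
    """Return OA indicators already covered by the intel collection plan."""
    intel_reqs = {item["requirement"].lower() for item in intel}
    return [row for row in oa if row["indicator"].lower() in intel_reqs]

overlap = find_overlap(oa_matrix, intel_plan)
for row in overlap:
    # Each hit is a candidate for de-duplication: one tasking can be
    # dropped and the scarce asset freed for another requirement.
    print(f"Duplicated tasking: {row['indicator']}")
```

In practice the matching would be fuzzier than exact string comparison, but the design point is the same one the Sixth Fleet example makes: a single merged collection plan exposes duplication that two stovepiped plans hide.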
Time for Change
A few weeks after the failed coup attempt in Turkey, Sixth Fleet hosted a conference for U.S. naval attachés and Offices of Defense Cooperation in Naples, Italy. Attendees were briefed on the application of various forecasting models prevalent across academia and the private sector. One model presented was the Good Judgment Project, Wharton Professor Philip Tetlock’s landmark study on forecasting. Tetlock’s study recruited tens of thousands of ordinary people to forecast global events. In some cases, the masses of non-experts outperformed the collective judgment of intelligence analysts with access to classified information. Tetlock identified a pattern of success that involved gathering evidence from a variety of sources, thinking probabilistically, working in teams, and being willing to admit error and change course. His study outlined the first demonstrably effective way to improve the ability to predict the future—whether in business, finance, politics, or international affairs.4
How can a Navy numbered fleet staff successfully forecast significant geopolitical events? Tetlock’s study shows that the key to effective assessments is access to a broad assortment of source information—not just maritime or classified information—underscoring the importance of PMESII. Integrated assessments, fueled by information across the spectrum of PMESII, support the commander’s decision cycle.
At the operational level, a collective shift in mindset is required: critical thinking must be applied to the “daily grind,” forcing intelligence and operations analysts to rise above “the noise” of raw data and unevaluated maritime information and stitch together various sources of information, as was done in the Good Judgment Project.
Navy academic institutions, school houses, and centers of excellence need to establish the foundations for intelligence and operations personnel to cultivate linkages and networks outside of orthodox DOD channels. In other words, the Navy must advocate critical thinking at a broader, macro-level viewpoint that promotes the expansion of relationships away from exclusively traditional maritime sources to include global, diplomatic, and financial channels of information.
OA provides ways and means from which to derive an integrated assessment through collaboration during the planning process. Commanders should define their vision for fused assessments and direct their staffs in achieving it. Collaboration and self-awareness can then foster assessments of both enemy and friendly forces through shared data and assessment processes.
By combining IA and OA, would the fleet MOCs be able to identify indicators or anomalies to forecast major events like the failed coup attempt in Turkey or the Arab Spring? Currently there is no clear answer, but the status quo is unacceptable. Navy leaders must demand closer cooperation between these two assessment communities or continue to be caught off guard and surprised by crises.
Leveraging the information generated by the intelligence community and integrating it with OA is the next step needed to improve the Navy’s performance at the operational level of war.
Restructuring the Navy’s assessment framework requires a cultural shift. Assessment practitioners should understand and evaluate Navy-centric problems through the broader lens of PMESII, which encourages synthesis and balances quantitative and qualitative systems analysis into the commander’s overall decision cycle. This method will unite the commander’s understanding of enemy capability, intent, and operations; friendly forces’ status; and risk assumptions. A combined IA/OA approach will improve analysis and foresight by enhancing the commander’s ability to view the battlespace at a higher resolution.
1. Jim Garamone, “Dunford Discusses Challenges to the Joint Force, Need for Defense Reform,” DoD News, Washington, DC, 29 March 2016.
2. Commander’s Handbook for an Effects-Based Approach to Joint Operations, Joint Warfighting Center, 24 January 2006.
3. Ibid.
4. Philip E. Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction (New York: Broadway Books, 2015).
Commander Moghaddam is a career naval intelligence officer who served as the director for assessments at U.S. Naval Forces Europe/Africa and U.S. Sixth Fleet in Naples, Italy from 2015 to 2017. He is now the strategy advisor to the commander’s action group for U.S. Naval Forces Europe/Africa and Allied Joint Force Command in Naples, Italy.
Mr. Schoch has a decade of experience integrating intelligence and operations data to support assessments. During his career as a Navy civilian, he has served at the Naval Surface Warfare Center, Corona Division; U.S. Naval Forces Central Command; Naval Special Warfare Unit Ten; and Commander Carrier Strike Group Twelve. He holds a master’s degree in National Security Studies from California State University.