In 2014, the RAND National Defense Research Institute published Data Flood: Helping the Navy Address the Rising Tide of Sensor Information, a report focusing on the overwhelming amount of data an intelligence analyst must filter through as a result of the ever-increasing number of intelligence, surveillance, and reconnaissance sensors. “Common wisdom among analysts is that they spend 80 percent of their time looking for the right data and only 20 percent of their time looking at the right data,” the report notes.1 In 2020, the speed and scope of data collection have never been greater, and the insatiable demand for accurate, timely intelligence analysis persists. If naval intelligence is to remain relevant and deliver penetrating insight and decision advantage against great power competitors, it must improve data analytics and invert the 80/20 paradigm.
Since the conception of the information warfare community, the professional trajectory of intelligence officers (designator 1830) has been toward broader and more generalized skills across the range of information warfare disciplines, as opposed to specialization in a particular geographic area or threat. Despite this shift, intelligence officers still are expected to be predictive and provide operators with accurate threat assessments. It is imperative, therefore, that the intelligence community develop the intuitive data analytic tools and practices to expand intelligence officers’ time for in-depth, predictive, cognitive analysis.
A successful analyst knows where to find data but understands “you don’t know what you don’t know.” Databases, applications, and tools to store and discover intelligence data are multiplying at a dizzying rate. Are there resources that are not being considered? Questions not being asked? Additional data that could better inform or substantiate an assessment? Time prevents an endless search, and at some point, an assessment will have to be made based on the data discovered. Artificial intelligence (AI), machine learning (ML), and deep learning (DL) applications can significantly reduce the time required to discover data, expand the aperture for relevant data discovery, and present information to intelligence analysts in a manner that allows for in-depth analysis of an adversary.
AI, ML, and DL are cyber and big data industry buzzwords that are frequently interchanged and confused, but they can be explained as follows:
- AI is “the broader concept of machines’ ability to carry out tasks in a way that we would consider ‘smart.’”2
- ML is providing machines with data and the means to learn for themselves (a subset of AI).3
- DL (a subset of ML) “allows applications to more accurately predict outcomes without specifically being programmed.”4
A simple analogy is a car approaching a traffic signal. ML would enable the car to determine if the light is red, yellow, or green. DL would enable the car to detect pedestrians, other cars, lanes, and weather and then predict future movement based on the interrelationship of all of these factors.5 The car’s ability to execute ML and DL functions and learn from the environment is an example of AI.
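The ML half of this analogy can be reduced to a minimal sketch in code. The snippet below is purely illustrative (the RGB reference samples, labels, and nearest-neighbor approach are invented for this example, not drawn from any fielded system): instead of hand-written rules for each color, the program classifies an observed signal by comparing it with labeled examples, which is machine learning in its simplest form.

```python
# Illustrative sketch of the traffic-signal analogy: a "learned" classifier
# reduced to 1-nearest-neighbor matching against labeled color samples.
# All values are hypothetical.

# Labeled training data: (R, G, B) samples with known signal states.
TRAINING_SAMPLES = [
    ((220, 30, 30), "red"),
    ((230, 200, 40), "yellow"),
    ((40, 200, 80), "green"),
]

def classify_signal(observed_rgb):
    """Return the label of the nearest training sample (1-nearest-neighbor)."""
    def distance(a, b):
        # Squared Euclidean distance between two RGB triples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TRAINING_SAMPLES, key=lambda s: distance(s[0], observed_rgb))[1]

print(classify_signal((210, 40, 35)))  # a reddish observation; prints "red"
```

A DL system, by contrast, would learn the distance function and features themselves from many layers of examples rather than having them specified, which is what lets it handle pedestrians, lanes, and weather simultaneously.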
In a similar manner, the Navy demands from its intelligence apparatus a comprehensive analysis of the operational environment. Intelligence must anticipate movements and derive intentions of adversaries through a deep understanding of the interrelationships of all factors in the maritime environment (weather, tactical situation, electromagnetic spectrum, pattern of life, and cyberspace, to name a few). ML and DL assist by surfacing relationships and presenting models while reducing the time an analyst must spend on data discovery. Reduced data discovery time allows for greater scrutiny of data analytic models and more rigorous analysis for thorough intelligence assessments.
In the 2019 AIM Initiative—a strategy for augmenting intelligence with machines—the Office of the Director of National Intelligence identified machine learning models as intelligence community assets. Treating data analytic models and algorithms as nonkinetic assets on par with offensive cyber weapons or kinetic ordnance would give them the significance required to secure funding for research, development, and maintenance in the Department of the Navy budget.
Admiral Scott Swift noted in May 2018 that the Navy must “move past the objective, measured realm of the science of warfare and into the subjective, uncertain realm of the art of warfare” and devote time and effort to “analysis of alternatives needed to be predictive.”6 It is critical that more time be devoted to cognitive analysis of an adversary and less to discovery of adversary data.
Admiral Swift’s remarks complement Cynthia Grabo’s writings in Anticipating Surprise: Analysis for Strategic Warning, where she distinguishes between measurable and intangible factors. Data analytics should discover the measurable factors (the scientific, objective measurements that can be extracted through data) to complement analysis of the intangibles (the uncertain human elements that influence an adversary’s motivations and intentions).7 An effective blend of AI-generated options, solutions, and models with analysts’ interpretation will lessen uncertainties in the art of warfare.
Estimates suggest the intelligence community uses anywhere from 1 percent to 5 percent of data collected; the result is a large percentage of unused data that may be of some use to inform and train ML algorithms.8 Properly managed AI, ML, and DL tools will discover the utility in both used and unused data, retraining and improving data models. The Navy and the intelligence community, through numerous projects and initiatives, are quickly discovering the potential of AI.
Composing Data Analytics
Without a conductor, an orchestra is just a group of disparate instruments and sounds. Similarly, all the different data analytic advancements need a conductor to bring them into harmony. The Department of the Navy Chief Information Officer (DoN CIO), acting as the “conductor” for the Navy, should work in unison with the Directors of National Intelligence and Naval Intelligence to ensure the development and viability of data analytic initiatives.
Multiple strategies published over the past few years urge a convergence of resources for improved data analytics:
- Intelligence Community Information Environment Data Strategy: Fosters sharing and reuse of validated algorithms, analytics, and related tradecraft.9
- Department of the Navy Strategy for Data and Analytics Optimization: Encourages community participation to maximize efficiencies in data.10
- AIM Initiative: Aims to “achieve superiority by adopting the best available commercial AI applications and combining them with [intelligence community]-unique algorithms and data holdings to augment the reasoning capabilities of our analysts.”11
Orchestration also will require a reliable, capable cloud architecture to consolidate all the tangential investments from the public and private sectors and DoD innovations. The DoN CIO can improve naval intelligence by working with the Department of Defense (DoD) CIO, who is responsible for the Joint Enterprise Defense Infrastructure cloud program, which aims to improve shared intelligence and data analytic efficiencies among the services. To support these efficiencies, the DoN CIO established a Data and Analytics Consortium and the DoD IT Portfolio Repository/DON Applications and Database Management System.12 Maintenance, funding, and community investment in these initiatives are needed to make this an enduring effort.
On 27 June 2018, then–Deputy Secretary of Defense Patrick Shanahan “directed the DoD CIO to establish the Joint Artificial Intelligence Center (JAIC) in order to enable teams across DoD to swiftly deliver new AI-enabled capabilities.”13 Establishment of the JAIC is one of many initiatives and organizational developments the Navy and intelligence community are engaged in to advance AI and data analytics. Others include:
- In September 2018, the Defense Advanced Research Projects Agency announced a multiyear investment of more than $2 billion in new and existing programs called “AI Next,” attempting to produce more explainable AI models while maintaining a high level of learning performance (prediction accuracy).14
- The Intelligence Advanced Research Projects Activity composed a draft solicitation for a Space-based Machine Automated Recognition Technique program to “lift the burden of human analysts” and is developing easy-to-implement tools for geospatial intelligence analysts to analyze, assess, and use big data.15
- The Nimitz Operational Intelligence Center’s Maritime Domain Awareness and Advanced Analytics Division is an established office to maximize information sharing and data analysis.16
- Naval Sea Systems Command established a Cyber Engineering and Digital Transformation Directorate to bring together its CIOs, system engineers, and program executive offices, while leveraging the DoN CIO and Deputy CIO offices, to align digital infrastructure and enterprise IT and improve efficiencies in cybersecurity, IT missions, and data analytics.17
- The Naval Postgraduate School Data Science and Analytics Group was established to make NPS the “thought leader, educational nexus, and primary research coordinator for data science and analytics in the DoD.”18
AI innovations in intelligence will differ from other DoD AI initiatives, such as adjusting fire of a weapon system based on target battle damage assessment or creating weaponized robots to handle rugged terrain. Instead, the intelligence community should seek a complementary cognitive function from AI. Analysts require AI to thoughtfully search for meaningful data, explore the value of discovered data, and provide a synopsis—and do it as quickly as possible.
AI should be a catalyst for the analytic process. For this process to mature, a mechanism must be in place for analysts’ feedback, so the AI algorithms can learn from current assessments to inform future analysis. The questions driving data discovery, value of data, and data analysis are identified at every operational command in the priority intelligence requirements (PIRs) and essential elements of information (EEIs). Maintenance of PIRs and EEIs will become even more critical as they will provide the boundaries for AI algorithms to exploit collected data. Satisfied PIRs/EEIs will provide the necessary feedback loop to train algorithms to continue looking for the “right” data.
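The PIR/EEI feedback loop described above can be roughly illustrated in code. In the sketch below, everything is hypothetical (the keyword-match scoring, the threshold, and the reports are invented placeholders, not an actual intelligence-community system): requirements bound the search, satisfied items are surfaced to the analyst, and the vocabulary of satisfied reports becomes a training signal for the next discovery cycle.

```python
# Hypothetical sketch of a PIR/EEI-driven discovery cycle with a
# feedback loop. Scoring, threshold, and data are illustrative only.

def score_relevance(item, keywords):
    """Crude relevance score: fraction of requirement keywords present."""
    text = item.lower()
    hits = sum(1 for kw in keywords if kw in text)
    return hits / len(keywords)

def discovery_cycle(collected, pir_keywords, threshold=0.5):
    """Return items that satisfy the PIR, plus feedback for the next cycle."""
    satisfied = [item for item in collected
                 if score_relevance(item, pir_keywords) >= threshold]
    # Analyst feedback: terms from satisfied reports become new training
    # signal, so the next cycle keeps looking for the "right" data.
    feedback = sorted({word for item in satisfied for word in item.lower().split()})
    return satisfied, feedback
```

A fielded system would replace the keyword match with trained models and the feedback set with analyst-labeled retraining data, but the loop structure (requirement in, satisfied requirement out, feedback back in) is the point.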
Effective employment of data analytics requires a concerted effort from the tactical to the strategic levels. Sailors will be required to develop and update intelligence requirements and questions that need to be answered. Policymakers will need to ensure shared tradecraft across the enterprise. Network architects will need to develop a backbone to support discovery and analysis of data. Data scientists need to be recruited or trained to understand employment of data analytics in intelligence. Somewhere in the middle (between the tactical edge and policymakers) a conduit is needed to blend requirements with the tradecraft.
Maritime Intelligence Operations Centers (MIOCs) will become critical nodes for delivering intelligence analysis, driven by AI strategies, to the fleet. Captain Dale Rielage suggests embedding individuals with technical expertise in information science at the centers as a way to empower a human-machine dream team.19 Data scientists at the MIOCs would be able to develop a deep understanding of fleet intelligence requirements and tap into the appropriate resources for exploiting data through cloud services. The MIOCs also could provide the quality control check on data analytic models and projections to support regional commanders against great power competitors.
Conflicts with nation-states whose capabilities are comparable to those of U.S. forces will be won through quick, decisive maneuver on the battlefield. To provide penetrating insight and decision advantage to overcome an equally matched adversary, naval intelligence requires both data analytic investments to improve data utilization and an expanded ability for cognitive analysis of intelligence information to improve decision making.
Investments in AI among the great powers will continue to increase for the foreseeable future. The Pentagon’s 2017 unclassified budget estimated $7.4 billion was spent across AI, big data, and cloud investments.20 China is expected to invest $7 billion in AI alone through 2030.21 Russia lags behind, with AI investments of an estimated $12.5 million, focused primarily on improvements to electronic warfare.22
Naval intelligence, through the DoN CIO, should look beyond organic AI developments. Commercial AI investments have grown from $300 million in 2011 to around $16.5 billion in 2019.23 The Navy may find that data analytic challenges private-sector companies have already overcome can be applied to improve data mining and models in naval intelligence.
Inverting the 80/20
Discovering efficient methods to mine, exploit, and present data in an informative, understandable way will allow intelligence analysts more time to process information, apply necessary cognitive reasoning, and formulate stronger intelligence assessments—inverting the 80/20 data discovery/data analysis paradigm. Ultimately, improved data analytics will shift intelligence assessments from “possible” and “probable” to “likely” and “will” and will give operational commanders the decision advantage to anticipate adversary actions, deter them, and maintain favorable regional balance of power in the maritime domain.
1. Isaac R. Porche III, Bradley Wilson, Erin-Elizabeth Johnson, Shane Tierney, and Evan Saltzman, “Data Flood: Helping the Navy Address the Rising Tide of Sensor Information” (RAND Corporation, 7 April 2014), www.rand.org/content/dam/rand/pubs/research_reports/RR300/RR315/RAND_RR315.pdf.
2. Bernard Marr, “What is the Difference Between Artificial Intelligence and Machine Learning?” Forbes, 6 December 2016, www.forbes.com/sites/bernardmarr/2016/12/06/what-is-the-difference-between-artificial-intelligence-and-machine-learning/#163608232742.
3. Marr, “What is the Difference Between Artificial Intelligence and Machine Learning?”
4. Arvin Hsu, “Deep Learning vs. Machine Learning for Business Outcomes: A Data Scientist’s Perspective,” insideBIGDATA, 27 October 2017, https://insidebigdata.com/2017/10/27/deep-learning-vs-machine-learning-business-outcomes-data-scientists-perspective/.
5. Arvin Hsu, “Deep Learning.”
6. ADM Scott H. Swift, USN, “A Fleet Must Be Able to Fight,” U.S. Naval Institute Proceedings 144, no. 5 (May 2018).
7. Cynthia M. Grabo, Anticipating Surprise: Analysis for Strategic Warning (Lanham, MD: University Press of America, 2004).
8. Michael Brett, George Duchak, Anup Ghosh, Kristin Sharp, “Artificial Intelligence for Cybersecurity: Technological and Ethical Implications,” panel, George Washington University Center for Cyber & Homeland Security 8 November 2017, https://cchs.gwu.edu/sites/g/files/zaxdzs2371/f/downloads/Fall%202017%20DT%20symposium%20compendium.pdf.
9. “Intelligence Community Information Environment (IC IE) Data Strategy,” www.dni.gov/files/documents/CIO/Data-Strategy_2017-2021_Final.pdf.
10. Department of the Navy Chief Information Officer, “Department of the Navy Strategy for Data and Analytics Optimization,” 15 September 2017.
11. Director of National Intelligence, “The AIM Initiative – A Strategy for Augmenting Intelligence Using Machines,” 16 January 2019.
12. James E. McPherson, “Designation of the Department of the Navy Deputy Chief Information Officer (Navy) and the Department of the Navy Deputy Chief Information Officer (Marine Corps),” 30 April 2020.
13. Aaron Mehta, “DoD Stands up its Artificial Intelligence Hub,” C4ISRNet, 29 June 2018, www.c4isrnet.com/it-networks/2018/06/29/dod-stands-up-its-artificial-intelligence-hub/.
14. Defense Advanced Research Project Agency, “AI Next Campaign,” 19 July 2020, www.darpa.mil/work-with-us/ai-next-campaign.
15. Jack Corrigan, “IARPA is Investing in AI That Constantly Analyzes Worldwide Satellite Images,” NextGov, 16 April 2019, www.nextgov.com/emerging-tech/2019/04/iarpa-investing-ai-constantly-analyzes-worldwide-satellite-images/156335/.
16. Office of Naval Intelligence, “Nimitz Operational Intelligence Center,” 20 July 2020, www.oni.navy.mil/This-is-ONI/Who-We-Are/Nimitz/.
17. Megan Eckstein, “New Cyber Office Will Unify NAVSEA’s Digital Efforts,” USNI News, 27 May 2020, https://news.usni.org/2020/05/27/new-cyber-office-will-unify-navseas-digital-efforts.
18. Matthew Schehl, “NPS Launches Interdisciplinary Data Science and Analytics Group,” Naval Postgraduate School, 2 July 2018, https://my.nps.edu/-/nps-launches-interdisciplinary-data-science-and-analytics-group.
19. CAPT Dale Rielage, USN, “Building Human-Machine Dream Teams,” U.S. Naval Institute Proceedings 143, no. 5 (May 2017), www.usni.org/magazines/proceedings/2017-05/build-human-machine-dream-teams.
20. Julian E. Barnes and Josh Chin, “The New Arms Race in AI,” Wall Street Journal, 2 March 2018, www.wsj.com/articles/the-new-arms-race-in-ai-1520009261.
21. Justin Lynch, “Why Project Maven Is a ‘Moral Hazard’ for Google,” C4ISRNet, 26 June 2018, www.c4isrnet.com/it-networks/2018/06/26/why-googles-project-maven-pullout-is-a-moral-hazard/.
22. Samuel Bendett, “In AI, Russia Is Hustling to Catch Up,” Defense One, 4 April 2018, www.defenseone.com/ideas/2018/04/russia-races-forward-ai-development/147178/.
23. Statista, “Artificial intelligence (AI) funding investment in the United States from 2011 to 2019,” statista.com, www.statista.com/statistics/672712/ai-funding-united-states/.