In 2015, Chinese hackers scooped up millions of personnel files and security clearance records from Office of Personnel Management (OPM) databases. OPM’s lax security practices, specifically its failure to implement industry-standard security measures such as two-factor authentication, gave hackers working for Chinese security services easy access to a treasure trove of sensitive data.
This is not an isolated incident. Over the past decade, headlines announcing major data leaks have become common. Small businesses, Fortune 500 firms, and state and local governments have been attacked. The federal government, including the military, is not exempt; cyber actors continually probe the entire national security information architecture seeking access. These attacks are often successful: A glance at the Chinese J-31 fighter suggests similarities to the F-35, and China’s littoral combat ship (LCS) looks almost like a carbon copy of the U.S. Navy’s Independence-variant LCS.1, 2 Information warfare will play an unprecedented role in the next conflict.
Preventing the compromise of technical information is a national security issue, because allowing hostile actors access to this information weakens the tools used to secure the nation. This is not just a strategic issue. At a tactical level, safeguarding an operational unit’s planning documents and intelligence products is crucial, because an adversary with access to friendly information systems will be aware of the unit’s intended operations. For example, an enemy who knows that friendly forces have created topographic studies for a specific axis of advance will prepare to resist along that route, eliminating the element of surprise.
The Department of Defense (DoD) is engaged in a war for information, and success relies on the security of its networks and equipment systems. Designing and fielding information systems capable of preventing compromise, and weapon systems that cannot be hacked, is building a defense in the information environment: a hardened position from which to deny the adversary an advantage.
Yet, DoD, specifically its acquisitions community, has lagged in implementing cybersecurity controls. A 2018 Government Accountability Office (GAO) report, tellingly titled Weapon Systems Cybersecurity: DOD Just Beginning to Grapple with Scale of Vulnerabilities, revealed that between 2012 and 2017, DoD testers routinely found major cybersecurity flaws in weapon systems in development. During these tests, red-team personnel equipped with “nascent to moderate tools and techniques” were able to easily “disrupt or access and take control” of weapon systems because of insufficient or improperly configured security controls. Most damning was the report’s revelation that many cybersecurity flaws had been identified in previous system testing, but the solutions had not been implemented.3
The rate of technological change has long outpaced the military acquisitions community, which struggles to increase the tempo with which it designs and fields systems. As such, the dominant theme in acquisitions today is speed—rapid development, production, and fielding to ensure up-to-date technology gets into the warfighter’s hands as quickly as possible. That emphasis on speed likely comes at the cost of thoroughness, which was probably a contributing factor in the cybersecurity issues GAO identified.
How can DoD square the demand for speedy equipment development with the need for secure equipment systems, while minimizing cost to the services and increasing the capability provided to the fleet? The following case study outlining the best practices and lessons learned during a recent cybersecurity test event should help answer this question and provide material for consideration in future cybersecurity tests.
Testing Cybersecurity of a New Imagery System
The Distributed Common Ground/Surface System–Marine Corps Geospatial Intelligence (DCGS-MC GeoInt) system is the newest addition to the Marine Corps intelligence community’s equipment architecture. The system modernizes the intelligence family of systems by providing greater analytic power while reducing the size and weight footprint of the devices in that family. As a geospatial intelligence system, it most likely will be a priority target for enemy cyber actors seeking indicators of friendly forces’ intended operations.
Because of the urgency of fielding the capability, the DCGS-MC GeoInt Program Office had less than 30 months from receiving the program designation to the system becoming fully operational in the Fleet Marine Force. The system’s sensitivity led to its designation as Acquisition Category IV (T), requiring an operational test before fielding. I was assigned as the operational test project officer, responsible for the test.
Given the speed with which DCGS-MC GeoInt had to be fielded, and the unique nature of an information technology asset, my team determined that the normal process of operational testing did not apply. There was no need, for example, to determine the system’s reliability, availability, and maintainability, as would usually be determined in a full operational test event, since the operational unit can easily replace most of the system’s components at minimal cost.
It was crucial, however, to determine the extent to which the system could perform its assigned missions in a cyber-contested environment. So, we decided to forgo evaluating the system’s suitability and effectiveness, focusing instead on its survivability. DCGS-MC GeoInt was the first Marine Corps equipment system to undergo a cybersecurity-only operational test. Program offices are required to test cybersecurity controls as part of the system-design process, and DCGS-MC GeoInt’s cybersecurity controls were rigorously checked before operational testing began. However, as GAO noted, previous identification of security issues does not necessarily mean they have been fixed. Cybersecurity-only operational testing provides value to the warfighter because an independent entity assesses the state of security before the device is delivered to the fleet.
The cybersecurity-only operational testing process contains three major events: a cooperative vulnerability penetration analysis, a cybersecurity tabletop exercise, and an adversarial assessment. Each event has a different purpose, and they are meant to build on each other, so that later events are informed by data collected throughout the process.
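To make the flow of data between events concrete, the following minimal sketch (in Python, with entirely hypothetical data structures, severity tiers, and hour figures; it is not the program’s actual tooling) shows how CVPA findings might feed the tabletop estimates, which in turn prioritize the red team’s efforts:

    # Hypothetical sketch of the three-event data flow; names and figures are invented.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        component: str   # e.g., "remote management interface"
        issue: str       # what the testers observed
        severity: str    # "major", "moderate", or "low"

    def cvpa(system: str) -> list[Finding]:
        """Cooperative event: catalog configuration weaknesses with developer support."""
        return [Finding("remote management interface", "default credentials enabled", "major")]

    def tabletop(findings: list[Finding]) -> dict[str, float]:
        """Estimate adversary-hours needed to exploit each CVPA finding."""
        notional_hours = {"major": 8.0, "moderate": 40.0, "low": 160.0}
        return {f.issue: notional_hours[f.severity] for f in findings}

    def adversarial_assessment(estimates: dict[str, float]) -> None:
        """Red team works the quickest estimated paths first during the live event."""
        for issue, hours in sorted(estimates.items(), key=lambda kv: kv[1]):
            print(f"attempt '{issue}' first (estimated {hours:.0f} adversary-hours)")

    adversarial_assessment(tabletop(cvpa("DCGS-MC GeoInt")))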
The cooperative vulnerability penetration analysis (CVPA) examines the system to determine whether major vulnerabilities exist because of improper system configuration.4 For DCGS-MC GeoInt, because the equipment systems were undergoing reliability testing at the developer’s facility in Logan, Utah, and the operational test team was based in Quantico, Virginia, we decided to conduct the CVPA virtually, through a logical connection between the systems in Utah and testers in northern Virginia. This saved time and money by not requiring the devices to be shipped and reconfigured in a new location. The CVPA results were documented, analyzed, and then released to the program office.
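As one illustration of the kind of check a virtual CVPA makes possible, the sketch below probes which well-known TCP service ports a networked system exposes. The address and port list are placeholders and this is a notional example only; probing of this kind is done only against systems the testers are authorized to examine.

    # Notional remote check: which well-known TCP ports does the target expose?
    # The address below is a documentation placeholder, not a real system.
    import socket

    def exposed_ports(host: str, ports: list[int], timeout: float = 2.0) -> list[int]:
        """Return the subset of `ports` that accept a TCP connection on `host`."""
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                    open_ports.append(port)
        return open_ports

    # Flag services that should not be reachable from outside the enclave.
    print(exposed_ports("192.0.2.10", [21, 22, 23, 80, 443, 3389]))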
The cybersecurity tabletop exercise brought together system operators, program office engineering experts, the operational test team, and the red team members who would conduct the adversarial assessment. Using the CVPA results in combination with the system diagram, the exercise generated estimates of the time an adversary would need to affect the system. The event took a week, and its outputs were fed into the analytic model used in the adversarial assessment.
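One simple way such estimates could be combined, offered here purely as an illustration, is a three-point (PERT-style) calculation for each attack path, with participants supplying optimistic, most-likely, and pessimistic figures. The paths and hours below are invented:

    # Illustrative three-point estimate of time-to-affect per attack path.
    def pert_estimate(optimistic: float, likely: float, pessimistic: float) -> float:
        """Weighted mean used in PERT scheduling: (O + 4M + P) / 6."""
        return (optimistic + 4 * likely + pessimistic) / 6

    attack_paths = {
        # path: (optimistic, most likely, pessimistic) adversary-hours
        "phishing -> operator workstation": (4, 16, 80),
        "exposed service -> analysis server": (8, 40, 160),
    }

    for path, (o, m, p) in attack_paths.items():
        print(f"{path}: ~{pert_estimate(o, m, p):.0f} hours to affect the system")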
The adversarial assessment is the capstone event of the cybersecurity evaluation process. The red team attempted to access and affect the system while Marine operators were using it. The red team’s efforts were documented so the test team could determine how long it would take a peer adversary to breach the system. The data collected were analyzed to inform the decision to field the system, and, as of this writing, the system is being shipped to the fleet.
The DCGS-MC GeoInt adversarial assessment differed from standard assessments in two significant ways. First, because intelligence systems operate on the Secret Internet Protocol Router Network (SIPRNet), the test team, with program office support, determined the adversarial assessment should occur on that network. This may seem like common sense, but standard practice is for operational tests to occur on a separate network, to prevent red team cyberattacks from affecting the rest of the network. While other systems have been tested while connected to SIPRNet, intelligence systems are different: their proper functioning requires interaction with databases and servers that reside only on SIPRNet, making their connection to the network a vital functional component rather than an ancillary capability.
The second way this test was different was in the system’s complexity. The DCGS-MC GeoInt system is a computer designed for data processing and analysis, so it is far more complex than the information technology infrastructure of a tactical vehicle, for example, or any other items on which the Marine Corps had performed this type of cybersecurity operational testing.
To perform the evaluation, the test team received personnel support from the Marine Corps Intelligence Activity (MCIA) and Marine Corps Forces Cyberspace Command (MarForCyber). These commands provided two certified imagery and geospatial intelligence analysts, who built products for MCIA while the MarForCyber red team hackers attempted to disrupt their efforts. Whereas operators participating in an operational test normally work through a constructed scenario, and the products of their efforts are discarded at the event’s end, our arrangement in this test allowed the operators to build real-world products, saving more than 160 man-hours of work by two trained operators.
The DCGS-MC GeoInt adversarial assessment broke new ground by being the first operational test of a Marine Corps intelligence system on a live classified network, the first operational test of a system used by operators creating real-world intelligence products, and the most complex Marine Corps system cybersecurity testing to date.
Future Testing Considerations
The next war will be a war of systems: Network and device security will be as vital as physical terrain was in the last conventional war. Government systems are not prepared to succeed in this environment.
The DoD acquisitions community plays a substantive role in this fight. By identifying and fixing cybersecurity issues while systems are in development, the acquisitions community can prevent vulnerabilities from becoming a risk to the operational user in the fleet.
Because the information on intelligence systems is so sensitive, every information technology program within the intelligence community must undergo a full cybersecurity evaluation that includes an operational test. As the DCGS-MC GeoInt test demonstrated, with an approach that recognizes the nuances of the material solution and a willingness to flexibly test only the relevant characteristics, an operational test authority can add value to the program office and minimize the evaluation’s financial and time costs, all while improving the quality and security of the system delivered to the warfighter.
The current paradigm in military acquisition calls for operational testing to inform the decision to field a system to the fleet. But for cybersecurity-only tests, this requirement could be relaxed to allow for continual testing before and after the system has been shipped. Since the CVPA should validate the device’s security configuration, assuming all other programmatic requirements are met, the system could be sent to the fleet and then evaluated once in use by fleet operators.
Cybersecurity operational testing differs from standard testing in that the vulnerabilities identified are rarely the result of design characteristics that require physical alterations to the system. Whereas an engineering flaw requires garrison- or depot-level maintenance before the equipment can be used, vulnerabilities discovered in a cybersecurity evaluation can be patched after the system is fielded through regular software updates. Embracing this approach would save time and speed the development of equipment systems.
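A post-fielding remediation check of this kind could be as simple as comparing the software versions a fielded system reports against the minimum versions known to fix each finding; the sketch below illustrates the idea with hypothetical package names and version numbers.

    # Hypothetical patch-baseline check; package names and versions are invented.
    MINIMUM_SAFE = {"geoint-analysis": (3, 2, 1), "os-baseline": (10, 0, 5)}

    def parse_version(text: str) -> tuple[int, ...]:
        """Turn '3.1.9' into (3, 1, 9) so versions compare numerically."""
        return tuple(int(part) for part in text.split("."))

    def unpatched(installed: dict[str, str]) -> list[str]:
        """Return packages still below the remediation baseline."""
        return [name for name, version in installed.items()
                if name in MINIMUM_SAFE and parse_version(version) < MINIMUM_SAFE[name]]

    # A fielded system reports its software inventory over the network.
    print(unpatched({"geoint-analysis": "3.1.9", "os-baseline": "10.0.5"}))
    # -> ['geoint-analysis']  (still vulnerable; push the update remotely)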
The DCGS-MC GeoInt adversarial assessment and data-evaluation process, already expedited because of the system fielding schedule, took more than two months to complete. Holding the operational test after a system has been shipped to the fleet would have major advantages for the warfighter and the taxpayer.
As demonstrated during the DCGS-MC GeoInt CVPA, as long as the systems are connected to a network, they can be probed remotely by red team members located in a different facility. A fleet system in Japan could feasibly be evaluated by network-connected operators in Virginia, and vulnerabilities discovered could be fixed remotely through the software patching process. This approach would not apply to every system—some are not attached to a network—but for those that could be evaluated remotely, this approach would speed the acquisition process by cutting two months off the system development schedule.
Operational tests are required to be held in an “operationally representative” environment, which usually means during a large exercise or a discrete test event. Conducting the test on an operating force unit’s systems, assuming appropriate deconfliction to prevent impact on real-world operations, creates a more accurate event (a literal operational environment vice an operationally representative one), and one in which operators’ efforts are spent on real intelligence rather than on products that will be discarded when the event ends. This improves the value of the test while minimizing the impact on the supporting operating force units and the costs to the program office. The DCGS-MC GeoInt adversarial assessment proved this can occur.
Performing cybersecurity evaluations after the system has been fielded allows for repeat testing. The GAO report noted that major weapon systems contained vulnerabilities that had been identified during testing yet went unfixed. If a major vulnerability is identified during an operational test, the operational test authority could convene a second test once the system is fielded to validate that the vulnerabilities have been addressed.
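The retest comparison itself is simple set logic, as the hypothetical sketch below shows; findings that appear in both rounds are exactly the unfixed vulnerabilities the GAO report warned about:

    # Hypothetical finding identifiers from two rounds of testing.
    first_test = {"VULN-001: default credentials", "VULN-002: unencrypted service", "VULN-003: stale patches"}
    retest = {"VULN-003: stale patches", "VULN-004: new misconfiguration"}

    still_open = first_test & retest    # identified before, still present
    fixed = first_test - retest         # validated as remediated
    new_findings = retest - first_test  # introduced or newly discovered

    print(f"fixed: {sorted(fixed)}")
    print(f"still open (the GAO failure mode): {sorted(still_open)}")
    print(f"new findings: {sorted(new_findings)}")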
Holding the test event in an operational environment also incentivizes the program offices by giving them a stake in getting the system online in the fleet. Usually, after a system is fielded to the fleet, it is the responsibility of individual users to configure the system so it complies with the authority to operate, the document that outlines how it will operate on the network. For the Marine Corps, this means individuals at four separate nodes (the three Marine expeditionary forces and MCIA) all working independently to get the system operating, in addition to their regular duties. The DCGS-MC GeoInt adversarial assessment was held on the live SIPRNet, so the program office had to coordinate getting the systems online and functional before the test event started, instead of leaving it up to fleet operators after the system was fielded.
Standardizing this requirement would alleviate the burden placed on the fleet. Whenever a new information system is fielded, a streamlined process would allow the most knowledgeable entity, the program office, to coordinate on behalf of the operational customers.
1. Siobhan Gorman, August Cole, and Yochi Dreazen, “Computer Spies Breach Fighter-Jet Project,” Wall Street Journal, 21 April 2009.
2. James C. Bussert, “Coastal Catamarans Serve Chinese Littoral Needs,” Signal, 1 June 2015.
3. Government Accountability Office, Weapon Systems Cybersecurity: DOD Just Beginning to Grapple with Scale of Vulnerabilities, GAO-19-128 (Washington, DC: October 2018).
4. Director, Operational Test and Evaluation, “Cybersecurity OT&E—Guidance,” Washington, DC, undated.