Cryptology at the Crossroads
(See R. Bebber, pp. 30–35, March 2015 Proceedings)
Commander Charles Turner, U.S. Navy (Retired)—Lieutenant Bebber hits on two key areas where the Navy can regain its dominance of signals intelligence: 1) improving the quality of the technical skills and refined judgment needed in the tradecraft, and 2) developing a comprehensive yet continually evolving long-term human-capital development plan that balances technical skills and operational experience.
Successful cryptologic/information-warfare officers and enlisted personnel need to have a deep technical knowledge of the basics of electronics, the electromagnetic spectrum, and troubleshooting. Their training pipeline needs to be a demanding one that builds a solid foundation in the fundamentals before developing exotic skills. We need to return to a culture where it is acceptable to have high attrition rates from an exacting training pipeline, since not everyone will be cut out for a highly technical career path. The Navy needs to lose its fascination (developed over the last 20 years) with cost savings gained from computer-based training and fewer hours in the classroom. Such a shortsighted strategy has produced an unacceptable number of technicians lacking a solid grasp of basic electronics and elementary troubleshooting.
At-sea time is a must for anyone in a sea service. Our sailors need to experience multiple iterations of the pre-deployment workup-and-deployment cycle under way on ships. Tactical-level afloat experience provides the needed opportunities to apply one’s trade and to develop critical thinking and judgment skills, and nothing develops them like working through problems under pressure at sea when you are the sole “expert” to whom everyone is turning for sage advice. Cryptologists need to articulate their trade to, and understand the requirements of, the surface warriors, aviators, and submariners who are going to execute missions ranging from humanitarian and peacekeeping operations to trading hostile fire with the enemy. Working on the staff of a carrier strike group, destroyer squadron, or amphibious squadron following a ship’s company tour provides invaluable operational-level planning and critical-thinking skills before heading off to strategic-level assignments.
Cryptologists can gain and develop higher-level skills through advanced education, problem-solving, and collaboration efforts with centers of excellence in civilian universities and in working with corporate partners, such as Microsoft, Google, Apple, and others. Advanced degrees are key, but collaboration with sister-service and civilian peers can help us develop better solutions by seeing our problems from different perspectives. Those relationships also provide a long-term benefit by nurturing a personal network outside the Navy or the military, which can help defuse and solve future problems.
Our myopic focus on the technical problems immediately before us costs us dearly if we are only fighting the latest crisis du jour in the headlines. The ultimate goal of the Human Capital Strategy should be the existence of a cadre that is developing and refining the next generations of tools and practices needed to crack our adversaries’ signals and command-and-control systems. We need to think long-term and big-picture to execute more seamlessly in the present.
Getting and Staying Connected
(See E. Lundquist and L. Osborn, pp. 48–53, February 2015; and J. G. Foggo III, p. 8, March 2015 Proceedings)
John R. Potter, Principal Strategic Development Officer, Centre for Maritime Research and Experimentation (CMRE)—Captains Lundquist and Osborn’s article is an interesting read, underlining the simple fact that the key to any kind of multi-platform operation is interoperability, which can only be founded on standards of some kind, most particularly at interfaces. As Western defense budgets come under increasing pressure, and the debilitating aspects of asymmetric warfare sap resources at an alarming rate compared to the effect that traditional stand-alone approaches are able to deliver, we have to find ways to get substantially more for less out of our equipment and resources. To do this, I believe that the investment to create common architectures and communication protocols will pay huge dividends, making a scenario like the one in the article a possibility rather than a wishful dream, as it is now. Not only must we find ways to effectively link heterogeneous assets together, operating in different modalities and media, but we must also do so across the 28 nations of NATO, with their diversity of suppliers.
Standardization and systems-architecture design are at the core of CMRE interests, where we have compelling value to offer as a not-for-profit (no industrial competition or agenda) multinational “honest broker.” The work is difficult and often frustrating, requiring a great deal of patience and perseverance, but establishing common architectures and standards that empower interoperability is ultimately indispensable, allowing disparate forces to collaborate and to adaptively plan and control just the kind of scenario that is painted in the article. To shy away from taking on this challenge condemns us to irrecoverable and enormous inefficiency costs in the future, when collaboration and coordination will be essential features of efficient interdiction.
The CMRE is currently working on several standardization tasks, including the control and behavior of autonomous assets, and the first digital underwater-communication standard (JANUS), which will make it possible to send digital information between heterogeneous subsurface assets, even between human and robotic platforms. JANUS also provides a common language platform to bootstrap the creation of heterogeneous underwater networks. We look forward to becoming involved in many more such projects in the future.
The Trouble with High-Tech
(See D. Majumdar, pp. 42–47, February 2015 Proceedings)
Nicholas E. Efstathiou—Mr. Majumdar presents interesting questions in his article, rightly raising the moral and ethical issues regarding the use of unmanned autonomous vehicles. Civilian watchdog organizations such as Human Rights Watch are concerned about the rise of independent artificial intelligence (AI) that has the capacity to react tactically to ongoing situations and to make critical decisions in real time regarding mission completion. These organizations argue that such AI could lead to a loss of operator control over autonomous vehicles, thus placing humanity at the mercy of machines in a science fiction–like scenario where humanity could face extinction from weapons that can no longer be stopped.
This is not all that fantastical, as anyone who has worked with even the most basic of systems can attest. Issues arise in programs, whether the machine in question is a General Atomics drone or a child’s remote-control monster-truck toy, and there is always the possibility that something is going to go wrong. But while the monster truck’s failure is benign, the drone may have the ability to vaporize a suburban neighborhood.
While Human Rights Watch argues that autonomous vehicles with AI should never be created, this is simply not practical. Someone, somewhere, is developing such a program. The science has come too far for it not to proceed, much like the steady evolution of the atomic bomb into the nuclear missile. Progress marches on. The United States, while it does not necessarily need to be at the forefront of creating science-fiction doomsday robots, should continue to develop the programs as a safety measure, as a method of understanding such systems and the ways in which they might be stopped—either by hacking into their internal on-board systems or by developing the countermeasures necessary to take them out of the sky.
And while the two sides of the issue seem to be firmly entrenched with one calling for the ban of all autonomous vehicles and the other seeking the expansion of them, there is a middle ground that should not be ignored. This would be the area of partial autonomy. Trained operators could oversee missions with the autonomous AI craft being taught to alert the operator when something outside of the mission parameters appears. This would allow operators to monitor multiple vehicles while still affording the command an increase in operationally available vehicles.
While this would not result in the large personnel reduction mentioned by Mr. Majumdar as a benefit of the autonomous program, it would still be a slight force reduction. Arguments that it would be too difficult to train operators to monitor multiple vehicles and missions at once are specious. The United States has a generation of computer-savvy young people who are intimately familiar with a wide array of technological devices and innovations. This generation, for whom technology holds no fears, is far more capable than many give it credit for.
Redesign the Procurement Process
(See N. Friedman, pp. 90–91, February 2015 Proceedings)
Marty Bollinger, Visiting Executive Lecturer, Darden School of Business—Dr. Friedman should be praised for an excellent synopsis of the government side of the industrial base in his column. Many tend to forget that our current defense industrial-base model, formed around (theoretical) competition across large corporations largely focused on the U.S. government, is a recent American development. This model did not emerge until the mid-1990s and is generally unknown in prior U.S. history: In 1988 the average large U.S. defense contractor generated only 20 percent of total corporate sales from the federal government. Moreover, it is a model that few other countries have chosen to emulate, preferring instead to maintain some form of arsenal model, whether it be government-operated, merely government-owned, or a regulated national monopoly.
However, there are two small errors that should be corrected in the interest of accuracy. First, it was not Grumman that failed to deliver the A-12. The Grumman/Northrop team didn’t even submit a final bid on the program—perhaps they knew just how hard it was going to be. It was the McDonnell Douglas/General Dynamics team that won the award but then, along with the U.S. Navy, failed to deliver. Second, it is untrue that the Naval Aircraft Factory did not build airplanes. In fact, it produced more than 300 manned aircraft as late as World War II, converted several dozen others, and continued to produce very small numbers of experimental unmanned aircraft through the early 1950s.
‘Distributed Lethality’
(See T. Rowden, P. Gumataotao, and P. Fanta, pp. 18–23, January 2015; P. E. Pournelle, p. 8, February 2015; and W. P. Hughes Jr., pp. 8–9, March 2015 Proceedings)
Captain James P. Adams, U.S. Navy (Retired)—The blueprint that surface Navy leadership provided in this article gives the reader excellent insight into how we want to fight the next war. Unfortunately, the “distributed” part of the equation from a command, control, communications, computers, combat-systems, intelligence, surveillance, and reconnaissance (C5ISR) perspective needs more development and Fleet warfighting training. Examples I saw firsthand on the Commander, 2nd Fleet and Fleet Forces Command N6 staffs are:
• What is Navy leadership doing to train the Fleet and shore-support infrastructure on degraded and lost Internet (NIPRNET and SIPRNET) capabilities caused by coordinated kinetic and/or cyber attacks on shore and afloat command, control, communications, computers, and information-technology networks? An adjunct question: since the Navy–Marine Corps Intranet (NMCI) was introduced, how much have we exercised the shore IT system, which is essential in war? How often have we divorced the Navy network from the civilian World Wide Web and operated it independently, an original selling point made by the eventual NMCI vendor winner to Fleet Echelon II and III command staffs? What are we doing to train on the next generation of afloat and ashore networks as we do with our engineering, navigation, and combat systems? Will the logistics systems that support our crews and ships work with no network and only rudimentary communication systems? Who is in charge of the coordinated afloat and ashore training—the Information-Dominance Corps or the type commander (TYCOM)? You cannot simulate this training adequately in trainers ashore.
• How confident is Navy leadership that the sailors on board our ships can repair battle damage to the afloat networks or satellite-communications (SATCOM) systems without reach-back ashore and parts on board during crisis periods?
• How will “distributed” work in a long-term electronic-silence situation? We used to train and operate in such an environment for weeks at a time. My guess is that the TYCOMs and Fleet Certification Teams do not have the time to adequately stress this area. We should not assume that any SATCOM system or ashore network will be operable during the next conflict.
• How proficient is the Fleet in the use of backup afloat and ashore communication paths? How capable has the acquisition community been in delivering new ship classes without significant C5I interoperability issues, or without the material and training for ship crews to pass electromagnetic-interference (EMI) certification? For some ship classes, the time for a single ship to pass EMI certification could be years. A good portion of the EMI certification is the ability of the ship to fully use high frequency.
There are other examples that point to the significant relationship between surface afloat systems and shore IT systems. To ensure we have the best warfighting capability, Navy leadership needs to train its focus on this area. The seams between C5ISR systems and other ship and ashore systems require a hard look.
When Quality Slips
(See N. Pettigrew, pp. 58–64, January 2015; D. Bolgiano, p. 8, February 2015; and R. M. Rosenthal and T. Rishel, p. 86, March 2015 Proceedings)
Captain Matthew Fenton, U.S. Navy (Retired)—Mr. Pettigrew’s hard-hitting article summarizes the Navy’s quality-assurance problems with building and repair contracts very well.
In my experience managing contracts for both the Navy and Maritime Administration, a key problem is the way that government agencies handle the issues that are identified by conscientious inspectors. If work is found to be substandard, inspectors quickly become aware that bringing an issue up and insisting on its resolution can be a career-jeopardizing move.
Instead of viewing the government as a valued customer, the contractors usually treat it as a victim. Contractors are quick to complain about inspectors, and the government all too often caves in to these complaints to the extent of removing inspectors from a project or blackballing them altogether instead of properly resolving the issue. Many government contracting officers and their technical representatives have close relationships with contractors, and feeding work to their cronies is effectively a job perk.
The government does not do enough to insist on quality work and does a poor job of training inspectors. Too often, the Navy absorbs cost overruns while the contractors laugh all the way to the bank.
Relying on contract personnel to maintain frontline combatants is dangerous. One of the reasons for the Navy’s readiness and success is its cadre of well-trained sailors, particularly in engineering rates. Their knowledge and ability will ensure survivability in combat. While outsourcing is appealing from a financial standpoint, it provides a window of vulnerability for our enemies to exploit.
Perry Miller—Mr. Pettigrew makes a strong argument that the lack of quality oversight of shipbuilding contractors leads to cost overruns and construction problems. However, I would like to suggest an even more insidious contributor: the Jones Act (P.L. 66-261, aka the Merchant Marine Act of 1920).
This 95-year-old piece of antiquated protectionist legislation, based on long-discredited economic theory, completely forbids foreign shipbuilders from competing with American shipyards. While a convincing case might be made for taxes and tariffs on foreign competitors, this outright ban on competition will only lead to—has, in fact, led to—the slow death of the very industry it is meant to protect. Why? Because what that industry is really being protected from is the need to improve, innovate, and economize. In the words of former chairman of the Federal Reserve, Alan Greenspan, “If the protectionist route is followed, newer, more efficient industries will have less scope to expand, and overall output and economic welfare will suffer.” (David B. Sicilia and Jeffrey L. Cruikshank, The Greenspan Effect, McGraw-Hill, 1999). Because of the Jones Act, American shipbuilders now account for less than one half of one percent of the global market (see www.globalsecurity.org/military/world/shipbuilding.htm).
Mr. Pettigrew’s recommendation of more and better quality oversight might help, but the range of problems itemized in his article—from plumbing to “construction inefficiencies”—suggests to me that the required scope of expertise, and the sheer number of experts needed for such oversight, may be impractical and not cost-effective. If, on the other hand, American shipbuilding were a competitive industry, the contractor would have to provide its own effective oversight in order to remain competitive and stay in business. I submit that the Jones Act has destroyed the American shipbuilding industry—and I fear that it may someday bankrupt the U.S. Navy.