Aesop, who used animals to illustrate human foibles, told of the eagle that died from an arrow, the shaft of which had been feathered with one of its own plumes, to make the point that “We often give our enemies the means of our own destruction.” Is this what the Navy has done with its widespread and growing use of computers in its combat systems?
During a Mideast conflict, units of the Sixth Fleet are steaming close to the Lebanon coast. An EA-6B is launched from the USS John F. Kennedy (CV-67) for a routine reconnaissance of the area. The aircraft observes a foreign power’s destroyer firing on a friendly Middle East country’s gunboat. No radio communications are intercepted from the friendly vessel and no other ships are in the area; the gunboat disintegrates. The EA-6B immediately transmits a report of the incident to Washington on a portion of the Worldwide Military Command and Control Systems (WWMCCS) net. Despite repeated transmission attempts, all that Washington receives is a garbled message. Other means must be used to transmit the message, and there is considerable delay in Washington’s receipt of this sensitive information.
A submarine belonging to an Indian Ocean nation is patrolling submerged off the coast of Oman. A U. S. task force, consisting of one Navy tactical data system (NTDS) cruiser and three Oliver Hazard Perry-class guided missile frigates, is steaming in the area. One of the frigates has had occasional sonar contact with the submarine. The submarine sinks a tanker flying the flag of a Mideast country with which the submarine’s country is engaged in undeclared conflict. This sinking occurs near the mouth of the Straits of Hormuz. The U. S. ships sound general quarters, and the officer-in-tactical command (OTC) in the cruiser attempts to organize a search attack unit (SAU) to localize the contact. At this point, the OTC loses Link 14 capability with the SAU, and NTDS will not accept any antisubmarine warfare (ASW) inputs. This condition exists for the next 24 hours, and the submarine escapes. Being the only nation with warships in the area, the United States is blamed for the sinking by many countries.
In a time of world tension, a U. S. battle group, led by the USS Ticonderoga (CG-47), is patrolling several hundred miles north of Japan. The situation becomes critical when one of the group’s escorts is attacked by aircraft from an Asian country. The escort receives considerable damage and extensive casualties. Although able to continue steaming, she has lost all fighting ability and is ordered to return to port. After the attack, permission is granted to the battle group commander to take defensive measures within 25 nautical miles of any of his units. The battle group commander also is directed to stay beyond 50 miles of any landmass. A wave of incoming cruise missiles is then detected by the Ticonderoga. The battle group commander orders weapons released. Seconds later, U. S. missiles are launched to intercept the threat. As the battle group’s SM-2 missiles approach their mid-course guidance points, they self-destruct. Some of the enemy missiles are destroyed by point defense systems, but most find their targets.
Are these scenarios possible? How could such a failure of our combat systems occur? There is a weakness in our combat system structure which could transform these scenarios into chilling reality.
Over the last half century, there has been an increasing use of electronics in our combat systems. The speed of electronic systems and the work they can accomplish in an era of reduced manning make their effective use essential to the modern Navy.
Another technological trend, one which is an outgrowth of electronics, is the increasing use of computers in our combat systems. Whether it be radar, communications satellite, or gunfire control system, the computer governs the electronics. The computer also partially interprets the voluminous data the at-sea commander must evaluate. Several of our combat systems need no human intervention. They can proceed from initial detection through weapons launch in a hands-off mode. As more and more combat decisions are made by computers (either explicitly, or implicitly through the information the computer has been programmed to present), it is essential that the strong and weak points of the computer be understood.
What computers do well is respond to instructions, instructions in the very explicit form of a computer program. Computers make no value judgments and have no loyalties. They do precisely what they are told and no more. This presents one of the major problems we must solve: ensuring that computers do exactly what we want them to do.
The computer not only conveys information, but it also stores and processes it. The security of this information is critical.
There are means at hand, however, for saboteurs to penetrate this country’s military computer systems. A computer’s software control center is its operating system, the software program which directs all operations performed by the computer. If an espionage agent should gain access to the classified information stored in a computer system, he could retrieve it for clandestine purposes. He also could alter the information stored in the computer so as to significantly change its operating characteristics.

There are two well-known and effective subversion techniques. The first is known as the “trap door” insertion. A trap door, as the name implies, is a software device which causes information to be inserted or extracted from a computer when the latch on the door is activated. The trap door is a small program segment (a subroutine) that is inserted into a computer’s operating system. The trap door circumvents the computer’s security controls. It is activated by a code, a word, or sequence of characters that is input to the computer. Once activated, the trap door subroutine alters the machine state of the computer. This alteration gives the penetrator complete access to the computer and allows him to perform desired clandestine functions. The activation mechanism is a software code known only to the penetrator.1

The second technique is known as the “trojan horse.” A trojan horse, also called a “software mole,” is inserted into an operating system utility program, which is widely used by other programs to perform a standard function. A simple example of a utility program is one that finds the cosine of an angle. Once a trojan horse has been inserted into a utility program, the program will not only perform its overt function (e.g., finding the cosine of an angle), known as the “lure,” it will also carry out a clandestine function. This clandestine function can be designed to undertake many tasks. It can access the authorized user’s information data base to garble the data, or alter the information in ways totally unpredictable to the user of the utility program.2

Consider this hypothetical case. John, a 28-year-old U. S. citizen, is the penetrator. His political ideals were formed during the turbulent 1960s. He graduated from a small Midwest college with a liberal arts degree and a computer science minor. While at college, John had become disillusioned with the democratic process. He developed a vague desire to strike a blow against what he perceived to be rampant militarism.

After graduation, John joined a small electronics firm as a technician. Because they worked on several government contracts, this firm’s computer system was connected to the computer communications network for the Defense Advanced Research Projects Agency (ARPANET). John used a trap door and the firm’s computer system to retrieve information on the design specifications for NTDS3 and the Navy’s command and control system (NCCS)4 computers, and extracted computer files on the NTDS and NCCS operating system programs. Another firm, Defense Software Programmers (DSP), was also connected to the ARPANET. DSP had a contract with the Navy to write new operating system programs for the NTDS and NCCS computers.

Surprised and excited by his ability to penetrate computer systems, John decided to become an active subverter. A subverter is a person who, with some technical knowledge of the computer system, knowingly and purposely gains access to privileged information for the purpose of altering it. A subverter may also insert artifices for the purpose of continual information gathering or delayed alteration of computer program statements.5

Using information on the design of the Navy’s command and control system and the Navy tactical data system, John inserted a number of trap doors and trojan horses into DSP’s new NTDS and NCCS operating system programs. He considered the possibility that the inserted software might be discovered by the software programmers at DSP; therefore, he inserted his subversion software at the interfaces between the various operating system program modules, written by separate software teams. Once the operating system was compiled and put onto tape for distribution to the various naval computer centers and operational units with NTDS and NCCS installed, there was no chance his tampering would be discovered.

John sold information concerning his insertion of the trap doors and trojan horses to a foreign power. The foreign power was so taken with John’s ingenuity that they gave him special instructions for the insertion of other trap doors and trojan horses which would be effective in reducing U. S. naval force projection effectiveness. With these new artifices inserted, John continued in his job as a technician, content in the knowledge that he had struck a blow for the “peace-loving” nations of the world.
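The trap door mechanism can be sketched in a few lines of code. The following is a hypothetical illustration only; the command strings, the activation code, and the privilege flag are invented for the example and imply nothing about any real system’s interface:

```python
# Hypothetical sketch of a trap door buried in a command processor.
# A normal-looking input handler checks every command against a secret
# activation string; when the string appears, it silently flips the
# session into a privileged state, bypassing the access-control check.

SECRET_LATCH = "XK-7-ORION"  # activation code known only to the penetrator


class Session:
    def __init__(self, clearance):
        self.clearance = clearance
        self.privileged = False  # normal users start unprivileged


def handle_command(session, command):
    # --- trap door: two lines hidden among legitimate logic ---
    if SECRET_LATCH in command:
        session.privileged = True  # alter the "machine state"
        return "OK"                # indistinguishable from a harmless command
    # --- legitimate access control below ---
    if command.startswith("READ SECRET") and not (
        session.privileged or session.clearance == "SECRET"
    ):
        return "ACCESS DENIED"
    return "OK"
```

An unprivileged session is refused a classified read until the latch string appears anywhere in its input, after which the same request succeeds. The point of the sketch is how little code the artifice needs, and how ordinary its activation looks in an audit of inputs.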
The Defense Software Programmers’ NCCS and NTDS operating systems were installed on board operational units and at naval computer centers as John had foreseen. Compare the consequences of penetration and subversion to the scenarios that began this article.
Command and Control Scenario: The NCCS operating system program had a utility routine that executed when its telecommunications link was used to transmit messages to Washington. The utility routine had a trojan horse inserted in it. The trojan horse’s clandestine function was to garble messages prior to Washington’s receipt of them. The clandestine function executed whenever three parameters were met. First, information concerning an armed conflict between one of America’s allies and one of the foreign power’s allies had to be stored within NCCS’s data base. Second, the message’s reference to the position of the incident had to be within the Eastern Mediterranean. Third, reference had to be made, within the text of the message, to hostile acts committed between certain foreign powers and allies of the United States. The trojan horse software made comparison of the message text to several key words (e.g., missiles, guns, launch, firing, damage, sinking, and names of specific countries). A sufficient number of text “word matches” with key words would meet the activation parameters. Once all three parameters had been met, the message was garbled by the trojan horse software.
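The three activation parameters described above combine into a single conjunctive test. A minimal sketch of such a trigger follows; the field names, the bounding box for the Eastern Mediterranean, the keyword threshold, and the stand-in garbling function are all invented for illustration:

```python
# Hypothetical sketch of the trojan horse's activation logic: the
# clandestine garbling runs only when all three parameters are met.

KEY_WORDS = {"missiles", "guns", "launch", "firing", "damage", "sinking"}
# Rough Eastern Mediterranean box: lat_min, lat_max, lon_min, lon_max.
EAST_MED = (30.0, 38.0, 25.0, 36.0)


def should_garble(db_reports_allied_conflict, msg_lat, msg_lon, msg_text,
                  min_matches=2):
    # Parameter 1: data base already holds a report of an allied conflict.
    if not db_reports_allied_conflict:
        return False
    # Parameter 2: the incident position lies in the Eastern Mediterranean.
    lat_min, lat_max, lon_min, lon_max = EAST_MED
    if not (lat_min <= msg_lat <= lat_max and lon_min <= msg_lon <= lon_max):
        return False
    # Parameter 3: enough key-word matches in the message text.
    words = set(msg_text.lower().split())
    return len(words & KEY_WORDS) >= min_matches


def garble(msg_text):
    # Crude stand-in for the clandestine function: scramble the text.
    return msg_text[::-1]
```

Because every leg of the conjunction must hold, routine traffic, exercises, and messages from other theaters pass through untouched, which is exactly what makes such an artifice hard to discover in testing.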
Tactical Data System Scenario: An ingenious trojan horse caused the U. S. task force to lose contact with the submarine. One of NTDS’s inputs is geographical positioning data from the satellite navigation (SatNav) system.
As was well known to the foreign power, the U. S. Navy does not hold live-fire exercises during peacetime while on station at any of the world’s choke points. This makes SatNav information an ideal trojan horse parameter. When an NTDS ship is located at a choke point and NTDS is transmitting ASW information on a submerged hostile submarine, the trojan horse garbles Link 14 data. Under normal exercise conditions, this trojan horse would not be discovered, simply because the geographical parameter would not be met.
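The geographical parameter works like a geofence: the artifice stays dormant unless own-ship position from SatNav falls within some radius of a stored choke point. A sketch of that test, using a great-circle (haversine) distance; the choke-point coordinates and trigger radius are invented for the example and come from no real system:

```python
import math

# Hypothetical choke-point table (name: lat, lon in degrees) and trigger
# radius in nautical miles; the values are assumptions for illustration.
CHOKE_POINTS = {"Hormuz": (26.6, 56.3), "Gibraltar": (35.95, -5.6)}
TRIGGER_NM = 100.0


def distance_nm(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance on a spherical earth;
    # one nautical mile is one minute of arc, so R = 180*60/pi nm.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * math.asin(math.sqrt(a)) * 180 * 60 / math.pi


def near_choke_point(lat, lon):
    # The dormant artifice would poll this with each SatNav fix.
    return any(distance_nm(lat, lon, clat, clon) <= TRIGGER_NM
               for clat, clon in CHOKE_POINTS.values())
```

During a normal exercise in open ocean the test never fires, so no amount of pre-deployment testing away from a choke point would reveal the artifice.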
Combat System Scenario: A trap door caused the SM-2 missiles to self-destruct. The enemy aircraft transmitted a special code over Mode 3 identification friend or foe. That code activated a trap door that blocked passage of accurate track information from the AN/SPY-1A radar to the illuminators. Without terminal illumination, the missiles self-destructed in a matter of seconds. This same trap door could have been used in concert with the tactical data system trojan horse to back up the geographical positioning activation.
These scenarios could be after-action reports as well. Their feasibility may be demonstrated by true examples from the business world.
Pharmaceutical Company A developed a new drug and was making preparations to patent the formula. It anticipated that this drug would make many of its competitors’ drugs obsolete. A rival company learned about this drug through the industry “grapevine.” Executives of the rival company bought time on Company A’s computer through a dummy business. The program they ran, ostensibly for business purposes, actually caused Company A’s on-line files to be read onto a tape. The executives of the rival company then searched through the tape on their own computer system. They were able to find the formula they sought and patented it for their company several days ahead of Company A.6
A 17-year-old electronics hobbyist used the local telephone company’s computerized supply ordering system to stock up on over $1,000,000 of electronics parts. Posing as a reporter, he interviewed company employees regarding the computer’s security system. He then obtained the authorization codes to use the system. Using a touch-tone telephone, he placed orders for supplies with instructions for delivery to remote locations. Once the deliveries were made, he simply drove to the locations at night and picked up his orders. As a precaution, he monitored the phone company’s supply accounts to ensure that his thefts were held within the company’s acceptable tolerances.
In an international computer espionage case, East German agents were able to obtain sensitive financial information on thousands of West German businesses. The businesses all stored financial data on a time-sharing computer system. The East German agents formed a fictitious company and opened an account with one of the companies in the time-sharing system. With some experimentation and a few leading questions, the agents were able to access all of the financial data in the computer.7

As insidiously as the Greeks slipped into Troy, today’s “trojan horse” can be slipped into a computer’s operating system, enabling the system to not only perform its overt function, but also to carry out a clandestine function.
The problem of computer security did not pose much of a threat until the mid-1960s, when the widespread use of resource-sharing systems began.8 A resource-sharing system is one in which access to the computer can be made either from the computer site (e.g., the submission of a card deck) or from a remote location (e.g., through a remote terminal or communications link).

Previously, computer access could be controlled through physical means. Only those individuals with the appropriate clearances and need to know were allowed onto the computer site. It did not take long to realize that if access to the remote terminal was not secure and the communications lines were not secure, the information in the computer could not be considered properly protected. Something in the machine itself was required to protect this information; some combination of hardware and software was needed that would control access to the information contained within.
The first comprehensive study on the subject of computer security in a resource-sharing environment was conducted by the Rand Corporation.9 This study, conducted from 1967 through 1969, was commissioned by the Advanced Research Projects Agency (now the Defense Advanced Research Projects Agency [DARPA]). Chaired and authored by Dr. Willis H. Ware, this study and its subsequent report focused upon likely weak security points within a computer system and recommended policies to minimize these weaknesses. Its policy recommendations were, by necessity, somewhat vague in the area of machine access controls. The study was breaking new ground and was therefore understandably cautious about making specific recommendations to correct these weaknesses. Originally classified confidential, this study has been declassified since 1975 and remains one of the best overviews available in the area of computer security.

Upon receipt of the Rand report, several Department of Defense (DoD) agencies began serious work in the area of computer security. A number of “tiger” teams were formed to determine the magnitude of the problem, attempting to determine exactly how difficult it would be to penetrate DoD computer systems. Not only were these tiger teams able to break into every one of the computer systems they targeted, they found that doing so was far simpler than many had expected.10 Even systems that were advertised as “secure and unbreakable” by their manufacturers took a relatively short time to penetrate. The tiger teams were able to gain access to desired information and even obtain control over the entire computer system.11

The U. S. Air Force took a lead in the development of secure computing systems, convening a computer security technology panel in late 1972. The Air Force presented its tiger team findings and research progress to the panel. The panel’s conclusions were that security could not be “tacked on” to a computer after manufacture, like a padlock; it had to be designed into the machine from its conception. They realized that tacking on security fixes was analogous to plugging holes in a dike with fingers. The holes far outnumbered the fingers. If the dam was going to hold the waters of classified information, then a stronger design and construction would be necessary.

The panel decided upon the security kernel concept as the best system to develop.12 The kernel is a part of the computer operating system which matches the source of an information request, the source’s clearance, and the classification of the information with a clearance list before allowing the information to leave the machine.

From the concept of the reference monitor and other recommendations of the panel, work was begun in 1973 to construct a machine that would demonstrate the feasibility of a secure, resource-sharing system.13 The kernel, which is the very center and heart of the operating system, performs the actual checks on information requests. The Air Force program demonstrated the feasibility of a security kernel, but unfortunately it was terminated in 1976 before a complete prototype could be built. A lack of technical understanding, of meaningful policy, and a change in funding priorities caused the termination.14

The current state of computer usage is hamstrung by our inability to ensure multilevel security. Most classified data is handled by machines in a dedicated mode. A computer is in a dedicated mode when only those personnel with clearances and a need to know, as high as the highest classification level handled by the machine, are allowed to write and input programs. The input must be done at the physical computer site. These system configurations are limited in their operational usefulness because of their restricted range of users. The problems of large, multi-input communications networks, however, are not at all solved by operation in the dedicated mode.
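The kernel check can be pictured as a single mediation function through which every information request must pass before data leaves the machine. A toy sketch of that reference-monitor idea follows, assuming the conventional four-level classification ordering; the function names and data layout are invented and not drawn from any actual Navy system:

```python
# Minimal reference-monitor sketch: one central check that every read
# request must pass. Levels are ordered; a read is allowed only when the
# requester's clearance dominates the data's classification ("no read up").

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}


def mediate_read(subject_clearance, object_classification):
    """True only if the clearance dominates the classification."""
    return LEVELS[subject_clearance] >= LEVELS[object_classification]


def read(store, subject_clearance, name, classification_of):
    # The kernel property is that no other code path can reach the store:
    # all access funnels through mediate_read.
    if not mediate_read(subject_clearance, classification_of[name]):
        raise PermissionError("access denied by security kernel")
    return store[name]
```

The security argument rests less on the check itself than on its placement: because the kernel is small and every request is forced through it, the check can be inspected and verified in a way that a whole operating system cannot.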
It should be understood that computer system complexity does not equate to computer security. The data originally passed between NTDS ships, in its digital form, were initially considered to be secure. We realized in the South China Sea that this assumption was incorrect when we discovered that our NTDS data links were providing the North Vietnamese with information. We now encrypt the information passed over NTDS links. System complexity requires a large operating system, and this facilitates undetectable penetration. The penetrator has a larger field upon which to sow his seeds. It also makes it very difficult for system managers to weed out what the penetrator has sown.
The current instructions and regulations on computer security are realistic in their assessment of the situation. Current DoD directives state:
“Operating in a true multilevel security mode remains a desired operational goal. . . . However, testing and analysis has (sic.) suggested that this goal cannot generally be obtained with confidence due to the limitations in the currently available hardware/software state-of-the-art.”15
Yet, the operator, the supervisor, and the at-sea commander are still tasked with the maintenance of secure systems. These systems appear to a penetrator as either multilevel systems or system designs that closely approximate multilevel systems. Even the governing publication within DoD concedes that extensive cost versus risk analysis is called for before security steps become mandatory, and that the techniques required by the Automated Data Processing Security Manual do not apply in those instances where retrofit (the great majority) of security would be required.16
For policy or guidelines to be realistic, they must advise a method of behavior on the part of the subordinate that is not only desirable but also feasible. If the subordinate is not given the means by which to carry out the directed policy, how can compliance be reasonably expected? Awareness of the problem is a beginning, but action to solve the problem should not be delayed.
The presence of digital computers in strategic and tactical naval systems is vital to a modern, powerful fleet. Digital computers provide speed of execution, rapid response time, valuable support in an era of reduced manning, and are inherently flexible. The evolution of the digital computer has occurred within the last 30 years. That is why the existence of the computer security problem has only gained recognition within the last decade. Today, this remains one of the areas least understood, and therefore most debated, by computer science professionals. But our expertise in the areas of software engineering, effective implementation of hardware components, and effective design of resource-sharing networks is still small when compared with other technical disciplines.
This country is the world leader in computer technology, with a qualitative edge based upon research. It would be negligent and foolish to blunt this edge by ignoring the computer security problem. Although the primary goal of
any computer security research program must be the protection of its own systems, the use of penetration techniques as an offensive weapon should not be overlooked.
The United States exports its computer technology to much of the world, both the hardware and software. The careful insertion of a few well-written trap doors and trojan horses into the software (or wired into the hardware) of computers sold to potentially hostile countries would be a reliable and virtually undetectable intelligence asset.
To ensure the security of naval computer systems in future decades, it is imperative that we support computer security research now. Such issues as effective control of distributed networks, the applicability of encryption, and the realization of an efficient security kernel need to be addressed and their research supported. Methods of verifying the security of a computer must be developed, and DoD policy must realistically address the computer security problem. Backup procedures in the event of computer failure, whether induced or accidental, must be emphasized. If we cannot ensure the security of our computer systems, then we cannot rely upon them in a crisis.
The computer as used by today’s Navy is a powerful tool for our defense. But, if placed in the wrong hands, the computer may also be a dangerous tool for manipulation by any potential enemy. We must not rely blindly upon our current computer networks and security procedures; this course may be as unwise as the Germans’ reliance upon the Enigma machine throughout World War II.
1Phillip A. Meyers, “Subversion: The Neglected Aspect of Computer Security” (Naval Postgraduate School thesis, Monterey, CA, 1980).
2Ibid.
3The NTDS is a computer program which runs on a digital computer. The number of computers required to run the Navy tactical data system software programs has increased as the tactical arena has grown in scope and complexity. Currently, on board a typical aircraft carrier, a total of 18 computers can be found with a diverse range of tactical functions. In today's fleet, more than 60 warships carry NTDS. This system has become a major link in the Navy’s fighting ability. With telecommunications links, a coordinated defense of the task force can be conducted by the task force commander, and unit commanders have at their finger tips a complete picture of the battle.
4The Worldwide Military Command and Control System is this country’s primary command and control network. The Navy’s command and control system (NCCS) is a subset of that network. Through the use of large land-based computers and telecommunications links, the military chain of command has access to all operational units. (Jan Prokop, ed., Computers in the Navy [Annapolis, MD: Naval Institute Press, 1976].) In the future, NCCS facilities will interface through WWMCCS automated data processing equipment via dedicated and common-user communications existing and/or under development in the defense communications system and naval telecommunication system. The afloat nodes of the NCCS also have internal interfaces within a single ship and other installed systems such as the IC, CIC, SSES, and CVTSC as well as the ashore NCCS facilities. (U. S. Department of the Navy, “Weapons Systems Fundamentals: Elements of Weapon Systems,” NavOrd OP 3000, vol. 1, Washington, D.C., 1971.)
5Meyers.
6Leonard I. Krauss and Aileen MacGahan, Computer Fraud and Countermeasures (Englewood Cliffs, NJ: Prentice-Hall, Inc., 1979).
7Ibid.
8Willis H. Ware, ed., “Security Controls for Computer Systems: Report of Defense Science Board Task Force on Computer Security,” The Rand Corporation, Santa Monica, CA, 1979.
9Ibid.
10Roger R. Schell, “Computer Security: The Achilles’ Heel of the Electronic Air Force?” Air University Review, January-February 1979, pp. 16-33.
11Ibid.
12J. P. Anderson, “Computer Security Technology Planning Study,” ESD-TR-73-51, October 1972.
13R. D. Rhode, “ESD 1976 Computer Security Developments Summary,” MCI-76-2, The Mitre Corporation, January 1977.
14Schell, pp. 16-33.
15U. S. Department of the Navy, “Security Requirements for Automatic Data Processing (ADP) Systems,” OpNavInst 5239.1, Washington, D. C., 1972.
16U. S. Department of Defense, “ADP Security Manual: Techniques and Procedures for Implementing, Deactivating, Testing, and Evaluating Secure Resource-Sharing ADP Systems,” DoD Inst. 5200.28-M, Washington, D. C., April.
Lieutenant Grant was graduated from the Naval Academy and served in the USS Ouellet (FF-1077) as main propulsion assistant and engineer officer. He graduated recently from the Naval Postgraduate School and has orders to the USS Henry B. Wilson (DDG-7) as weapons officer.
Lieutenant Riche was graduated from the Naval Academy and served in the USS Towers (DDG-9) as main propulsion assistant and combat information center officer. He graduated from the Naval Postgraduate School and is currently serving in the USS Hoel (DDG-13) as operations officer.
___________________________________ Nature’s Course__________________________
As a very green division officer in the USS Bronstein (formerly DE-1037), I found myself being used as a sounding board by a third-class radarman whose two-year-old marriage had failed to produce any offspring, much to his distress. At my suggestion, the young man and his wife consulted a doctor, who found nothing physically wrong with either of them. The doctor did some calculating and identified an upcoming date when conditions would enhance the possibility of conception. We were scheduled to be at sea that week.
Although the ship was, as always, shorthanded, I approved the inevitable leave chit and accompanied my man to the executive officer’s stateroom. The XO glanced at the date and the stated reason for the request—“Personal”—and asked, “What’s this all about?”
Before I could speak, the nervous young radarman blurted, “Sir, my wife is going to get pregnant Wednesday of that week, and I want to be there.”
The XO approved the chit without asking any more questions.
Ron Carter
(The Naval Institute will pay $25.00 for each anecdote published in the Proceedings.)