Computer memory was precious, and programmers made every attempt to be as efficient as possible. A common way to save disk space was to represent dates in a six-digit format, YYMMDD, rather than spelling out the full four-digit year. No one considered the consequences of a year field with room for only two digits. Who would have thought that the code they were writing would still be running 30 to 40 years later, in the year 2000? Programmers thought they were doing a good thing; economically it made sense, as "adding two century digits to a date field for a 100-million-record file would have added at least 100 megabytes of storage requirement to a disk that costs more than $20,000 for every 15 to 20 megabytes."
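The space-saving convention, and the assumption buried in it, can be sketched in a few lines. This is a hypothetical illustration, not code from any actual system; the function name and record layout are assumptions for the example.

```python
# Hypothetical sketch of the era's convention: a date stored as six
# digits (YYMMDD), with the century silently assumed to be the 1900s.
def parse_yymmdd(record: str):
    """Decode a six-digit date record, assuming the 20th century."""
    year = 1900 + int(record[0:2])   # the fateful assumption
    month = int(record[2:4])
    day = int(record[4:6])
    return year, month, day

# A date written in 1970 decodes correctly...
print(parse_yymmdd("700115"))  # (1970, 1, 15)
# ...but a date in 2000 is silently read as 1900.
print(parse_yymmdd("000115"))  # (1900, 1, 15)
```

The saving was real: two digits per date, multiplied across hundreds of millions of records, at a time when disk space cost over a thousand dollars per megabyte.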
25 . . . 24 . . . 23
As the cost of housing rose in the 1970s, a new method of financing became common: the 30-year mortgage. Banks quickly discovered that their accounting methods could not handle the amortization of a loan over 30 years. When maturity dates rolled past 1999, the banks' computers recorded the year as "00" but interpreted it as 1900. Corporate accounting practices quickly changed to accommodate these long-term loans. Insurance companies and other organizations involved in long-term projects followed suit. The rest of the computer world, however, was virtually unaffected and uninterested.
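The amortization failure comes down to simple two-digit arithmetic. The sketch below is a hypothetical illustration of how a 1970s-era system might have computed a loan's age; the function name and figures are assumptions for the example.

```python
# Hypothetical illustration: a loan's age computed from two-digit
# years, as a 1970s bank system might have done it.
def years_elapsed(start_yy: int, current_yy: int) -> int:
    return current_yy - start_yy

# A mortgage written in 1975, checked in 1995: 20 years old. Correct.
print(years_elapsed(75, 95))   # 20
# The same mortgage checked in 2000 ("00") appears to be -75 years old.
print(years_elapsed(75, 0))    # -75
```

A negative loan age breaks every downstream amortization calculation, which is why the long-term lenders were forced to confront the problem decades before anyone else.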
13 . . . 12 . . . 11
In 1989, the Social Security Administration began a massive effort to analyze and correct potential year-2000 problems. With 25 million lines of code in production and 25 million more in development, it took the Social Security Administration four years to study master file and database date representations. Only after this analysis did they even begin looking for tools to help them solve their problems.
The Social Security Administration received no additional funding for their effort to prepare for the year 2000. All of the costs incurred were extracted from existing budgets. So far, they have expended more than 100 employee years, and expect to expend 500 more. The Social Security Administration plans to have all year-2000 changes in place by the end of 1997 for testing in 1998, with full implementation of year 2000-compliant systems in 1999.
10 . . . 9 . . . 8
Society developed a "just-in-time" philosophy; the use of automatic tellers, fast food, and microwave ovens was a way of life in the early 1990s. Computers were smaller, faster, and cheaper, and they were everywhere: at work, at school, and at home. There was a virtual explosion of software, hardware, and users.
Despite the fact that 20 years had passed since the first indication of computing problems resulting from the year-2000 rollover, little had been done. Even the Social Security Administration's preparations had had no effect on other federal agencies. The phrases of the year were: "It's not our problem. It won't affect us. We'll get to it . . . eventually."
6 . . .
Another segment of the population encountered year-2000 problems for the first time when the renewal dates for drivers' licenses failed. Information technology professional Jim Gillespie states, "We had our first year 2000 heads-up in 1994, when renewal dates for six-year commercial drivers' licenses gave us a problem. There were two or three other minor failures as well, which went a long way in helping us convince management that this was an important project."
5 . . .
A flurry of activity began near the end of 1995. Suddenly, 2000 no longer seemed a long way off. The New Year's Eve countdown to 2000 was less than five years away. Panic struck the federal agencies. Information technology professionals were directed to study and analyze their systems for potential problems. Warnings were sounded from the military to industry and beyond.
The Commander, Operational Test and Evaluation Force (ComOpTEvFor), issued a message in late 1995 that disseminated the Naval Information Systems Management Center (NISMC) language for year-2000 compliance to provide direction to the Department of Defense. The NISMC language defined year-2000 compliance as: "fault-free performance in the processing of date and date-related data (including, but not limited to, calculating and sequencing) by all hardware and software products, individually and in combination. Fault-free performance includes manipulation of data when the dates are in the twentieth and twenty-first centuries and shall be transparent to the user."
On 29 November 1995, the MITRE Corporation began an assessment of the effects of the year 2000 on software applications. A team of MITRE analysts studied nine command-and-control systems and two automated information systems and analyzed more than 5.4 million lines of code. MITRE found that between 1% and 5% of the code would be affected by the year change, and determined that correcting the command-and-control systems would require from 8 to 71 staff years per million lines of code and cost $1.00 to $8.52 for each line. The automated information systems code would require 9 to 16 staff years per million lines of code to correct and cost $0.75 to $1.70 for each line. MITRE also found that even though some commercial tools were available to find and fix problems in popular languages, the Department of Defense used numerous legacy systems and systems written in specialized code for which there were no quick fixes available. As a result, MITRE recommended that all Department of Defense systems be analyzed during regular maintenance and that special efforts be made to analyze systems that were not scheduled for regular maintenance in the near future. They also recommended that the Department of Defense establish a centralized clearinghouse for information and use a decentralized approach for finding and fixing code.
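The MITRE figures make the scale of the task concrete with simple arithmetic. The sketch below applies the study's command-and-control rates to a hypothetical 2-million-line system; the system size is an assumption for illustration, while the per-line rates are the ones quoted above.

```python
# Back-of-envelope arithmetic using the MITRE rates quoted above:
# command-and-control code took 8-71 staff years per million lines
# and cost $1.00-$8.52 per line to correct.
lines = 2_000_000  # hypothetical system size, for scale

effort_low = 8 * lines / 1_000_000    # staff years, low estimate
effort_high = 71 * lines / 1_000_000  # staff years, high estimate
cost_low = 1.00 * lines               # dollars, low estimate
cost_high = 8.52 * lines              # dollars, high estimate

print(f"Effort: {effort_low:.0f} to {effort_high:.0f} staff years")
print(f"Cost:   ${cost_low:,.0f} to ${cost_high:,.0f}")
```

Even at the low end, a single mid-sized system demanded 16 staff years and $2 million; at the high end, 142 staff years and more than $17 million, with no new functionality to show for it.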
In a 5 December 1995 message to all commands, the Chief of Naval Operations passed on the findings of the MITRE study and warned the Department of Defense of the many potential problems that could arise in software, hardware, and firmware as a result of the year 2000. The message directed that recipients route the message to all appropriate information-technology staffs for action, and stated that precautionary measures should be taken during routine maintenance when possible and that new systems should be warranted for fault-free date performance through the year 2100.
At last, the problem had been identified and defined. There were fewer than five years left until 2000. It had taken the Social Security Administration four years simply to analyze their systems, scrape funding from their existing budgets, and figure out how extensive the problem was. They expected that it would take ten years to complete the entire project. With fewer than five years to go and no money allocated, system administrators and information technology professionals faced bleak prospects.
4 . . .
After the problem analysis and awareness campaign of 1995, officials began the next steps: establishing rules and regulations to govern changes, and continuing efforts to raise awareness at the top levels of management. Establishing rules and regulations was easy; convincing senior management was nearly impossible.
In January 1996, Anthony Valletta, Deputy Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, spoke at a conference sponsored by the Information Technology Association of America. He discussed the Department of Defense view of the year 2000 problem and made it clear that senior management needed to be convinced of the magnitude of the situation. Because there was no "silver bullet" and no additional funding, resources would have to be reallocated from often-inadequate existing budgets. Tradeoffs would need to be made that would go beyond information technology budgets, requiring careful analysis of priorities.
The National Institute of Standards and Technology (NIST) issued a change to Federal Information Processing Standard (FIPS) 4-1 on 26 March 1996 in order to assist government organizations preparing for the year 2000, stating: "For purposes of electronic data interchange in any recorded form among U.S. Government agencies, NIST highly recommends that four-digit year elements be used. The year should encompass a two-digit century that precedes, and is contiguous with, a two-digit year-of-century (e.g., 1999, 2000, etc.). In addition, optional two-digit year time elements specified in ANSI X3.30-1985 (R1991) should not be used for the purposes of any data interchange among U.S. Government agencies."
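The NIST recommendation amounts to writing the century explicitly: a four-digit year followed by month and day (CCYYMMDD). A minimal sketch of the recommended interchange form, using Python's standard library, follows; the function name is an assumption for the example.

```python
from datetime import date

# Sketch of the FIPS 4-1 recommendation: interchange dates with a
# four-digit year (two-digit century + two-digit year-of-century),
# so 1999 and 2099 can never be confused.
def format_interchange(d: date) -> str:
    return f"{d.year:04d}{d.month:02d}{d.day:02d}"

print(format_interchange(date(1999, 12, 31)))  # "19991231"
print(format_interchange(date(2000, 1, 1)))    # "20000101"
```

The two extra digits per date that programmers economized away in the 1960s are exactly what this standard restores.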
On 31 May 1996, the Assistant Secretary of Defense sent a message to the Department of the Navy as part of a campaign to increase awareness of year-2000 computing problems that might arise. The message directed recipients to review the contents of a World Wide Web page (http://www.doncio.navy.mil/y2k/year2000.htm) for information. Enclosed with the message were both a data-gathering worksheet and a report form, which were to be used to inventory computer systems and to report status by 10 July 1996, and then again semiannually until all problems were found, fixed, and tested.
A report published in August 1996 indicated that federal agencies were not prepared-or preparing-for the year 2000. The Government Reform and Oversight Subcommittee on Government Management, Information, and Technology reported that of the 24 agencies surveyed, 14 had not even formulated a plan. In Department of Defense meeting minutes, the new year-2000 program manager in the Office of the Secretary of Defense, Carla von Bernewitz, reported that of the 36 organizations required to report year-2000 status by 15 September 1996, only four had complied.
Congress passed the Information Technology Management Reform Act (ITMRA), which dramatically changed government information technology practices. In addition to repealing the Brooks Act procurement processes, the ITMRA also abolished the Federal Information Resources Management Regulation, the Information Technology Delegations Program, and the Information Resources Procurement and Management Review Program; implemented the Federal Telecommunications Standards; and transferred the Federal Automated Data Processing and Telecommunications Standard Index to the National Institute of Standards and Technology.
These changes set forth new standards for federal information processing. Systems now would conform to industry and international standards set by NIST. The changes also established guidelines for handling information technology issues through specific delegation of authority and responsibility. This delegation was paramount in prompting action on the year-2000 problem.
One of the explicit purposes of the ITMRA was "to increase the responsibility of officials of the Office of Management and Budget (OMB) and other federal government agencies, and the accountability of such officials to Congress and the public. . . ." It gave the director of the OMB responsibility for "the effective and efficient acquisition, use, and disposal of information technology and other information resources by the executive agencies" and to "encourage and advocate the adoption of national and international information technology standards that are technically and economically beneficial to the Federal Government and the private sector."
With these responsibilities and through a House appropriations bill, the director of the OMB was tasked with providing Congress with a plan for conversion to year 2000-compatible systems. On 10 September 1996, Sally Katzen, administrator of the OMB's Office of Information and Regulatory Affairs, reported that the OMB's strategy assumed that when managers became aware of the potential consequences of the year-2000 problem they would take action, and that it would take engineers and technicians to solve the problem, because there was no easy fix. The strategy had three components: raising awareness, sharing expertise, and removing barriers.
A House subcommittee provided little hope that agencies would receive additional funding for year-2000 projects. Instead, the chairwoman of the House Science Subcommittee on Technology suggested that a Sense of Congress resolution be passed to focus attention on the problem, but not to make any rules about how to handle it. Another subcommittee staff member suggested that since the Social Security Administration had worked on the problem since 1989 without additional funding, funding was not an issue.
The subcommittee had an extremely limited understanding of the scope and complexity of the year-2000 problem. Experts estimate that the global cost of conversion to 2000 compliance will be $300 to $600 billion. Some estimate that testing will account for 50% to 70% of the time and resources required to solve the problem. The estimated U.S. government cost alone is $30 billion. Yet neither Congress nor the White House wants to increase the budget. In fact, the Sense of Congress provision in the ITMRA states: "It is the sense of Congress that executive agencies should achieve a 5% per year decrease in the cost incurred by the agency for operating and maintaining information technology, and a 5% per year increase in the efficiency of the agency operations, by reason of improvements in information resources management by the agency."
The language of the problem has been defined. An extensive force has been tasked and rules of "combat" have been established: cut technology costs by 5% every year while increasing the efficiency of information systems by 5%, and fix the year-2000 problem with no additional resources. Bear in mind that the Social Security Administration expended more than 100 employee years in the first stages of their project; fixing the year-2000 problem will not add any functionality to information systems; and time is running out.
In an effort to support the year-2000 conversion process, the Defense Authorization Bill proposed by the Senate for fiscal year 1997 would prohibit the Department of Defense from purchasing hardware or software that is not year 2000-compatible, eliminating the need to convert new systems but adding to the complexity of making systems interact properly. Emmett Paige Jr., Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, called a halt to activity on contracts or programs that have not addressed the problem and ordered that all future contracts must provide assurance that all new systems will be year 2000-compliant.
The Federal Acquisition Regulation Information Technology Committee developed a year-2000 contract clause to provide a uniform method of procurement to prevent additional year-2000 problems. The contract clause states that the product delivered to the agency "shall be able to accurately process date data (including, but not limited to, calculating, comparing, and sequencing) from, into, and between the twentieth and twenty-first centuries, including leap-year calculations...."
The clause is recommended for voluntary use in solicitations and contracts for hardware, software, and systems, in order to ensure year-2000 compliance. The requirements in the clause will terminate upon the turn of the century, or when an agency decides to accept offers that are not year-2000 compliant with the provision that they will be sufficiently upgraded before the year 2000. The clause also contains notes for solicitations and new contracts and for existing contracts. The note for existing contracts recommends "that agencies negotiate modifications to existing contracts for acquisition of new products using the above clause as a guide. Prior to modifying the contract, the project team must ensure (1) that performance is possible considering the characteristics of the existing products, (2) the suppliers' agreements with the integrator will allow this work to be performed, (3) cost of performance will not be prohibitive, and (4) that the contractor will agree to the modification (should be a bilateral modification). The Government may elect to acquire versions of those products that warrant accurate performance in the processing of date and date-related data."
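The leap-year calculations the contract clause singles out are not incidental: under the Gregorian rules, century years are common years unless divisible by 400, so 2000 is a leap year while 1900 was not. Systems that implemented only the "century years are not leap years" shortcut would mishandle 29 February 2000. A minimal sketch of the full rule:

```python
# The Gregorian leap-year rule the clause alludes to: divisible by 4,
# except century years, which must also be divisible by 400. Systems
# that skipped the 400-year exception wrongly treated 2000 as a
# common year and lost 29 February 2000.
def is_leap(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(1900))  # False: century year not divisible by 400
print(is_leap(2000))  # True: divisible by 400
print(is_leap(1996))  # True: ordinary divisible-by-4 case
```

Python's standard library provides the same logic as `calendar.isleap`, which a real system would use rather than reimplementing the rule.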
3 . . . 2 . . . 1
With only three years left, computer experts face too much work and too little time. The acquisition rules have been changed, the instructions written, and the surveys completed. Now is the time to act. Differentiating between the systems that are critical and those that can be allowed to fail must take place in 1997, because there no longer is time to fix them all. Most likely, there is not even enough time to repair and test the critical systems.
One of the most valuable and significant outcomes of year-2000 compliance efforts is pooling all possible resources to address the problem. Agencies are working together to avoid duplication of effort. Many working groups, agencies, and committees are accessible on the World Wide Web. There are numerous year-2000 home pages, with hot links among them.
Yet many barriers remain. Top management has not grasped the reality and urgency of the problem. Many agencies understand that the problem exists, but do not have the knowledge or expertise to begin to understand the effect on their systems, or where to begin to find out what it might be. Some are caught up in analysis and blame; others simply are overwhelmed. Funding is a major issue and will become increasingly important as the countdown nears a close. The costs of analysis and repair are excessive, and finding enough money through trimming budgets soon will be impossible. But the biggest barrier is time; this project cannot be postponed. No extensions are possible, as we cannot slow the ticking of the clock.