The article "The Y2K Nightmare" by Robert Sam Anson, which we have included on the Y2KPREPARE site, gives a history and summary of the Y2K problem. The article is long but offers an excellent perspective on how the problem arose. It also clearly shows how human nature caused and perpetuated this programming flaw. As you will see, the biggest problem with Y2K has been, and will continue to be, human flaw and frailty. There is a serious "inertia problem" in human nature which we usually call "procrastination." The reasons for procrastination are many, but the Y2K problem seems to have its origin in pride, arrogance, ignorance, and greed. Although the potential problem has been known almost from the beginning of computer history, because of these attitudes it has remained uncorrected and unchanged. Human nature being what it is, programmers and the rest of the intelligentsia are like lemmings racing at breakneck speed toward the precipice of worldwide economic and social destruction.
As the saying goes, "The best of men are only men at the best." Man's attempts to raise himself to godhood via technology, computer technology, the internet, etc. are only recapitulations of the "Tower of Babel" - an attempt to replace God with human ingenuity, effort, and achievement. We at Y2KPREPARE have nothing against science, technology, and progress, but we do not believe in worshipping at the modern-day shrine of "scientific progress." By the way, the word "science" is simply a word of Latin origin which means "knowledge." There is no magic in the word "science" itself. Legitimate science is based on the "faith" principle. One of the primary principles and assumptions of modern "science" and the "scientific method" is that we live in an orderly, organized world which can be studied. From these studies we extract principles of predictability and relationship. As the older theologians said, "We think God's thoughts after Him." God is not taken by surprise by the "discoveries of science."
Will Y2K be the end of the world? No, but it might be the end of your world. That is the real issue for you. Are you prepared for the end of your world? As we all know from experience - death is 100%. Are you Compliant? Are you Y2K Compliant? Are you HEAVEN Compliant - Do you possess eternal life?
If you are concerned about more than this life click here (eternal life for now).
http://www.remarq.com/default/transcript.pl?group=comp.software.year-2000:50034064:50034064&update=1770
Vanity Fair January 1999
Article: 1 of 1
From: tom mcdowell <[email protected]>
Subject: 12.31.99 The Y2K Nightmare -Vanity Fair- January 1999
Date: Sun, 27 Dec 1998 15:11:19 -0500

Will the millennium arrive in darkness and chaos as billions of lines of computer code containing two-digit year dates shut down hospitals, banks, police and fire departments, airlines, utilities, the Pentagon, and the White House? These nightmare scenarios are only too possible, Robert Sam Anson discovers as he traces the birth of the Y2K "bug," the folly, greed, and denial that have muffled two decades of warnings from technology experts, and the ominous results of Y2K tests that lay bare the dimensions of a ticking global time bomb.
The Y2K Nightmare
By Robert Sam Anson
The nightmare scenario goes like this:
It is an instant past midnight, January 1, 2000, and suddenly nothing works. Not ATMs, which have stopped dispensing cash; not credit cards, which are being rejected; not VCRs, which now really are impossible to program. The power in some cities isn't working, either; and that means no heat, lights, or coffee in the morning, not to mention no televisions, stereos, or phones, which, even in places with power, aren't working, either. Bank vaults and prison gates have swung open; so have valves on sewer lines. The 911 service isn't functioning, but fire trucks are on the prowl (though the blaze had better be no higher than the second floor, since their ladders won't lift). People in elevators are trapped, and those with electronic hotel or office keys can't get anywhere, either. Hospitals have shut down because their ventilators and X-ray machines won't work and, in any case, it's now impossible to bill the H.M.O.
Traffic is a mess, since no streetlights are working. Trains are running, but their control switches aren't, which is bad news for supermarkets, utilities, car dealers, and international trade, which can't move by ship, either. Only the brave or foolhardy are getting on airplanes - but with so many countries degenerating into riots and revolution, it's wiser to stay at home anyway. There are no newspapers to read or movies to go to or welfare checks to cash. Meantime, retirees are opening letters saying that their pensions have been canceled because they are minus-23 years old. Many banks and small businesses have gone bust, and it will be weeks, if ever, before the mess that is the broker's statement is sorted out.
On the brighter side, no one can punch a time clock; on the darker, most of the big manufacturing plants have shut down because their lathes and robots aren't working. Pharmacies aren't filling prescriptions; the D.M.V. is not processing license renewals, and everyone's dashboard keeps flashing SERVICE ENGINE NOW. Mortgage payments sent on time have been marked late, and everyone's phone bill is messed up because of all those calls that began in 1999 and ended in 1900. On the Internet, where thousands of Web sites are suggesting how to find God and when to move to the wilderness, the acronym for what's occurring is TEOTWAWKI: The End Of The World As We Know It.
Will it happen? "Yes," "No," and "Maybe," say the experts. And that's the most unnerving thing about the phenomenon variously known as "Y2K," "The Year 2000 Problem," or "The Millennium Bug": No one will know the extent of its consequences until after they occur. The one sure thing is that the wondrous machines that govern and ease our lives won't know what to do. Some will freeze, electronically paralyzed; others will become imbecilic, giving idiot answers and issuing lunatic commands; still others, overwhelmed, will simply die - as will the blind faith the world has placed in them. "The reason we are in this screwup," says Paul Strassmann, who oversaw the Pentagon's vast computer operations during the Bush administration, "is that ... Americans fell in love with computers and put up with ... failures that they would not have put up with in a crummy toaster or microwave."
The product of this folly is a looming disaster with an immovable deadline that will touch the entire world. Its scale is unique, too: 1.2 trillion lines of potentially lethal software code located in virtually every country, plus 30 billion microprocessors. Many are linked - computer to computer, network to network, place to place - everywhere from birthing rooms to crematoriums. And if potential catastrophe is to be avoided, every line and chip must be checked - a task variously likened to building the pyramids, changing all the light bulbs in Las Vegas in an afternoon, or individually polishing enough marbles to fill the Grand Canyon.
The global cost of putting everything right is estimated to be as much as $3.6 trillion, according to Capers Jones, author of the 1997 book The Year 2000 Software Problem. This includes lawsuits, which some expect will total $1 trillion in the United States alone. But squandered treasuries are just the beginning of the misery. Because of Y2K there are predictions of a recession matching the oil shock of 1973-74, hundreds of thousands of bankruptcies, and disruptions of government services from police protection to food inspection. The chairman of the New York Stock Exchange, Richard Grasso, has warned of "potentially disastrous consequences"; a senior executive at Barclays, the British bank, reportedly advised clients to sell their homes, stockpile cash, and buy gold. And even Utah senator Robert Bennett - the most knowledgeable Y2K expert in Congress - was moved to say, "I'm not yet ready to dig a shelter in the backyard. But it might not be a bad idea to have a little extra food and water."
For some, Y2K offers rewards. Sales of survival gear are at record levels, Southern Baptists foresee "historic evangelism opportunities," and the career of singer Pat Boone-who has already recorded public-service announcements "to bring Y2K to the family dinner table for dialogue"-has been revived. The prospect of impending doom has provided fodder for Pat Robertson's Christian Broadcasting Network, which asks the faithful, "How does God want us to redeem this situation for Him?"
Some experts are more optimistic. But even the cheeriest assurances are not completely comforting. Y2K, said Jeanne Terrile, an analyst at Merrill Lynch-which has spent $375 million fixing its problems-is like the space shuttle: "It always goes off smoothly ... except once it didn't, and the country came to a standstill."
Preparing for the worst, Canada has made plans to call out the troops; Wisconsin and Iowa, the National Guard. Meanwhile, airlines are considering imposing "no-fly zones" on areas of the world not Y2K-ready. Computer experts are also getting ready. A recent survey of technology executives found that 10 percent of them planned to stockpile canned goods, 11 percent were preparing to buy generators and woodstoves, and 13 percent were going to purchase "alarm systems, fencing, and firearms." "The problem," Intel chairman Andy Grove has warned, "is going to be pretty bad."
Belatedly, that's become apparent to the U.S. government. "If we don't fix the century-date problems, we will have a situation scarier than the average disaster movie you might see on Sunday night," I.R.S. commissioner Charles Rossotti said last year. "The whole financial system of the United States will come to a halt. It not only could happen; it will happen, if we don't fix it right." "This is going to have implications in the world and in American society we can't even begin to comprehend," added Deputy Secretary of Defense John Hamre. "I would be the last person to suggest we're not going to have some nasty surprises, because I definitely think we will." Among those surprises could be weapons-such as Tomahawk missiles or ICBMs-that won't launch when they're supposed to or will fire when they're not supposed to.
So grave are the concerns for the nation's power grid - in which a failure in one region can cascade to others - that Connecticut senator Christopher Dodd said, "We're no longer at the point of asking whether or not there will be any power disruptions but now we are forced to ask how severe the disruptions are going to be." There are similar trepidations about telecommunications, the health-care industry, and nuclear power plants, at least one of which has already failed a Y2K test. White House Y2K czar John Koskinen has said, "We could have, if not the equivalent, something that is very much like a hurricane on the East Coast, an earthquake in San Francisco, and flooding on the Mississippi River happening all at once."
Things will be infinitely worse overseas, where Y2K's impact on the delivery of food, seed, and fertilizer could result in between 10 million and 300 million deaths. When one Middle Eastern contact was told of the millennium bug, according to Sherry Burns, who heads the C.I.A.'s Y2K office, he replied, "When we see it, we'll spray for it."
The Middle East, where half the oil companies expect at least one critical breakdown, is in particularly bad shape - as are Japan, which has the world's largest banks, and China, where much of the software was pirated, not to mention Indonesia. "Asia," said Deutsche Bank Securities chief economist Edward Yardeni, "is toast. In the year 2000, Asia will be burnt toast." But the biggest jitters are over Russia, which possesses not only 11 Chernobyl-type power plants, 22,500 nuclear warheads, and the funds to fix none of them, but also an attack-warning system so vulnerable to Y2K that the Pentagon has proposed stationing officers in each other's command centers New Year's Eve 1999. "[The missiles] are probably just going to sit there and tell their operators, 'I'm confused,'" John Pike of the Federation of American Scientists said in a recent interview. "But there is a small, finite risk that this could lead to accidental nuclear war, simply because people fail to fix their computers."
How all this happened is an unlikely story, and it begins nearly a half-century ago with a most unlikely woman. Her name was Grace Murray Hopper, and in the field of computer science there has seldom been her like. Feisty, quick-tongued, and irreverent ("Life was simple before World War II," she liked to say; "after that, we had systems"), "Amazing Grace," as colleagues called her, racked up many distinctions during her 85-year life. They included coining the phrase "computer bug" (a moth flew into one she was working on); becoming the first female admiral in the navy; and inventing the "compiler" - the software element that translates text into the 1s and 0s a computer can understand. But the accomplishment for which Grace Hopper is best remembered was helping to create a computer code actually useful in everyday life. Its name was common business-oriented language - COBOL, for short.
Able to perform myriad business tasks, COBOL was the Windows of its day - the program that ordered the functioning of virtually every business computer. The earliest of these gargantuan mainframes had a flaw, however. In order to operate, they relied on a dollar-bill-size piece of cardboard called a Hollerith card. Named after the inventor of the first electric tabulating machine - and used in one form or another since as early as 1890 - Holleriths passed information by means of small rectangular holes punched into 80 narrow columns. "You can put a hole in this card representing one dollar," IBM chairman Tom Watson would tell customers. "A dollar of sales, perhaps, or a dollar you owe someone. From that point on, you have a permanent record. It can never be erased, and you never have to enter it again. It can be added, subtracted, and multiplied. It can be filed, accumulated, and printed, all automatically."
All of which was true-as was the fact that "IBM cards," as Watson preferred to call them, barely had room for a name, birth date, and address. Initially, this was handled by simply using more cards-three, for instance, to record a single magazine subscription. But as time went on and businesses got bigger, companies found that they needed entire buildings just to store punch cards. This, they made clear to IBM, would not do. So, in order to squeeze in as much information as possible, programmers shortened COBOL instructions whenever they could. That included dates, which were reduced from eight digits to six by lopping off the "19" from the year. A computer would thus read "123199" and know the digits stood for December 31, 1999. What a computer could not do was realize that one second after midnight on that date it would be January 1, 2000.
So, in the manner of an odometer passing 99999 miles, the numbers would roll back to "00," which a computer would interpret as 1900 - provided that the sudden loss of a hundred years didn't prevent it from functioning, period.

Programmers were aware of the problem, but believed improving technology would fix it decades before 2000. And by the end of the 1950s, technology did get better, with Holleriths giving way to more compact magnetic tape. The guts of computers transformed, too, shrinking from vacuum tubes to transistors to integrated circuits. In 1964 there was another revolution when, after an expenditure of $5 billion (more than it had cost to create the first atomic bomb), IBM introduced its System/360 - the first "family" of computers, which could use the same disk drives, printers, and peripherals, regardless of a model's size or power. The intention, Watson's son and successor, Tom junior, recorded in his memoirs, "was to make all other computers obsolete.... Once customers shifted to System/360, they'd be able to expand their installations simply by mixing and matching components from our sales catalog. That was good for them, and the benefit for IBM was equally compelling: once a customer entered the circle of 360 users, we knew we could keep them there for a very long time."
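The rollover failure the article describes can be sketched in a few lines. The Python fragment below is illustrative only - the legacy systems in question ran COBOL, and the pivot value in the "windowing" repair is an assumed example, since each remediation project chose its own cutoff:

```python
def years_between(start_yy: int, end_yy: int) -> int:
    """Naive two-digit subtraction, the way many legacy programs did it."""
    return end_yy - start_yy

# A record from 1983 checked in 1999 behaves as expected:
print(years_between(83, 99))   # 16

# One year later, "00" reads as 1900, and the interval goes negative:
print(years_between(83, 0))    # -83

def expand_year(yy: int, pivot: int = 50) -> int:
    """'Windowing' repair: two-digit years below an assumed pivot
    are read as 20xx, the rest as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# With the window applied, 2000 - 1983 comes out right:
print(years_between(expand_year(83), expand_year(0)))   # 17
```

Windowing was the cheap fix; the thorough one, expanding every stored date to four digits, required touching both code and decades of accumulated data.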
In achieving that goal, IBM succeeded spectacularly well, demolishing the competition and becoming the mainframe standard setter. Less gloriously, it also retained the two-digit year.
Holding on to the old format was rationalized as a cost-cutter, particularly in computer memory - at the time it cost $761 to store the equivalent of the information contained in a Danielle Steel novel. Considering that the same amount of memory now costs less than a thousandth of that, it was pound-foolish writ gigantic. But at the time, few realized it. "I'm one of the culprits who created this problem," the former proprietor of an economic-consulting firm told Congress last year. "I used to write those programs back in the Sixties and Seventies, and was so proud of the fact that I was able to squeeze a few elements of space by not having to put 19 before the year.... It never entered our minds that those programs would have lasted more than a few years." Ordinarily expert at spotting future difficulties, the speaker was Federal Reserve chairman Alan Greenspan.
A handful were more foresighted. Among them was Robert Bemer, an IBM wizard who had invented the "Escape" key, and was one of the creators of "ASCII," the language that enabled different computer systems to "talk" to one another. During the '50s, Bemer also developed a feature that permitted COBOL programmers to use either two- or four-digit year dates. A passionate proponent of the latter, in 1960 Bemer joined with 47 other industry and government specialists to come up with universally accepted computer standards. The wrangling, however, stretched out for years - too many years for the White House, which, in 1967, ordered the National Bureau of Standards to settle the matter. In so doing, the bureau was to gather input from various federal agencies, some of which were using two-digit years, others four. As a practical matter, the only opinion that counted was that of the Department of Defense, the largest computer operator on earth. For bigger-bang-for-the-buck reasons, it was unshakable on the subject of year dates: no 19s. "They wouldn't listen to anything else," says Harry White, a D.O.D. computer-code specialist and Bemer ally. "They were more occupied with ... Vietnam."
After years of losing fights, White transferred to the Standards Bureau. Hardly had he arrived when the bureau succumbed to Pentagon pressure and announced that two-digit years would become the preferred option for federal agencies, starting January 1, 1970. Hoping for presidential intervention, White and Bemer rounded up 86 technical societies and asked Richard Nixon to declare 1970 "The National Computer Year." When D.O.D. lobbying kept that appeal from reaching the Oval Office, Bemer recruited the presidential science advisor, Edward E. David, to plead the case in person. Nixon listened, then asked for help fixing his TV set. Frantic, Bemer and White beseeched private organizations to call for a voluntary four-digit-year option. But once more the Pentagon's position prevailed. Mindful of government contracts, big business went along. Bemer was reduced to issuing caveats in obscure technical journals. "There are many horror stories about programs, working for years, that died on some significant change in the date," he wrote in the February 1979 issue of Interface Age. "Don't drop the first two digits [of the century]. The program may well fail from ambiguity in the year 2000." The reaction in some quarters, Bemer recalls, was laughter. Quietly, though, the Y2K word was spreading. Two years after Bemer's article was published, technology commentators Joe Celko and Jackie McDonald jokingly wrote in a trade paper, "I have a plan to make a fortune in the year 2000. I will start a company called 'Father Time Software' that does nothing but correct programs and data files that used only the last two digits of the year ... for keeping their records.... We will charge fantastic fees, and clients will have no choice but to pay." In 1983 a Detroit programmer, William Schoen, tried to make the idea a reality.
After stumbling on the date problem while working in the basement of a Big Three automaker, he devised a $995 solution on his home PC, then set up a company to peddle it. The Charmar Correction - the answer for "the problem you may not know you have," as the American weekly Computerworld called it - had exactly two sales.
Nevertheless, warnings about Y2K persisted, including in a book written by an Illinois couple, Jerome and Marilyn Murray. Published in mid-1984, Computers in Crisis: How to Avert the Coming Worldwide Computer Systems Collapse had its genesis on a day when Mrs. Murray, an assistant vice president at an insurance company, was figuring annuities. All went well until she keyed in an annuity due after 2000, at which point the computer spat back "1900," then reams of gibberish. She reported the incident to her husband, a former IBM consultant, who went to his old colleagues for an explanation. The one they provided boiled down to two sentences: This is a user problem - a crisis for which computer users alone are responsible. There is no IBM solution to the problem in this room. Predicting "domestic and international chaos" if someone didn't soon come up with one, the Murrays wrote in their foreword: "We have placed our confidence, physical and economic well-being, and future hope in the development of a technology now seen to be fatally flawed through collective human oversight. What have we done? What will we do?"
Two years later, in South Africa, a programmer named Chris Anderson started asking himself the same questions. The answers he came up with were sufficiently alarming that he took out a magazine ad-"The Timebomb in Your IBM Mainframe System." Big Blue responded: "IBM and other vendors have known about this for many years. This problem is fully understood by IBM's software developers, who anticipate no difficulty in programming around it."
But with every new computer-62 million of which were in use in the U.S. by 1991-the scale of the problem increased. So, too, did the complexity of fixing it. For as was disturbingly becoming apparent, "COBOL Cowboys," as rough-and-ready programmers called themselves, had worked according to whim, sometimes deliberately hiding dates (as this guaranteed they'd later need to be hired back), other times disguising them under the names of girlfriends, cars, even Star Trek characters (because this was thought idiosyncratic and amusing). Thus, "2000 - 1983 = 17" might read as "Gloria + Chevy = Spock." Not that the code needed to be so complicated.
In 1997 the Washington State Department of Social and Health Services discovered that many of its computer functions were being governed by one word: "Bob."

Had the mischief been contained, the impact would have been negligible. But in the name of "downsizing" and "productivity," computers were increasingly running everything. And how they ran never stopped changing, as business kept demanding better, faster, cheaper thises and thats. In the rush, no one bothered to get rid of Grace Hopper's COBOL core. Instead, revisions were piled on top of it, layer upon layer, until a system containing hundreds of millions of records could have thousands of levels, constructed by hundreds of different programmers, each of whom had his own way of doing things. "It's as if you were building a bridge," says Dr. Leon A. Kappelman, co-chairman of the Society for Information Management's Year 2000 working group, "and let every riveter pick their own kind of rivet and drill different holes and use different rivet guns."

The result was "Spaghetti Code," an unending tangle of 0s and 1s. To decipher their meaning - and what it might hold for the millennium - one would have to possess the "source code," a frequently misplaced, decades-old document most of whose writers were retired, deceased, or, as in the case of Alan Greenspan, had gone on to other chores. "I don't remember having any significant documentation," the Fed chairman told Congress. Earlier he had said, "If I were to go back and look at some of the programs I wrote 30 years ago - I mean, I would have one terribly difficult time."
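The trouble that "disguised" dates caused remediation teams can be seen in a toy example. The Python sketch below uses invented identifiers, and real Y2K scanning tools were far more elaborate; it simply shows why a scanner that greps for date-like names catches honest code but sails past a line doing exactly the same two-digit arithmetic under a programmer's pet names:

```python
import re

# Straightforward code: a date-hunting scanner will flag it.
honest = "age = current_year_yy - birth_year_yy"

# The same arithmetic hidden behind whimsical names, as the article
# describes ("Gloria + Chevy = Spock"): nothing for a scanner to see.
disguised = "spock = gloria - chevy"

# A naive remediation scan: look for identifiers that smell like dates.
DATE_HINTS = re.compile(r"year|date|yy|century", re.IGNORECASE)

for snippet in (honest, disguised):
    print(snippet, "-> flagged:", bool(DATE_HINTS.search(snippet)))
```

This is why, absent the source code's documentation, auditors were often reduced to tracing every variable's actual use rather than trusting its name.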
Things were made even harder by the nature of the software beast. "It cannot make a single mistake," said Greenspan, seven months after explaining that "you can be extremely scrupulous in going through every single line of code in all your computer operations, make all the adjustments that are required ... say, 'We have solved the 2000 problem,' and then find that when the date arrives, all of the interconnects that are now built in start to break down."
As the Reagan era drew to a close, few even knew that a problem existed. Bemer had retired in 1982, assuming that Y2K would be ironed out long before it did any damage. If something like Pac-Man could be squeezed onto a chip, he reasoned, how tough could it be to add two digits? Not so confident, Harry White continued to press old co-workers at the Standards Bureau. They were sympathetic, but did nothing. "No one wanted to step up to the plate," says White. "It wasn't politically expedient. The Republicans were saying, 'The less government in your lives, the better.'" So deep ran the animus toward regulation that in 1988 Congress removed individual federal standards for computer purchases altogether.
Within a year, odd things began occurring. One of the first to notice was the Social Security Administration, which was stupefied to learn that a system which tracked collection of overpayments would not accept the date when a repayment schedule went beyond 2000. "We looked at each other and said, 'Whoa!' It didn't take a lot of brains to figure out that if that piece of software acted in that way, we probably were going to have a problem with a lot of software," S.S.A. computer expert Kathleen Adams recalled to a reporter. "And we came to the conclusion that we were going to have to literally look at every piece of software and fix it." Social Security had little choice: every month, its computers sent 50 million payments to 45 million people. "Guess who those people are?" Adams asked. "My mother is one. We ... can't mess up because we don't want to mess up Mom."
Mom notwithstanding, in 1994, Social Security began sifting through what it deemed its mission-critical computer code-some 35 million lines. The work consumed three years, and when it was finished S.S.A. had to turn its attention to millions more lines at the state level. They were a trifle compared with what awaited the Department of Defense, where there were hundreds of millions of code lines running on more than 1.5 million computers, 28,000 automated systems, 10,000 networks, and thousands of individual devices-many directed by exotic programming languages for which there was no readily available fix.
The potential for mayhem was demonstrated in 1993, when curious engineers turned computers ahead at the North American Aerospace Defense Command, which alerts the president of approaching ICBMs. At the simulated stroke of midnight, January 1, 2000, every NORAD warning screen froze. It could have been worse. At other times, electronic foul-ups in Russia have prompted preparations for imminent nuclear attack.
Officially, nothing was done about occurrences like this until October 1995, when then acting assistant defense secretary Anthony Valletta attended a weekend retreat of interagency computer officials. During a break in the proceedings, Social Security's Adams stuffed a Y2K paper in his hand and launched into a hair-raising briefing. "I came running into the Pentagon on Monday and started telling some senior people," Valletta told The Boston Globe. "I got half of them saying, 'What are you, crazy?' and I got some people saying, 'That sounds like something we ought to check into.'" When officials finally did, they were appalled. At the agency that outfits and equips the army, a Year 2000 computer entry deactivated 2,400 inventoried items. Had the error not been caught, critical equipment would have gone missing. In still another instance, a 2000 date could not be processed by the system that keeps track of the deployment of U.S. forces worldwide. "If we built houses the way we build software," a top-ranking D.O.D. official said later, "the first woodpecker to come along would destroy civilization."
With their need for years-ahead planning, insurance companies and mortgage brokers had realized that long before. But most boardrooms were still in denial - understandably, in the view of computer experts Dale Way and Mark Haselkorn. "Why," they wrote in a journal of the Institute of Electrical and Electronics Engineers, "would [anyone] stand up and say, 'Give me $40 million and I'll disrupt our whole information infrastructure, put all of our current operations at risk and, if I'm lucky, do something no one else has done and prevent a problem many people think is not real and will not in any case happen for years, and otherwise will contribute nothing to our bottom line'?" Way and Haselkorn could think of only two kinds of managers who might: the insane and those with guns to their heads.
But in suburban Toronto, Peter de Jager, a bearded, South African-born computer consultant of prodigious girth and with a similarly sized knack for phrasemaking, was about to bring Y2K to the fore. He'd known about the problem since the 1970s, but he hadn't paid it any heed until 1989, when he saw a documentary on the 1965 East Coast power blackout - an event brought about by the failure of a single transmission line. If something so minor could visit such misery on so many, de Jager thought, what would happen if billions of lines of computer code suddenly went screwy?
Over the next four years, de Jager soaked up everything he could about computer dates, power grids, and systems connections. "It scared the hell out of me!" he says. "First, because of what it could do to me and my family. Then something else occurred to me: What the hell is it going to do to the world?" In September 1993, de Jager published his worries in Computerworld. "Have you ever been in a car accident?" he wrote beneath the headline DOOMSDAY 2000. "Time seems to slow down.... It's too late to avoid it - you're going to crash. All you can do now is watch it happen.... We are heading toward the year 2000. We are heading toward a failure of our standard date format.... Unfortunately, unlike the car crash, time will not slow down for us. If anything, we're accelerating toward disaster." De Jager laid out just why - and noted the squandered billions that would be the consequence. "We and our computers were supposed to make life easier," he wrote. "This was our promise. What we have delivered is a catastrophe."
By 1995, "the Paul Revere for the Year 2000 computer crisis," as The New York Times dubbed him, was pounding the lecture circuit. "If today were December 31, 1999," de Jager told audiences, "tomorrow our economy worldwide would stop. It wouldn't grind to a halt. It would snap to a halt. You would not have a dial tone.... You would not have air travel. You would not have Federal Express. You would not have the Postal Service. You would not have water. You would not have power. Because the systems are broken."
Before long, de Jager was delivering 85 speeches a year. A book (Managing 00: Surviving the Year 2000 Computing Crisis) was also in the works, along with pricey seminars on videotape and a 600,000-visits-per-month Web site - www.year2000.com. Such was de Jager's influence that the American Stock Exchange named a listing of Y2K-remediation companies after him. During its first year in operation, the value of the "de Jager Year 2000 Index" jumped 100 percent - two and a half times more than the Dow. Its namesake, who was reportedly pulling in more than $1 million a year, wasn't doing badly, either - a fact de Jager's critics never tired of pointing out. Some scoffed at the need for doomsaying. In 1997, David Starr, chief information officer for Reader's Digest, called the clamor over Y2K "the biggest fraud perpetrated by consultants on the business community since re-engineering." Added Money, "We cope with wars, huge upheavals, natural disasters of all sorts. And now we're going to be stopped in our tracks by a computer glitch?" Even the John Birch Society joined the chorus, suggesting that Y2K could lead to a government power grab reminiscent of the Reichstag fire.
IBM, however, was taking Y2K quite seriously, and as far back as October 1995 had announced a series of steps to "assist customers in timely Year 2000 transitions." That IBM had played a leading role in creating the need for those transitions-and faced the prospect of whopping lawsuits-wasn't mentioned. But the company left no doubt that big trouble was coming. "The problem is large; it's complex," IBM's press release quoted de Jager as saying. "IBM is right ... to address this issue today."
Internet publisher John Westergaard needed no convincing. His friend New York senator Daniel Patrick Moynihan was a different story: he still wrote on a typewriter. But that had not kept the two men from being close, nor had it prevented Westergaard from being Moynihan's campaign treasurer. And so, over lunch one day in early 1996, Moynihan listened intently as Westergaard spun a bloodcurdling tale of a phenomenon he'd never heard of. "I had a fascinating lunch in New York," the senator told reporters when he got back to Washington. "A friend was talking about mad-cow disease for the computers of the world." He wasn't kidding, Moynihan said. "There is a bug in every computer that will cause it to go haywire January 1, 2000, and the federal government better get its act together. If they can't pay their bills and issue checks in a normal fashion it is going to domino to all other things."
When the warning went virtually unnoticed, Moynihan asked the Congressional Research Service to prepare a report on possible Y2K consequences. What came back in June 1996 was chilling: hospital systems failing, airplanes not taking off or landing, records being scrambled-one cataclysm after another. Moynihan passed the news to Bill Clinton in a July 31, 1996, letter, along with a recommendation that the president appoint someone who would ensure that all federal agencies-and the companies that did business with them-be Y2K-compliant by January 1, 1999. "The computer has been a blessing," Moynihan closed. "If we don't act quickly, however, it could become the curse of the age."
Moynihan was not telling Clinton anything he didn't already know; eight months earlier, Howard Rubin, chairman of the computer-science department at Hunter College, had briefed the president in detail. "Clinton understood that technology is more than the Internet and pulling wires through high schools," says Rubin. "He really understood how everything was tied together [and Y2K's] potential for broad-reaching consequences. He was very interested and very concerned." Al Gore was slower on the uptake. "How could this be a problem in a country where we have Intel and Microsoft?" he exclaimed when Rubin finished. Rubin shot back, "No way are you going to be able to run for office in 2000 if government systems are failing around you." Gore had no reply. "He was educatable," says Rubin, "but with effort."
Moynihan, meanwhile, was getting only silence. Finally, three months after sending his letter, he received a reply from the Office of Management and Budget (O.M.B.). It thanked him for the heads-up and pledged to keep an eye on the problem.
Far from reassured, Moynihan introduced bills calling for the designation of a Y2K czar, the establishment of compliance deadlines, and a bipartisan national commission to address what was called "a devastating computer problem which will have extreme negative economic and national security consequences unless dealt with."
The legislation went nowhere, even as reports of Y2K incidents piled up. In Pennsylvania, a computer network that scheduled patient appointments at three hospitals and 75 clinics shut down after someone punched in a visit for January 2000. In Michigan, a produce store's brand-new cash registers crashed more than 100 times when customers tried to pay with credit cards expiring in 2000. In Minnesota in 1993, officials instructed 104-year-old Mary Bandar to report to kindergarten after a computer took her 1888 birth date to mean that she was 4 years old. During a Y2K test-run at a Maryland jail, computers decided that inmates who still had time to serve were ready to be released. Industry was getting hit as well. At Amway, a mixing system for a cleaning product rejected a batch of chemicals when a computer read a 2000 expiration date as 1900. At a Chrysler plant, a Y2K dry run locked every entryway and exit and wouldn't let anyone in or out. At a Fortune 500 financial-services company, computers sent out bills for 96 years' interest.
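The arithmetic behind these incidents is easy to reproduce. Here is a minimal Python sketch of the flaw (the function name and the "prefix 19" convention are illustrative assumptions, not any particular system's code): when only two digits of the year are stored, every date silently lives in the 1900s.

```python
from datetime import date

def age_from_two_digit_year(yy, today):
    # The flawed legacy assumption at the heart of Y2K:
    # every two-digit year belongs to the 1900s.
    birth_year = 1900 + yy
    return today.year - birth_year

# Mary Bandar's 1888 birth date, stored as "88", reads back as 1988,
# making a 104-year-old woman kindergarten-aged:
print(age_from_two_digit_year(88, date(1993, 1, 1)))  # prints 5, a century off

# Billing across the boundary misfires the same way: year "00" minus
# a two-digit start year goes negative instead of forward a few years.
elapsed = 0 - 96   # "00" minus "96"
print(elapsed)     # prints -96, not 4
```

Depending on which way a given system did the subtraction, the error surfaced as a negative interval, a refused transaction, or, as in the Fortune 500 billing case, nearly a century of phantom interest.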
Glitches were also surfacing abroad. In Britain, computers at the Marks & Spencer department stores ordered a consignment of corned beef discarded after deducing that the "02" sell-by date meant that the meat was 94 years old. In Canada, the computer at a university admissions office created letters on behalf of foreign students informing Canadian immigration they had graduated 95 years earlier.
There were more ominous incidents: a Y2K test at a Honolulu electric company that simulated pulling the plug on some customers and sending too much electricity to others, which would have caused fires and exploding appliances; a Y2K run-through that simulated a cutoff of gas-detecting systems on a North Sea oil platform, which would have shut down its operations under normal conditions. The list stretched on-all too uncomfortably for most of big business. In 1996 its doubts were gone and the fix-it funds were pouring in.
The strategy for killing the millennium bug was relatively straightforward, and there were several methods of going about it. All, however, involved massive infusions of manpower. To fix everything in the United States, it was estimated, would require every single one of the 1.92 million software professionals in the country to devote nearly five full months to working on Y2K. Problem No. 1 was that this was impossible. Problem No. 2 was that at a normal work pace the U.S. was at least 700,000 bodies short.
For big business, this produced Problem No. 3: having to pay whopping prices for the help that was available. In some instances, technology managers were commanding salaries of up to $1 million a year. In others, senior programmers from the United Kingdom (a favorite recruiting ground for talent) were demanding-and getting-$7,500 per day, plus Concorde flights home every weekend. When, despite the lavish perks, the programmer shortage persisted, global investor George Soros and the prime minister of Bulgaria offered a solution: hire low-cost techies from the Balkans.
Luring programmers was only the first step, however. Then followed months of inventorying, analyzing, and repairing, and an equal number of months testing the results. At each stage, the size of the problem grew. One U.S. bank thought it had 60,000 PCs; on second look, it came across 50,000 more. At General Motors, where there were two billion lines of computer code and 100,000 outside suppliers-any one of which could shut down the assembly lines-most executives believed that there would be no Y2K troubles on the factory floor. Then a new team came in and found what were described as "catastrophic problems" at every plant. With each new discovery, costs spiraled. While watching his company spend $500 million on Y2K fixes in 1997 and 1998, AT&T chairman and C.E.O. C. Michael Armstrong said, "They were given an unlimited budget and they managed to exceed it."
But money alone was no guarantee of success. On average, every 7 out of 100 software repairs produced a new computer error. The fixed 2000 deadline was also trouble, since only one in five major software projects ever finishes on time. Then there was something called the "Crouch-Echlin Effect," which, if valid (about that, opinions were divided), could cause a fully fixed computer to randomly switch dates, wipe out data, make wrong calculations, and perhaps not start up at all.
That was not the end of the problems. In a once-every-400-years exception within the Gregorian calendar, 2000, unlike 1900, 1800, and 1700, is a leap year-a fact programmers had neglected to tell their electronic charges. Some companies have already learned this the hard way. A New Zealand minerals-processing plant, for instance, discovered an unplanned-for extra day in 1996. Production lines stopped, the liquids hardened, and by the time everything was straightened out nearly three quarters of a million dollars in damage had been done. Halfway around the world, a similar glitch turned away 50,000 people who wanted to play the Arizona lottery. To add further vexation, the F.B.I., according to the New York Post, had turned up yet another Year 2000 threat: the Mafia. Sniffing ill-gotten gain, organized crime was recruiting Y2K technicians to plant "trapdoors" in computers, which at an opportune moment would funnel corporate funds into Mob accounts.
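The leap-year trap comes from truncating the Gregorian rule, which has three clauses: every fourth year is a leap year, except century years, except century years divisible by 400. Many programmers stopped after the second clause. A short illustrative Python sketch (not any vendor's actual code) shows why the shortcut happens to work for 1900 but fails for 2000:

```python
def is_leap(year):
    # Full Gregorian rule: divisible by 4, except centuries,
    # except every 400th year.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_truncated(year):
    # The common shortcut that stops at the century exception.
    return year % 4 == 0 and year % 100 != 0

print(is_leap(1900), is_leap(2000))  # prints: False True
print(is_leap_truncated(2000))       # prints: False -- the bug: 2000 IS a leap year
```

A system using the truncated rule simply had no February 29, 2000, which is why date-driven machinery like the New Zealand plant's choked on the "extra" day.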
Capitol Hill also began to stir. In the House, California congressman Stephen Horn's Subcommittee on Government Management, Information, and Technology began assigning preparedness grades for federal agencies. The first report card, released on July 30, 1996, was "not something you'd like to take home to your parents," Horn said: "F" for Labor, Energy, FEMA, and Transportation (whose chief was indifferent to the problem); "D" for Veterans Affairs, NASA, Justice, Interior, HUD, Health and Human Services, Agriculture, Commerce, and the E.P.A. (none of which had any action plan, or any estimate on how much a fix would cost). On the Senate side, Robert Bennett's Y2K subcommittee was having trouble getting straight answers. "People lie to us or they refuse to talk to us," he said. Referring to his attempts to raise awareness abroad, he later said, "I've met with C.E.O.'s of major foreign corporations, pleading with them to get involved with this. And I meet with blank stares.... Nobody cares."
Y2K, meanwhile, was looking more and more like a job for the Sorcerer's Apprentice. The first new mess was the discovery that virtually all the 400 million desktop computers in the world-including the 100 million used by U.S. business-were century-bugged as well. Cleansing each would require only a few hours. The catch was that some companies had strung together tens of thousands, and a glitch missed in one could infect all the others.
In 1995, British engineers found that microprocessors-the dime-size "embedded chips" contained in everything from escalators to microwave ovens-were Y2K-susceptible as well. The good news was that less than 10 percent of them carried dates, and of those only a fraction would have problems. The bad news was that there were at least 30 billion embedded chips around the globe, and no way of telling which were at risk without checking all of them. And that would be tricky, since many chips contained dates for no reason (though this made them no less dangerous). In other cases, identical chips functioned differently-one Y2K-secure, its twin anything but. Testing did not ensure a sound sleep, either, as some chips would function fine through the millennium, only to fail years later.
The last quirk was the killer: most chips were so ingeniously tucked away that they were impossible to fix. Instead, the equipment they ran-whether a $5 toy or a $119,000 medical-imaging device-had to be replaced. When the various problems were totaled, the cost of making the world safe from chips was put at $300 to $600 billion, more than the price tag of the entire Vietnam War, according to Merrill Lynch.
Not until 1997 did word of this disaster-in-the-making reach the United States. "Nobody thought of it," says David Hall, senior computer consultant at ACS Technical Solutions in Chicago. "[Microprocessors] don't look like the stuff with flashing lights you see at IBM, and they don't look like the PC sitting on your desk. It's hard to realize if you don't work with them all the time, these things actually run off software." When the message was finally delivered, during testimony before Horn's subcommittee, Ann Coffou, a senior manager with the Giga Information Group, sketched Armageddon: elevators halting, 911 services collapsing, pacemakers failing, water pipes bursting, fax machines stopping, weapon systems switching on. "The overall effect on the economy worldwide," she said, "could be monumental."
As if to underscore the warning, reports of trouble began appearing. At a Boston bank, so severe were the Y2K glitches in cash-advance terminals that all of them had to be junked. Meanwhile, Visa, MasterCard, and American Express waited more than a year to issue cards that expire in 2000, until they could be sure their point-of-sale machines would accept them. That, at least, was only an inconvenience. During a Y2K test at a manufacturing plant, a flawed chip switched a pressure-monitoring system to default, which could have blown up a steam pipe. Identifying the source of such mishaps wasn't easy. In a Y2K test that could have led to the shutdown of a power plant, the trouble was traced to a single chip in a temperature sensor located in a chimney. Reflecting on the food that might go undelivered, the heating oil unpumped, the electricity ungenerated-all because of bits of silicon smaller than a postage stamp-Dr. Leon Kappelman has said, "This isn't the inconvenience part where your paycheck comes a few days late. This is the blood-in-the-streets part."
But the White House was serene. "We have a high degree of confidence that the important services and benefits will continue through and after the new millennium," said O.M.B. official Sally Katzen, the administration's Y2K point person, in September 1997. "It is my expectation that when we wake up on January 1 in the year 2000 the millennium bug will have been a non-event." Counters Maryland congresswoman Connie Morella, who co-chairs a Y2K task force, "It's the American concept that if there's a computer problem, we don't have to worry.... Bill Gates or someone will come up with a magic bullet."
Microsoft's C.E.O. and chairman had no such plans. Y2K was a mainframe problem-and a "pretty simple" one at that-he told the press after hosting a "summit" of 100 industry C.E.O.'s in May 1997. All anyone needed to do to solve it was to move off mainframes and onto PC systems (which, happily enough, run on Windows).
Critics noted that Windows 3.0, 3.1, and 95 all needed fixing. But Gates wouldn't give ground. "There is no problem with programs," he insisted, turning the blame onto those who use them. "There is no problem with PCs and with packaged software." As for the alarm Y2K was generating, that, Gates said, was the fault of those who "love to tell tales of fear." Not that he seemed to mind. Thanks to the worries about Y2K, he said, the personal computer industry could expect "a little bit of a windfall."
These comments infuriated Y2K experts. "He's not only slowed the whole effort down by a year," Y2K consultant William Ulrich says of Gates, "he's also guilty of having the problem and downplaying it. Sin on top of sin."
Finally, in March 1998, after Jason Matusow, Microsoft's Y2K strategy manager, admitted that the company's having been "slow in addressing this issue" was "a mistake," Microsoft erected a Web site listing Y2K-related issues with dozens of products. "It's surprisingly complex for something that seems so simple," Gates's lieutenant later said. "The issue is not in any one component. It's the mix of components that's so dangerous."
The same was becoming increasingly clear in Washington, where, at a number of agencies, progress on Y2K moved backward. The Agency for International Development, for one, had been given an "A" on Horn's last report, for announcing plans to replace its entire computer system. Now, A.I.D.'s grade had dropped to "F" because the computers it bought were not Y2K-compliant. These troubles, though, paled in comparison to those of the Federal Aviation Administration (F.A.A.).
The problems began with arithmetic. According to the F.A.A., it would meet the Y2K conversion deadline with time to spare; according to the data on which it based the claim, more than 60 percent of its computers would not be running on New Year's Day 2000. The F.A.A. also appeared to have difficulty determining when employees would retire, since the official appointed to work on Y2K had recently done just that. Lethargy was also an F.A.A. vice: by February 1998, its work on Y2K had fallen seven months behind schedule.
Worse, 34 of the 100 F.A.A. systems on which the flying public's safety depended would suffer "catastrophic failure" if not repaired. And that would be tricky, because IBM said the computers were so old, it would not fix them. "What is contingency planning for the F.A.A.?" asked Yardeni. "Binoculars?"
White House computers hadn't been fixed, either, and as yet no one fully knew their readiness for the year 2000. Frustrated, Horn cornered Clinton at the summer 1997 congressional picnic. "Look," he said to the president, "you've got to give leadership. The person you most admire is Roosevelt. And his most famous phrase is 'We have nothing to fear but fear itself.' You need to explain that to the American people in a fireside chat." Clinton promised he would, and Horn sent him Y2K materials. But that was their last contact. "Normal Clinton behavior," says Horn. "They play this game of going right to the edge." Having gotten nowhere with Clinton, either, Morella turned her attentions to Gore, lobbying him during a December 1997 White House reception. "I want the president to say something," she told Gore, "and we should have a [Y2K] czar." She added that he would fit the bill perfectly. "It's hard to find the time,"
Gore said jokingly. "Why don't you do it?" Finally, in February 1998, Clinton signed Executive Order 13073, which set up the Council on Year 2000 Conversion, a high-level group that was to help fix the government's "mission-critical" computer systems-some 7,336 in all. The man in charge was O.M.B. Deputy Director for Management John Koskinen.
Announcing that he would personally demonstrate his confidence in the nation's electronic systems by flying from New York to Washington on millennium New Year's Eve, Koskinen went on to say, "There is no indication at this time that there will be major disruptions for the American people."
Others weren't so sanguine. Within weeks of Koskinen's appointment, the General Accounting Office (G.A.O.) issued a report saying that only 35 percent of the government's "mission-critical" systems-those that the departments cannot function without-had been fixed; at the current pace, it was highly doubtful the rest would be finished by 2000.
The press, for its part, lambasted the administration. "When it's time to talk technology, Vice President Gore never seems to be at a loss for words," wrote Stephen Barr and Rajiv Chandrasekaran in The Washington Post. "But when it comes to the Year 2000 computer glitch, arguably the nation's most pressing technological problem, Gore has been strikingly silent. There have been no public speeches, no 'town hall' meetings, no photo-ops with programmers."
The O.M.B. was also sounding the alarm, and so was the nonpartisan G.A.O. "No one is in charge," the G.A.O.'s top computer scientist, Rona Stillman, told a Y2K conference. "In essence," she said, "our entire way of life is at risk."
On July 14, 1998-two years after receiving Moynihan's letter-Clinton at last spoke up. "Any business that approaches the New Year armed only with a bottle of champagne and a noisemaker," he declared during a daytime address to the National Academy of Sciences, "is likely to have a very big hangover New Year's morning." What exactly would happen, Clinton didn't claim to know. On the one hand, he said, "the consequences of the millennium bug, if not addressed, could simply be a rash of annoyances, like being unable to use a credit card at the supermarket." On the other, "it could affect electric power, phone service, air travel, major governmental service." Whichever, he was going to ask Congress to pass a law or two to do something about it.
Almost as Clinton was speaking, the Department of Defense was admitting that of the 430 mission-critical systems claimed to have been fixed the previous November, only a quarter actually had been. "Anyone who dared report anything besides [compliance]... was worried," said D.O.D.'s Y2K chief, William Curtis. "That's because we shot the messengers."
Other agencies were also hedging the truth. A G.A.O. audit found that 15 of the systems that the Agriculture Department claimed were Y2K-ready were, in fact, only dreamed-about projects. But the F.A.A. appeared to have been honest about some things-such as admitting that it didn't know whether its new computers were Y2K-compliant-and made the extraordinary claim that it had done 14 months' worth of Y2K repairs in 6 months' time.
The gathering dismay put House Speaker Newt Gingrich in a mellow mood. "Year 2000 is a total downside risk for Clinton and Gore," he said. "If nothing goes wrong, they get no gain whatsoever. If [government computers] crash and burn, Mr. Information Superhighway has a real problem and will get a lot of heat."
As 1998 wound down, every silver lining in the Y2K story seemed to be accompanied by a darker cloud. The government, by most reckonings, seemed to be doing better (even if the F.A.A.'s new software was losing aircraft at O'Hare), and Charles Rossotti's worries had eased enough that he could now say that key I.R.S. systems would be Y2K-compliant by January 1999; he admitted nonetheless that I.R.S. machines had sent taxpayers bills for $300 million. But fears were growing for the private sector, where preparedness was woefully laggard for small and medium-sized companies, and even more so for those overseas, where the situation in some developing countries was of such gravity that Joyce Amenta, who was then the United Nations' director of information technology, was warning of bank panics, trade ruptures, and civil unrest. Koskinen adds, "We are in a high-wire balancing act. We've got to get people to take a serious problem seriously. On the other hand, one of our risks is overreaction.... I need to give people the facts as we know them. I won't speculate that the world is coming to an end."
Chris Dodd wouldn't, either. But he'd seen and heard enough to know that there were three places where no one should be on New Year's Eve: "in an elevator, in an airplane, or in a hospital." Bennett-whose daughter had decided to fill her garage with food, on the assumption that, after 2000, none would be available-was only marginally more optimistic. "Of course the power grid is going to work," he said, during a Y2K Risk Assessment Task Force public forum. "That's based on the assumption that the telephones will work, and that's based on the assumption that the power grid is up. The banking systems are going to work just fine, so long as all the telephones work and as long as there's no brown-out problem on the power side. And the health-care system is going to work just fine as long as the financial system works. It's all so interconnected, we're not going to know until we go through it whether it will really work or not."
Whatever happens, the pity is that Grace Hopper will not be around to see it. The "Grandmother of the Computer Age," as she was known, once said that she hoped to live until the year 2000. "I have two reasons," she said. "The first is that the party on December 31, 1999, will be a New Year's Eve party to end all New Year's Eve parties." And the second? "The second is that I want to point back to the early days of computers and say to all the doubters, 'See? We told you the computer could do all that.'"