History of Y2K 
The information below is from the garynorth.com website: 

 Subject: Who Gave Us the Two-Digit Year? The Pentagon, Says Ex-Programmer. 

Link:   http://www.dallasnews.com/technology-nf/techbiz1.htm 
  (The article at this link has apparently changed.)
Gary North Comment: 

This article tells of a former Pentagon programmer who says that, in the early 1960s, he saw that Y2K would hit, and he warned his superiors. They ignored him. The Pentagon then went on to mandate a two-digit year. 

If true, then we can say: "The military-industrial complex giveth, and it also taketh away. Sorry about that." 

This is from the DALLAS MORNING NEWS (Oct. 4). 

                 * * * * * * * * * 

. . . Harry S. White Jr., a young data elements code specialist at the Defense Department, argued for using all four numbers. . . . 

"The light went on for me in the early 1960s," he said. "I could see many applications by necessity needed a four-digit year. Personnel, medical records, retirement. When you do those with two digits, it will cause considerable problems. . . . was not too popular back then." 

Two more numbers. Two extra keystrokes. If the Pentagon had agreed, one of history's strangest calamities could have been avoided. 

Instead, at the Pentagon's urging, the first federal data processing standard for calendar years called for a two-digit year, leaving millions of computers without the ability to comprehend the turn of the century. . . . 

The first article alerting computer programmers to the millennium bug - "Time and the Computer" in the February 1979 issue of Interface Age - was written by Dallas computer pioneer Bob Bemer. . . . 

Edward Yardeni, chief economist with Deutsche Bank Securities in New York, also assails management. 

"Where are the editors? Anyone can write software, but nobody edits it to conform to a grammar," he said. "We've put together a global network in the last 20 to 30 years without any adult supervision." 

The Defense Department recognized the need for a computer grammar. It was the largest user of computers in the world, and many of the conventions it established became standard industry practice. 

The Pentagon convened a Conference on Data Systems Languages in the late 1950s that produced COBOL, the Common Business Oriented Language. 

Mr. Bemer, then with IBM, was one of the designers. His COBOL Picture Clause allowed programmers to use either four- or two-digit years for calendar dates. 
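The consequence of that choice can be sketched outside COBOL as well. The following Python sketch is illustrative only (the function name and field logic are not from the article): a fixed-width field that keeps just the last two digits of the year, as a COBOL PIC 99 clause does, discards the century, so dates from 2000 onward sort before earlier ones.

```python
def year_field(year: int, digits: int = 2) -> str:
    """Store a calendar year in a fixed-width field, the way a COBOL
    PICTURE clause of 9s would: two digits (PIC 99) or four (PIC 9999)."""
    return f"{year % (10 ** digits):0{digits}d}"

# With four digits, chronological order and text order agree.
assert year_field(1999, 4) < year_field(2000, 4)   # "1999" < "2000"

# With two digits, the century is discarded and ordering breaks:
# the year 2000 becomes "00", which sorts before 1999's "99".
assert year_field(2000, 2) < year_field(1999, 2)   # "00" < "99"
```

That broken ordering is exactly what wrecked sorting, age arithmetic, and expiry checks in two-digit-year systems at the rollover.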

Computers also needed a way to translate data into numbers as binary code. In 1960, Mr. Bemer developed the American Standard Code for Information Interchange. 

ASCII and COBOL were two major standards that spread to computer users around the world, with the Pentagon leading the way.  Less noticed was an effort to determine how programmers should enter data elements such as units of measure, time and dates. 

A committee was formed to draft data standards for what is now called the American National Standards Institute. 

Mr. White, Mr. Bemer, Washington management consultant Vico Henriques and 45 other computer specialists in government and industry tried to write a grammar the computer world would accept. 

Mr. Bemer developed the committee's scope of work. Mr. Henriques was the secretary. Mr. White became chairman of a subcommittee on Representations of Data Elements. 

At first, the millennium meltdown was not the big worry. But some committee members were thinking about two calendar anomalies. . . 

It took the committee 10 1/2 years to reach a consensus. But the federal government could not wait. 

President Lyndon Johnson's Bureau of the Budget issued a directive in 1967 calling for data processing standards. Circular No. A-86 said the National Bureau of Standards, with input from affected agencies, would write the rules. Exemptions would have to get approval from the Budget Bureau. 

At the time, the National Bureau of Standards did not have a strong data processing branch. The Defense Department had done the most work. 

Mr. White remembers those days as a time when he was on the losing side of several Pentagon arguments. 

"There were a number of meetings where we discussed the [calendar] standard" and whether four digits or two were the way to go, he said. "This was always a bone of contention." 

Mr. White sent a résumé to the National Bureau of Standards and was hired as a senior systems analyst. 

Even though he was now in a position to author a standard requiring the four-digit year, he was in no position to prevail over the Pentagon's wishes. 

The federal standard for calendar dates was issued on Nov. 1, 1968, as Federal Information Processing Standards PUB 4. Mr. White won the debate over sequence. Federal programmers were to enter dates starting with the year, followed by the month and then the day. 
      
But the standard called for a two-digit year. 

"The first two positions represent the units and tens identification of the Year. For example, the Year 1914 is represented as 14, and the Year 1915 is represented as 15," FIPS PUB 4 specified. 
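The standard's year-month-day layout can be sketched as follows. This is an assumed illustration (the parser and the fixed 1900 pivot are mine, not part of FIPS PUB 4): the six-character format carries no century at all, so any program reading it had to guess one.

```python
from datetime import date

def parse_fips4(yymmdd: str, century: int = 1900) -> date:
    """Parse a FIPS PUB 4 style date: two-digit year, then month, then day.
    The century argument is an assumption the caller must supply, because
    the standard's two-digit year cannot encode it -- the Y2K problem."""
    yy, mm, dd = int(yymmdd[0:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
    return date(century + yy, mm, dd)

# "140704" reads back as July 4, 1914, as the standard intended...
assert parse_fips4("140704") == date(1914, 7, 4)
# ...but "000101", meant as January 1, 2000, decodes as 1900 under
# the same fixed assumption.
assert parse_fips4("000101") == date(1900, 1, 1)
```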

The standard took effect Jan. 1, 1970. 

Mr. White now feels it was a calamitous mistake. 

"I wish we could change things," Mr. White said. "But that was the political scene at the time. You couldn't go forth with a federal standard if you did not have approval or agreement from the Department of Defense." 

Many industries - banks, securities, insurance - also objected to the four-digit standard, Mr. Henriques said. But if it had become the federal government's standard, any company doing business with Washington would have been compelled to follow suit. 

"If you tell GM it's got to be four digits in all your dealings with the government, GM will shrug its shoulders and go along," Mr. Henriques said. . . . 

Gary Fisher is a computer scientist at the National Institute of Standards and Technology (the current name for the old National Bureau of Standards) and heads efforts to coordinate ways to fix the Year 2000 problem. 

He said two keystrokes may not seem like a lot today, but they made a difference in the 1960s. 

"Punch cards only had 80 columns, only had room for 80 characters," he said. "Anywhere you could save room was appreciated. Two digits off a year made a big difference." 

Mr. White did not give up. The federal standard recognized the need for numbers extending into the next century and allowed the use of a four-digit year. 

He said he hoped to prevail in a rematch with the Pentagon. 

Yet in the peculiar world of American standards, winning means reaching a consensus. The American National Standards Institute brings together as many major users in government and industry as it can in developing standards. The end product is more guidance than rule. The standards are voluntary. 

Mr. White's committee, with him and Bob Bemer in the lead, proposed a four-digit year. 

The Department of Defense balked. 

"The Defense Department was protesting out of fear that, if it became the government standard, it would be enforced," Mr. Henriques said. "Everybody was trying to crank things down so they could get more bang for their buck." 

It seemed to Mr. White and Mr. Bemer that the computer world was racing ahead of its nominal masters in government and industry. 

Mr. Bemer recruited 86 technical societies to appeal for a federal "Year of the Computer" proclamation. 

"The computer industry was exploding like crazy," Mr. Bemer said. "We said, 'We don't know what we're doing. Let's pause and take a look.' I was going to use this as a platform to sell the four-digit year." 

In 1970, President Nixon refused. Mr. Bemer said he never learned why. . . . 

                 Loose standards 

In July 1971, the standards subcommittee published ANSI X3.30-1971, a compromise standard on calendar years. Four digits were preferred, but programmers could drop two if they wished. 

A similar standard was adopted in Geneva by the International Organization for Standardization. But the two-digit year was already the norm. . . . 

Mr. White left the National Bureau of Standards in 1981 and put together a software company for churches. "We did birthdays as part of the system, so we used four-digit years," he said. "All those programs are Y2K compliant." 

He became an administrator at West Virginia University, retiring in Morgantown in 1997. Today, at 64, he is the West Virginia president of the Bible-distributing Gideons International. 

He said he is still amazed at how loosely the information technology industry treats standards. 

"We saw the problem coming, certainly in the 1980s, and we knew about it in the 1970s," he said. "You look at prescription drugs, and the hard rules there are for standards, and you realize something with teeth in it should have been done with this." 

The National Institute of Standards and Technology revised the federal standard in 1988, warning of the millennium bug and saying a four-digit year "is highly recommended." 

Last year, ANSI revised its standard, calling flat-out for a four-digit year. As usual, compliance is voluntary. 

