The Need For Bit Conservation

"I'm one of the culprits who created this problem. I used to write those programs back in the 1960s and 1970s, and was proud of the fact that I was able to squeeze a few elements of space out of my program by not having to put a 19 before the year. Back then, it was very important. We used to spend a lot of time running through various mathematical exercises before we started to write our programs so that they could be very clearly delimited with respect to space and the use of capacity. It never entered our minds that those programs would have lasted for more than a few years. As a consequence, they are very poorly documented. If I were to go back and look at some of the programs I wrote 30 years ago, I would have one terribly difficult time working my way through step-by-step."

—Alan Greenspan, 1998

In the first half of the 20th century, well before the computer era, business data processing was done with unit record equipment and punched cards, most commonly the 80-column variety used by IBM, which dominated the industry. Many tricks were used to squeeze needed data into fixed-field, 80-character records, and saving two digits on every date field was a significant part of that effort.
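
The economics are easy to see in miniature. Below is a minimal sketch in C; the column layout (account number, name, YYMMDD date) is hypothetical, invented purely for illustration. In a fixed-field record, fields are located by column position alone, so every column trimmed from a date field was a column freed for other data.

    #include <stdio.h>

    /* Hypothetical layout of one fixed-field 80-column record:
     *   cols  1-6   account number
     *   cols  7-26  name
     *   cols 27-32  date as YYMMDD (a two-digit year saves two columns)
     *   cols 33-80  other fields / filler
     */
    int main(void) {
        char card[81];
        snprintf(card, sizeof card, "%-6s%-20s%02d%02d%02d%-48s",
                 "001234", "JOHN Q PUBLIC", 99, 12, 31, "");

        /* There are no delimiters; fields are found by position alone. */
        char yy[3] = { card[26], card[27], '\0' };
        char mm[3] = { card[28], card[29], '\0' };
        char dd[3] = { card[30], card[31], '\0' };

        printf("card date: %s-%s-%s (century implied by convention)\n",
               yy, mm, dd);
        return 0;
    }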

In the 1960s, computer memory and mass storage were scarce and expensive. Early core memory cost one dollar per bit. Popular commercial computers, such as the IBM 1401, shipped with as little as 2 kilobytes of memory. Programs often mimicked card-processing techniques, and commercial programming languages of the time, such as COBOL and RPG, processed numbers in their character representations. Over time the punched cards were converted to magnetic tape and then to disk files, but the structure of the data usually changed very little; data was still entered on punched cards until the mid-1970s. Machine architectures, programming languages, and application designs were evolving rapidly, and neither managers nor programmers of the time expected their programs to remain in use for many decades. The realization that databases were a new kind of artifact, with different characteristics and lifetimes than the programs that processed them, had not yet taken hold.
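
That character-oriented style had a direct consequence for date handling. Here is a minimal sketch, assuming dates stored as digit characters and compared as strings, the way record comparisons and sorts typically worked: with two-digit years, 1 January 2000 orders before 31 December 1999.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *last_1999  = "991231";  /* YYMMDD, 31 Dec 1999 */
        const char *first_2000 = "000101";  /* YYMMDD,  1 Jan 2000 */

        /* Character comparison sorts "00" before "99", so the new
         * century appears to precede the old one, and any age,
         * interest, or expiry calculation built on it goes wrong. */
        if (strcmp(first_2000, last_1999) < 0)
            printf("bug: %s compares as earlier than %s\n",
                   first_2000, last_1999);

        /* With four-digit years the same comparison is correct. */
        if (strcmp("20000101", "19991231") > 0)
            printf("fix: 20000101 correctly follows 19991231\n");
        return 0;
    }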

There were exceptions, of course. The first person known to have publicly addressed the issue was Bob Bemer, who noticed it in 1958 as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the U.S. government, and the ISO aware of the problem, with little result. Among his recommendations was that the COBOL PICTURE clause be used to specify four-digit years for dates. Despite magazine articles on the subject from 1970 onward, most programmers and managers only began to recognize Y2K as a looming problem in the mid-1990s, and even then inertia and complacency left it largely unresolved until the last few years of the decade. In 1989, Erik Naggum was instrumental in ensuring that Internet mail used four-digit years, by including a strong recommendation to that effect in the Internet host requirements document, RFC 1123.
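
To make the RFC 1123 point concrete: the older RFC 822 mail date format carried a two-digit year, which RFC 1123 widened to four digits. A small C sketch renders both styles; the strftime patterns are illustrative approximations of the two formats, not text taken from either RFC.

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        struct tm *utc = gmtime(&now);
        char old_style[64], new_style[64];

        strftime(old_style, sizeof old_style,
                 "%a, %d %b %y %H:%M:%S GMT", utc);  /* RFC 822 style  */
        strftime(new_style, sizeof new_style,
                 "%a, %d %b %Y %H:%M:%S GMT", utc);  /* RFC 1123 style */

        printf("two-digit year:  %s\n", old_style);
        printf("four-digit year: %s\n", new_style);
        return 0;
    }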

The habit of saving space on stored dates persisted into the Unix era: most systems confined a date to a single 32-bit word, typically holding it as the number of seconds elapsed since some fixed date.
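
A brief sketch of that convention, assuming the Unix choice of epoch, 00:00:00 UTC on 1 January 1970 (other systems picked other fixed dates): a signed 32-bit seconds counter is compact, but, like the two-digit year, it has a horizon, reached in January 2038.

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        int32_t last = INT32_MAX;   /* 2^31 - 1 seconds after the epoch */
        time_t when = (time_t)last; /* widen for the library call */

        /* Prints Tue Jan 19 03:14:07 2038 (UTC); one second later, a
         * signed 32-bit counter wraps negative. */
        printf("last 32-bit second: %s", asctime(gmtime(&when)));
        return 0;
    }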
