Bits and Bytes and Y2K

Several people have asked about my friend Jack, who failed to

show up to give his four lectures in our battery course a couple of

weeks ago. Rest assured he’s alive and well. The week after the

course, I got a call from him, all set to give his lectures the next

day. He had entered the wrong dates in his schedule!

One date that everyone seems to have down quite accurately, and

is either looking forward to or dreading, is next New Year’s Day.

With all sorts of predictions that Y2K will either be a disaster or a

non-event, it seems certain that the truth will lie somewhere in

between these two scenarios. I thought for those who aren’t

totally familiar with the root causes of the potential Y2K

problem, I’d share with you my own limited experiences with

computer programming and discuss a bit about “bits” and “bytes”.

My first contact with programming at Bell Labs came in the ’50s,

when I learned the rudiments of a computing language called

FORTRAN. We would write a program in FORTRAN

(FORmula TRANslator) and take it to our computer center, with

its banks of large IBM machines with their reels of tape spinning

feverishly. The FORTRAN program was “compiled”, a process

that resulted in us receiving a stack of punched cards that could

be many inches thick. This stack of cards was essentially the

translation of FORTRAN into machine language. The stack was

then run through the computer and the results printed out. Woe

unto you if you happened to lose, misplace, bend, mutilate or fold

a card.

Before going to my next programming experience, let’s look at

what it takes to store a number or character. The starting place is

the “bit”. In your computer, a bit corresponds to a transistor

circuit element being at a low voltage or a high voltage. (This

transistor element “leaks” current and has to be continuously

“refreshed” to maintain its state as high or low. When you turn

off your computer, everything leaks away and you have to reboot

when you turn the computer back on.) This transistor circuit

element is essentially a switch that is either “on” or “off” (a “1” or

a “0”). We see then that a single bit can only store 2 pieces of

information (1 or 0).

With 2 bits, you have one bit either “1” or “0” and the other bit

either “1” or “0”. These four combinations let you store 2 x 2 = 4

pieces of information. If you have 8 bits, as you might guess, you

can store 2 multiplied by itself 8 times, that is, 2 x 2 x 2 x 2 x 2 x

2 x 2 x 2 = 256 pieces of information. This is enough storage

capacity to store the alphabet, capitals and lower case, the

ten digits 0-9, commas and periods, and all the common symbols

one encounters in ordinary printing of documents. We call these

8 bits a “byte”; the first 128 of these codes make up the ASCII (American

Standard Code for Information Interchange) character set. In your

computer, the silicon chip called the central processing unit

(CPU) and probably some other auxiliary chips are programmed

to recognize and process these combinations of “1s” and “0s” in

your computer memory and tell your monitor or printer to display

the proper outputs.
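The counting above is easy to check in a few lines of Python, a language that didn’t exist in those FORTRAN days but serves as a convenient modern stand-in:

```python
# Each added bit doubles the number of distinct patterns a value can hold.
for bits in (1, 2, 8):
    print(bits, "bits ->", 2 ** bits, "patterns")   # 2, 4 and 256 patterns

# A byte's 256 patterns are more than enough for letters, digits and symbols.
# ord() gives the numeric code stored for a character; chr() reverses it.
print(ord("A"), ord("a"), ord("0"))   # 65 97 48
print(chr(66))                        # B
```

Every character you type is really just one of these small numbers sitting in a byte of memory.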

If you program the computer properly, your single byte could also

be made to store the digital equivalents of 256 numbers ranging

from 0 to 255. Therefore, a single byte would be more than

enough to store a 2-digit date corresponding to a year, e.g.,

storing 1979 as 79. But it could not store 1979 itself, since 1979 is

greater than 255, the largest value a byte can hold. You can see that

256 is a sort of magic number in computer land.
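Python makes the same point directly, since its built-in bytes type enforces that one-byte limit:

```python
# A single byte holds the values 0 through 255, so a 2-digit year fits:
year = bytes([79])        # 1979 stored as just "79"
print(year[0])            # 79

# ...but the full 4-digit year is out of range for one byte:
try:
    bytes([1979])
except ValueError:
    print("1979 does not fit in a single byte")
```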

Now we can appreciate my next programming experience. About

20 years went by when, in the late ’70s, I took on the

responsibility of designing and writing the software for a

computer driven test facility for our lithium batteries. To run this

facility, I had a really keen Hewlett Packard 9825 “Programmable

Calculator”. It wasn’t even called a computer but this might have

been a marketing gimmick to gain market penetration; managers

in those days were more likely to approve purchases of

“calculators” than of “computers”. The 9825 did cost about

$6,000. I recall my own management being somewhat reluctant

to go to an automated system for battery testing. Well, that 9825

was blessed with a phenomenal 24 kilobytes of memory, a single

line 32-character LED display (no monitor!), no hard drive, and

two 8-inch floppy disk drives. The latter were housed in separate

“boxes” which together occupied almost the same volume as my

current printer. The computer programming language was “hpl”

(Hewlett Packard Language) and with only 24,000 bytes of RAM

(my computer now contains 64 million bytes), it was essential to

condense every line of software and data storage to a bare

minimum. Of course, if I had entered a date, the year would have

been entered in a single byte, i.e., 79 for 1979.

As part of my program, I kept a running log of the state of each

of up to 128 lithium batteries being tested. The way the

program was written, I only left room for positive numbers in

certain calculations. To allow for negative numbers, I would have

had to use 128 extra bytes, not insignificant. Now, suppose I was

still running this program and that I had stored a 2-digit year date

in a single byte. At midnight on Jan 1, the year would switch to

2000 and if my program recognized only the 00, a calculation

might produce 00 - 99 = -99. Since I couldn’t handle a negative

number, the computer would stop, flashing an error message.

Meanwhile, as I toasted the new millennium (if I weren’t already

asleep!), my batteries would go on charging or discharging with

no computer control. Lithium batteries tend to behave poorly

under such circumstances, exploding or catching fire! Y2K

would be a disaster!
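That rollover failure can be sketched in a few lines of Python (a hypothetical reconstruction, not my actual hpl code):

```python
# Sketch of the two-digit rollover bug: years are kept as two digits,
# and the program has no storage for negative results.
def elapsed_years(start_yy, now_yy):
    diff = now_yy - start_yy
    if diff < 0:
        # No room for negatives: the real program would halt here,
        # leaving the batteries cycling with no computer control.
        raise ValueError("negative time span -- program halts")
    return diff

print(elapsed_years(79, 99))   # 20 -- fine through 1999
try:
    elapsed_years(99, 0)       # at midnight, 99 rolls over to 00
except ValueError as err:
    print(err)                 # 00 - 99 = -99, and the program stops
```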

Let’s see how I could have avoided Y2K by dedicating 2 bytes to

storing the year. If my math is correct, two bytes will allow 256 x

256 = 65,536 combinations of “1s” and “0s”. Indeed, with

suitable programming, we can divide the 65,536 by 2 and store

numbers from -32,768 to +32,767 in these two bytes. Now, we

can store 1979 as 1979 and even -1979 (1979 BC) for

archaeological programs! In fact, there’s more than enough

storage space to cover the Y10K problem in the year 10,000! Or

for that matter, the Y20K and the Y30K problems.
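Python’s standard struct module, which packs numbers into a fixed number of bytes, confirms the arithmetic:

```python
import struct

# Two bytes give 256 * 256 = 65,536 patterns. Interpreted as a signed
# integer ('h' means a 16-bit signed value in the struct module), that
# range is split into -32,768 through +32,767.
print(256 * 256)                # 65536

ad = struct.pack(">h", 1979)    # 1979 AD fits in two bytes
bc = struct.pack(">h", -1979)   # and so does 1979 BC
print(len(ad), struct.unpack(">h", ad)[0], struct.unpack(">h", bc)[0])

# Even the year 30,000 fits, so Y10K through Y30K are covered.
print(struct.unpack(">h", struct.pack(">h", 30000))[0])   # 30000
```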

Getting back to Y2K, one of the concerns is that there still exist

many programs written in COBOL (COmmon Business Oriented

Language). A couple years ago I was talking to a friend about

Y2K and found out that he had been part of the group that

developed COBOL decades ago. The group was headed by a

remarkable woman, Rear Admiral Grace Hopper, whom some

term the mother of the computer. She was among the first, or

maybe the first, to think computer programs could be written in

English. Her FLOW-MATIC language led to COBOL, which

became adopted universally and billions of lines of COBOL are

present today in business-oriented software. Before her death in

1992, she had received the Distinguished Service Medal, was the

first woman to achieve the rank of rear admiral and had received

over 40 honorary degrees. She didn’t retire from the Navy until

1986 at age 79 and her retirement ceremony was held on board

the U.S.S. Constitution, “Old Ironsides”.

Why was COBOL so successful? One reason is that it was

designed to handle large files of data and move them around

safely. Another reason for COBOL”s widespread use was the fact

that it wasn”t copyrighted and was “universal” in that its use was

not limited to a particular type of machine. The U.S. Navy”s

interest was the need to store service records of each individual

serviceman and provide for the retrieval and movement of the

data when necessary. The types of data handled by COBOL

generally require little or no mathematical treatment, just as most

business records do not. FORTRAN, on the other hand, is suited

more for mathematical manipulations. Of course, COBOL grew

up in the days of limited memory and the years were entered as 2

digits. Not long ago, I met someone who is among those

making a very good living going through large banks’ and

businesses’ COBOL programs and fixing them for Y2K.

Today, programmers can be sloppy and wasteful without the

constraints of limited memory. With millions of bytes available,

they don’t have to worry about a piddling few bytes here and

there. I can’t say for sure but I’d be willing to bet that there are

huge numbers of bytes wasted in all the programs that one installs

on his or her computer. This doesn’t matter for the average

computer customer, who has plenty of memory left over to work

with.

However, there are cases such as laptop computers where

attention to programming efficiency can lead to significant results.

One big limitation with laptop computers is the increasing amount

of power required to run them. Practically, it is highly desirable

not to let the power demands exceed about 20 watts. The

Pentium chip is notable for its running hot due to the power it

consumes. And it’s just one of the power hogs in your computer.

The industry is making a concerted effort to design circuits

efficiently to save electrical power, even as computing power

increases by leaps and bounds. I’ve seen one example where a

teensy one-bit change in a timer register saved one whole watt.

Quite a saving, about 5% of the total power, for a simple

programming change!

Well, would you believe that I’ve just retrieved a copy of the

software that I wrote for my battery test system in a memo from

1981? It turns out that my “real time” clock only returned the

month, day, hour, minute and second when queried and didn’t

worry about what year it was. It was up to me to tell it when a

new year started, and I didn’t have to tell it what year it was. I

would have been Y2K compliant after all! At least I think so.

Allen F. Bortrum