OH MY GAWD! NOT AGAIN!

User avatar
keith
Posts: 3765
Joined: Mon Feb 22, 2021 10:23 pm
Location: The Swamp in Victorian Oz
Occupation: Retired Computer Systems Analyst Project Manager Super Coder
Verified: ✅lunatic

OH MY GAWD! NOT AGAIN!

#1

Post by keith »

Get ready for the Y2K+38 BUG!

:nooo:

:panic:

I ain't gonna be around to bail you out if you don't get it fixed in time.

True, my COBOL skills stood me in good stead during 1998 and 1999, but if you didn't learn your lesson back then, I ain't gonna be no help.

I refuse to learn "C" and don't like Perl or whatever scripting language you're using unless it's ORexx, and there's an ever-increasing chance that I won't be around in 2038 anyway.

You young whippersnappers better get 'er done.
Has everybody heard about the bird?
User avatar
MN-Skeptic
Posts: 3077
Joined: Mon Feb 22, 2021 1:03 pm
Location: Twin Cities

OH MY GAWD! NOT AGAIN!

#2

Post by MN-Skeptic »

xkcd's comic for today is apt -

[image: xkcd comic]
User avatar
Foggy
Dick Tater
Posts: 9623
Joined: Mon Feb 22, 2021 8:45 am
Location: Fogbow HQ
Occupation: Dick Tater/Space Cadet
Verified: as seen on qvc zombie apocalypse

OH MY GAWD! NOT AGAIN!

#3

Post by Foggy »

Huh. You didn't show your work. :think:

Let's get into the weeds, shall we? I will try to format the maffs correckly for the Wiki.
Many computer systems measure time and date as Unix time, an international standard for digital timekeeping. Unix time is defined as the number of seconds elapsed since 00:00:00 UTC on 1 January 1970 (an arbitrarily chosen time), which has been dubbed the Unix epoch.

So far, so good. The Computery Big Bang happened on New Year's Day, 1970.

I was already 17 years old. I am older than the Unix epoch. But to continue ...
Unix time has historically been encoded as a signed 32-bit integer, a data type composed of 32 binary digits (bits) which represent an integer value, with 'signed' meaning that one bit is reserved to indicate sign (+/–). Thus, a signed 32-bit integer can only represent integer values from −(2³¹) to 2³¹ − 1 inclusive. Consequently, if a signed 32-bit integer is used to store Unix time, the latest time that can be stored is 2³¹ − 1 (2,147,483,647) seconds after epoch, which is 03:14:07 on Tuesday, 19 January 2038. Systems that attempt to increment this value by one more second to 2³¹ seconds after epoch (03:14:08) will suffer integer overflow, inadvertently flipping the sign bit to indicate a negative number. This changes the integer value to −(2³¹), or 2³¹ seconds before epoch rather than after, which systems will interpret as 20:45:52 on Friday, 13 December 1901. From here, systems will continue to count up, towards zero, and then up through the positive integers again. As many computer systems use time computations to run critical functions, the bug may introduce fatal errors.
Fatal, that is. Fatal is a bad thing.
By coincidence, the date to which vulnerable systems will reset is a Friday the 13th.
:blackeyebig:

Oh, wait. I was born on Friday the 13th. :think:

Edit: On January 19th, 2038, I will be only 85. Oh wait, then it will be 1901 again and I get to start over. :biggrin:
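
For the maffs-inclined, here's a toy C sketch of that sign flip (mine, not the Wiki's; it assumes a platform whose real time_t is 64 bits, so both ends print sanely):

Code: Select all

#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Print what a signed 32-bit Unix time value means as a date. */
static void show(int32_t t32)
{
    time_t t = (time_t)t32;   /* widen to the platform's time_t */
    char buf[64];
    strftime(buf, sizeof buf, "%a %d %b %Y %H:%M:%S UTC", gmtime(&t));
    printf("%11d -> %s\n", (int)t32, buf);
}

int main(void)
{
    show(INT32_MAX);   /* 2,147,483,647 -> Tue 19 Jan 2038 03:14:07 */
    /* One more second wraps a 32-bit signed counter to INT32_MIN
       (signed overflow is undefined in C, so we show the wrapped
       value directly rather than computing it). */
    show(INT32_MIN);   /* -2,147,483,648 -> Fri 13 Dec 1901 20:45:52 */
    return 0;
}
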
🎶 We went for a ride,
We got outside,
The sand was hot,
She wanted to dance ... 🎶
User avatar
bill_g
Posts: 5516
Joined: Mon Feb 22, 2021 5:52 pm
Location: Portland OR
Occupation: Retired (kind of)
Verified: ✅ Checked Republic ✓ ᵛᵉʳᶦᶠᶦᵉᵈ

OH MY GAWD! NOT AGAIN!

#4

Post by bill_g »

Party like it's 1899!
User avatar
Foggy
Dick Tater
Posts: 9623
Joined: Mon Feb 22, 2021 8:45 am
Location: Fogbow HQ
Occupation: Dick Tater/Space Cadet
Verified: as seen on qvc zombie apocalypse

OH MY GAWD! NOT AGAIN!

#5

Post by Foggy »

'Course, initially, 03:14:07 on Tuesday, 19 January 2038, only happens in the UTC time zone ("Zulu" military time) at longitude 0°. If there was anything sane about any of this, the bug would roll around the world, hour by hour, until it was over and we were all dead.

But that's metaphysically absurd, man. ALL the computers use epoch time. So they'll all get fatal simultaneously, even the ones where it's broad daylight! Where it's not even three in the morning! In Oz, even!

Or maybe, I don't know. Some things are not revealed until they're revealed. :shrug:
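
Here's a toy C sketch of that point (illustrative; your libc's zone rendering may vary): one global counter, many local labels.

Code: Select all

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t doom = 2147483647;   /* 2^31 - 1, the last representable second */
    char utc[64], local[64];

    strftime(utc, sizeof utc, "%F %T UTC", gmtime(&doom));
    strftime(local, sizeof local, "%F %T %Z", localtime(&doom));

    printf("overflow moment:   %s\n", utc);
    printf("same instant here: %s\n", local);  /* 14:14:07 the same day in Melbourne */
    return 0;
}
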
🎶 We went for a ride,
We got outside,
The sand was hot,
She wanted to dance ... 🎶
User avatar
bill_g
Posts: 5516
Joined: Mon Feb 22, 2021 5:52 pm
Location: Portland OR
Occupation: Retired (kind of)
Verified: ✅ Checked Republic ✓ ᵛᵉʳᶦᶠᶦᵉᵈ

OH MY GAWD! NOT AGAIN!

#6

Post by bill_g »

User avatar
keith
Posts: 3765
Joined: Mon Feb 22, 2021 10:23 pm
Location: The Swamp in Victorian Oz
Occupation: Retired Computer Systems Analyst Project Manager Super Coder
Verified: ✅lunatic

OH MY GAWD! NOT AGAIN!

#7

Post by keith »

MN-Skeptic wrote: Sat Nov 12, 2022 1:41 am xkcd's comic for today is apt -

[image: xkcd comic]
That was my wake-up call, so it's not entirely a coincidence.
Has everybody heard about the bird?
User avatar
noblepa
Posts: 2452
Joined: Mon Feb 22, 2021 2:55 pm
Location: Bay Village, Ohio
Occupation: Retired IT Nerd

OH MY GAWD! NOT AGAIN!

#8

Post by noblepa »

IBM mainframes (the few that are still around run some of the largest banks and airline reservation systems) do not use Unix time.

Windows does not use Unix time.

https://frameboxxindore.com/other/does- ... %2C%202038.

I do not know just how Windows stores time internally. It may have a similar problem.

Macintosh computers may have a problem. I'm not a Mac guy, but it is my understanding that recent generations of macOS are Unix-based (BSD-derived), so they use Unix time.

Back in 1999, a lot of people were afraid that cars would stop running and airplanes would fall out of the sky at midnight on Dec. 31, 1999.

The Y2K problem was a little bit different than the Y2038 problem.

Y2K was a problem because in the early days (the sixties and seventies) disk storage was very expensive, so programmers often coded only two digits for the year. Even in 2001, a human looking at a screen or report and seeing a birthdate of 08/09/51 could easily understand that it meant 1951. The problem came in when you tried to do what I used to call "date arithmetic": that is, adding, subtracting, or comparing two dates.

If someone began collecting Social Security at age 65 in 1990, having been born in 1925, then when Jan. 1, 2000 rolled around and the Social Security Administration's computers printed checks, they would subtract 25 from the current date of 00, resulting in the ludicrous (but completely legit to a computer) age of -25, which is obviously less than +65. No check for you!
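
A toy version in C (the real systems were COBOL, of course, but the arithmetic is the same):

Code: Select all

#include <stdio.h>

int main(void)
{
    int birth_yy   = 25;   /* born 1925, stored as two digits */
    int current_yy = 0;    /* Jan. 1, 2000, stored as "00" */

    int age = current_yy - birth_yy;    /* 0 - 25 = -25, not 75 */
    printf("computed age: %d\n", age);

    if (age < 65)
        printf("No check for you!\n"); /* the bug in action */
    return 0;
}
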

I was one of the guilty ones. I began programming professionally in 1972, and I remember saying to a co-worker (or, as Scott Adams calls them, cow-orkers), "I sure hope I'm not still doing this in 2000". Being young and foolish, I thought 28 years in the future was an eternity. Alas, I was still working in IT in 2000.

The Y2038 problem is built into the Unix/Linux operating system itself. I'm sure that many databases that have date and time stamps are coded using Unix time. For Y2K, we had to modify a few programs, mostly COBOL, to use four-digit dates and recompile them. In Unix/Linux, you might find that you have to install the change to the OS and have every single program ready to go at the end of the epoch. If the OS's time rolls over from 2³¹ − 1 to −(2³¹) and my program wants to calculate someone's age, the OS is going to tell me that the Unix time is more than two billion seconds before 1/1/1970. If someone's birthday is 1,900,000,000 seconds after the beginning of the Unix epoch, and the program subtracts 1.9B from that huge negative number, it will produce truly ludicrous results.
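
A toy sketch of that failure mode (the numbers are illustrative):

Code: Select all

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* "Now", shortly after the 2038 wraparound, widened to 64 bits. */
    int64_t now_after_wrap = (int64_t)INT32_MIN + 5000;
    int64_t birth = 1900000000;   /* ~March 2030, seconds after epoch */

    int64_t age_seconds = now_after_wrap - birth;
    printf("age in seconds: %lld\n", (long long)age_seconds);  /* hugely negative */
    return 0;
}
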

I think that what they are going to have to do is create a second Unix date held by the OS. The new one, however, would use 64 bits from 1/1/1970. Between now and 2038, an unmodified program asking for the date would receive the same 32-bit offset that it does now. That gives programmers 16 years to modify their programs to use the new date.

That is, the OS would maintain two dates, one as a 32-bit offset and the other as a 64-bit offset from 1/1/1970. For the next 16 years, the new date would be identical to the old date, except that it would have 32 leading zero bits. In 2038, the old date would wrap around but the new date would keep on ticking.
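
As I understand it, the fix that is actually rolling out looks a lot like this: the kernel and C library carry 64-bit time, and old 32-bit programs opt in when recompiled. On Linux with glibc 2.34 or later, for example, building a 32-bit program with -D_TIME_BITS=64 -D_FILE_OFFSET_BITS=64 is supposed to give it a 64-bit time_t (check your own toolchain; I'm retired). A quick way to see what you've got:

Code: Select all

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* With a 64-bit time_t this prints 8 bytes and keeps ticking
       past 03:14:07 UTC on 19 January 2038. */
    printf("sizeof(time_t) = %zu bytes (%zu bits)\n",
           sizeof(time_t), sizeof(time_t) * 8);
    return 0;
}
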

I'm sure that 32 bits was selected because, back in 1970, on many small computers (which is where Unix started) the standard was a 16-bit word, with an option of a "long integer" of 32 bits. Now 64-bit words are standard, with some computers using a 128-bit word. Doing date arithmetic on 64-bit integers is a piece of cake.

I'm sure that the young turks at AT&T, who first wrote Unix, suffered the same hubris that I did in 1972, so that contributed to the problem as well.
User avatar
Foggy
Dick Tater
Posts: 9623
Joined: Mon Feb 22, 2021 8:45 am
Location: Fogbow HQ
Occupation: Dick Tater/Space Cadet
Verified: as seen on qvc zombie apocalypse

OH MY GAWD! NOT AGAIN!

#9

Post by Foggy »

The wiki article did mention computers embedded in automobiles, and I wondered which of today's cars will be around in 2038, but that's only 16 years from now, so many will.

Fatal integer overflow might be less of a problem in a car, and I know you saw those movies but your car won't blow up like that.
🎶 We went for a ride,
We got outside,
The sand was hot,
She wanted to dance ... 🎶
User avatar
MN-Skeptic
Posts: 3077
Joined: Mon Feb 22, 2021 1:03 pm
Location: Twin Cities

OH MY GAWD! NOT AGAIN!

#10

Post by MN-Skeptic »

My sister was a carpenter at an airport in 1999. I don't think she had to work on 12/31, but I also think she had to be available in case there were any issues that arose. A carpenter? :shrug:
User avatar
noblepa
Posts: 2452
Joined: Mon Feb 22, 2021 2:55 pm
Location: Bay Village, Ohio
Occupation: Retired IT Nerd

OH MY GAWD! NOT AGAIN!

#11

Post by noblepa »

Foggy wrote: Sat Nov 12, 2022 5:34 pm The wiki article did mention computers embedded in automobiles, and I wondered which of today's cars will be around in 2038, but that's only 16 years from now, so many will.

Fatal integer overflow might be less of a problem in a car, and I know you saw those movies but your car won't blow up like that.
Back in 1999, a lot of people were afraid that cars would stop. They didn't.

The Y2038 thing might be different. With Y2K, as I said in my previous post, the issue only arose when you had to mathematically manipulate a date or dates. That usually doesn't happen with embedded systems, such as the computers that control our automobiles. The Y2038 issue affects much more of the operation of the computer.

That said, we have a lot more electronics in our cars. Almost all of it uses some sort of computer. So, the potential for error is probably higher. I don't know how many of those embedded systems use Unix or Linux. I also can't say how much dates play a part in the day to day operation of a car.

A thought occurs to me that, in spite of all the dire warnings of impending doom in 1999, it turned out to be largely a big nothingburger. No planes fell out of the sky. No cars stopped working. No medical monitoring equipment killed any patients. No major financial upheavals occurred. The Social Security Admin. got checks out on time. Banks did not fail.

With that in mind, it may be that not enough computer experts and software developers will take this seriously, and there will be some problems precisely because the people who should have fixed it weren't proactive enough.
User avatar
keith
Posts: 3765
Joined: Mon Feb 22, 2021 10:23 pm
Location: The Swamp in Victorian Oz
Occupation: Retired Computer Systems Analyst Project Manager Super Coder
Verified: ✅lunatic

OH MY GAWD! NOT AGAIN!

#12

Post by keith »

MN-Skeptic wrote: Sat Nov 12, 2022 6:13 pm My sister was a carpenter at an airport in 1999. I don't think she had to work on 12/31, but I also think she had to be available in case there were any issues that arose. A carpenter? :shrug:
Sure. What if they had to deconstruct the decorative wall surrounding a vending machine with extreme prejudice? Or sumpin.

Better to have and not need than to need and not have.
Has everybody heard about the bird?
User avatar
keith
Posts: 3765
Joined: Mon Feb 22, 2021 10:23 pm
Location: The Swamp in Victorian Oz
Occupation: Retired Computer Systems Analyst Project Manager Super Coder
Verified: ✅lunatic

OH MY GAWD! NOT AGAIN!

#13

Post by keith »

noblepa wrote: Sat Nov 12, 2022 5:23 pm
I was one of the guilty ones. I began programming professionally in 1972, and I remember saying to a co-worker (or, as Scott Adams calls them, cow-orkers), "I sure hope I'm not still doing this in 2000". Being young and foolish, I thought 28 years in the future was an eternity. Alas, I was still working in IT in 2000.
Me too, also.

Right down to the year of the first coding paycheck.

However, I absolutely did not use 2-digit years. I was precocious beyond my experience, and I had a boss who actually listened to what I was saying.

One of my first tasks was to convert a ONE-digit year to a two-digit year, and I insisted on making it a full 4-digit year with space for month and day too.

Here is the story: The City had a 'special assessments' system for things like when a neighbourhood wanted paved sidewalks put in, or stuff like that. The City would pay a portion, and the residents would pay a portion. The residents' portion was added to their property tax over a number of years - 10 years, to be exact. The system didn't really care WHAT the year was, just how many years the assessment had left to go. The system was originally built on an IBM 1401, and I assume the 'database' was running off tapes.

Except now there were assessments that were going to run for 15 or 20 years, so we needed two digits. And we were running on an IBM 370 with modern disk drives, with more than enough space to manage full-on dates.

I convinced them to go full date while I was fiddling about. They argued that the system would not likely be around in 2000, so why bother? I responded that it had already survived for 15 years across two computer architectures. It was now 1973; if it survived for only another 10 years, the 20-year assessments would run to 2003. AND whatever replaced it would require a full date anyway.

So I made it YYYYMMDD so we could actually sort the data in a meaningful way, and that happened to open up a lot more reporting capabilities and customer service info. The project managers went out of their way to thank me for the transaction logs that could now actually be matched to the project fund general ledger. It was also a lot easier for the folks who wrote the CICS transactions to update and inquire. And it was a lot easier for the external auditor. And it was a lot easier when they installed the database system.
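
For the youngsters, the trick is that YYYYMMDD strings sort the same alphabetically as chronologically, so even a dumb sort gets them right. A toy illustration (yes, in that "C" thing; the dates are made up):

Code: Select all

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int cmp(const void *a, const void *b)
{
    return strcmp(*(const char * const *)a, *(const char * const *)b);
}

int main(void)
{
    const char *dates[] = { "19991231", "19730101", "20031115" };

    /* Plain string sort == chronological sort for YYYYMMDD. */
    qsort(dates, 3, sizeof dates[0], cmp);
    for (int i = 0; i < 3; i++)
        puts(dates[i]);
    return 0;
}
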
Has everybody heard about the bird?
User avatar
Foggy
Dick Tater
Posts: 9623
Joined: Mon Feb 22, 2021 8:45 am
Location: Fogbow HQ
Occupation: Dick Tater/Space Cadet
Verified: as seen on qvc zombie apocalypse

OH MY GAWD! NOT AGAIN!

#14

Post by Foggy »

I still have a Weekly World News issue from right before Y2K, where they boldly predicted we were all gonna die.

Instead, the Weekly World News died. Except it's still online; today's issue features the discovery of dinosaurs on Mars.

So that's nice.
🎶 We went for a ride,
We got outside,
The sand was hot,
She wanted to dance ... 🎶
User avatar
Volkonski
Posts: 11776
Joined: Mon Feb 22, 2021 11:06 am
Location: Texoma and North Fork of Long Island
Occupation: Retired mechanical engineer
Verified:

OH MY GAWD! NOT AGAIN!

#15

Post by Volkonski »

Computer experts prevented the Y2K meltdown. I am confident that today's computer experts will deal with the 2038 problem.
“If everyone fought for their own convictions there would be no war.” ― Leo Tolstoy, War and Peace
User avatar
Foggy
Dick Tater
Posts: 9623
Joined: Mon Feb 22, 2021 8:45 am
Location: Fogbow HQ
Occupation: Dick Tater/Space Cadet
Verified: as seen on qvc zombie apocalypse

OH MY GAWD! NOT AGAIN!

#16

Post by Foggy »

By 2038, the dinosaurs on Mars will have been killed by an asteroid and turned into light crude at an attractive price. With only a third of Earth's gravity, my dune buggy is gonna go flying.
🎶 We went for a ride,
We got outside,
The sand was hot,
She wanted to dance ... 🎶
User avatar
bill_g
Posts: 5516
Joined: Mon Feb 22, 2021 5:52 pm
Location: Portland OR
Occupation: Retired (kind of)
Verified: ✅ Checked Republic ✓ ᵛᵉʳᶦᶠᶦᵉᵈ

OH MY GAWD! NOT AGAIN!

#17

Post by bill_g »

I made so much money on the run up to CYA2K. We couldn't keep sat phones of any kind in stock. Every hospital, every bank, all Federal buildings including the reserve bank, every hydroelectric dam on the Columbia, Clackamas, and Sandy Rivers, every coal and gas fired co-gen, and every water department got something.

And on the first business day after the First of the New Year, most all of them called to have the equipment removed and their money refunded. Nope. Read the contract. I anticipated this. You have a one-year binding subscription with a one-year guarantee on hardware and workmanship. We shall support it, but we won't refund it. And we can make it go bye-bye on your dime if you desire.
User avatar
Sam the Centipede
Posts: 1899
Joined: Thu Feb 25, 2021 12:19 pm

OH MY GAWD! NOT AGAIN!

#18

Post by Sam the Centipede »

Wikipedia has a page listing many possible (for some value of possible) computer date issues, for various operating and hardware systems. Quite a few! Some stem from issues such as BCD encoding; some might occur 1,978 years from now, as there is/was apparently some debate about whether the year 4000 should be a leap year.

Finding the page is left as an exercise for the interested reader.
User avatar
keith
Posts: 3765
Joined: Mon Feb 22, 2021 10:23 pm
Location: The Swamp in Victorian Oz
Occupation: Retired Computer Systems Analyst Project Manager Super Coder
Verified: ✅lunatic

OH MY GAWD! NOT AGAIN!

#19

Post by keith »

Sam the Centipede wrote: Sun Nov 13, 2022 6:04 pm Wikipedia has a page listing many possible (for some value of possible) computer date issues, for various operating and hardware systems. Quite a few! Some stem from issues such as BCD encoding; some might occur 1,978 years from now, as there is/was apparently some debate about whether the year 4000 should be a leap year.

Finding the page is left as an exercise for the interested reader.
4000 is divisible by 4 so it should be a leap year, but
4000 is divisible by 100 so it should not be a leap year, but
4000 is divisible by 1000 so it IS a leap year.

No, ok, wait. 2000 was not a leap year, so... I've forgotten how it works, and I had to teach it to the Germans in Waldorf back in the day.
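
For the record, after looking it up: divisible by 4 means leap, except centuries are not, unless divisible by 400. So 2000 WAS a leap year, and under the current rule 4000 will be too. A toy sketch, in that "C" thing again:

Code: Select all

#include <stdio.h>
#include <stdbool.h>

/* The Gregorian rule: every 4th year, minus centuries, plus every 400th. */
static bool is_leap(int year)
{
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

int main(void)
{
    int years[] = { 1900, 2000, 2100, 4000 };
    for (int i = 0; i < 4; i++)
        printf("%d: %s\n", years[i], is_leap(years[i]) ? "leap" : "not leap");
    return 0;
}
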
Has everybody heard about the bird?
User avatar
Sam the Centipede
Posts: 1899
Joined: Thu Feb 25, 2021 12:19 pm

OH MY GAWD! NOT AGAIN!

#20

Post by Sam the Centipede »

The 400-year thing is about a proposal (not adopted) that making 4000, 8000, 12000, etc. non-leap years would improve the synchrony with the solar year. I guess nobody was too excited, because making sure we still have a habitable planet two millennia from now is more important than worrying about what day the druids celebrate the solstice.
New Turtle
Posts: 600
Joined: Wed Nov 10, 2021 2:43 pm

OH MY GAWD! NOT AGAIN!

#21

Post by New Turtle »

noblepa wrote: Sat Nov 12, 2022 5:23 pm
...
I think that what they are going to have to do is create a second Unix date held by the OS. The new one, however, would use 64 bits from 1/1/1970. Between now and 2038, an unmodified program asking for the date would receive the same 32-bit offset that it does now. That gives programmers 16 years to modify their programs to use the new date.

That is, the OS would maintain two dates, one as a 32-bit offset and the other as a 64-bit offset from 1/1/1970. For the next 16 years, the new date would be identical to the old date, except that it would have 32 leading zero bits. In 2038, the old date would wrap around but the new date would keep on ticking.

I'm sure that 32 bits was selected because, back in 1970, on many small computers (which is where Unix started) the standard was a 16-bit word, with an option of a "long integer" of 32 bits. Now 64-bit words are standard, with some computers using a 128-bit word. Doing date arithmetic on 64-bit integers is a piece of cake.

I'm sure that the young turks at AT&T, who first wrote Unix, suffered the same hubris that I did in 1972, so that contributed to the problem as well.
...
Half the issue is gonna solve itself with pretty much all new hardware right now being 64-bit architecture. In turn, all the UNIX-like kernels are or will be 64-bit standard. In these kernels, the datatype (usually time_t) for the timestamp is gonna be a 64-bit integer. Software that runs on 32-bit environments can be ported to 64-bit and if it uses that datatype, it will automatically be extended to (what people today call) a long int. There might be problems with code that uses a default integer type (C "int" type) to store the timestamp, if the language standard still says default integer type is 32-bit.
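
A sketch of that last pitfall (illustrative):

Code: Select all

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);   /* 64-bit on current mainstream platforms */
    int bad = (int)now;        /* silently truncated where int is 32 bits */

    printf("time_t: %lld\n", (long long)now);
    printf("int:    %d\n", bad);   /* the two diverge after 19 Jan 2038 */
    return 0;
}
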
User avatar
noblepa
Posts: 2452
Joined: Mon Feb 22, 2021 2:55 pm
Location: Bay Village, Ohio
Occupation: Retired IT Nerd

OH MY GAWD! NOT AGAIN!

#22

Post by noblepa »

New Turtle wrote: Mon Nov 14, 2022 5:53 am
noblepa wrote: Sat Nov 12, 2022 5:23 pm
...
I think that what they are going to have to do is create a second Unix date held by the OS. The new one, however, would use 64 bits from 1/1/1970. Between now and 2038, an unmodified program asking for the date would receive the same 32-bit offset that it does now. That gives programmers 16 years to modify their programs to use the new date.

That is, the OS would maintain two dates, one as a 32-bit offset and the other as a 64-bit offset from 1/1/1970. For the next 16 years, the new date would be identical to the old date, except that it would have 32 leading zero bits. In 2038, the old date would wrap around but the new date would keep on ticking.

I'm sure that 32 bits was selected because, back in 1970, on many small computers (which is where Unix started) the standard was a 16-bit word, with an option of a "long integer" of 32 bits. Now 64-bit words are standard, with some computers using a 128-bit word. Doing date arithmetic on 64-bit integers is a piece of cake.

I'm sure that the young turks at AT&T, who first wrote Unix, suffered the same hubris that I did in 1972, so that contributed to the problem as well.
...
Half the issue is gonna solve itself with pretty much all new hardware right now being 64-bit architecture. In turn, all the UNIX-like kernels are or will be 64-bit standard. In these kernels, the datatype (usually time_t) for the timestamp is gonna be a 64-bit integer. Software that runs on 32-bit environments can be ported to 64-bit and if it uses that datatype, it will automatically be extended to (what people today call) a long int. There might be problems with code that uses a default integer type (C "int" type) to store the timestamp, if the language standard still says default integer type is 32-bit.
I think that the bigger problem is all the time stamps that are stored in databases and other files in 32-bit form. That was one of the biggest issues in Y2K. How do you find and change all those stored dates?
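
Finding them is the hard part; widening them is the easy part. Something like this per-record pass (the struct and field names are invented for illustration):

Code: Select all

#include <stdio.h>
#include <stdint.h>

struct record_v1 { int32_t created; };  /* old on-disk layout */
struct record_v2 { int64_t created; };  /* widened layout */

static struct record_v2 migrate(struct record_v1 old)
{
    struct record_v2 rec;
    rec.created = (int64_t)old.created;  /* sign-extends losslessly */
    return rec;
}

int main(void)
{
    struct record_v1 old = { 1900000000 };  /* a pre-2038 timestamp */
    struct record_v2 now = migrate(old);
    printf("%d -> %lld\n", old.created, (long long)now.created);
    return 0;
}
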
qbawl
Posts: 743
Joined: Mon Feb 22, 2021 11:05 am

OH MY GAWD! NOT AGAIN!

#23

Post by qbawl »

noblepa wrote: Mon Nov 14, 2022 8:34 am
New Turtle wrote: Mon Nov 14, 2022 5:53 am
noblepa wrote: Sat Nov 12, 2022 5:23 pm
...
I think that what they are going to have to do is create a second Unix date held by the OS. The new one, however, would use 64 bits from 1/1/1970. Between now and 2038, an unmodified program asking for the date would receive the same 32-bit offset that it does now. That gives programmers 16 years to modify their programs to use the new date.

That is, the OS would maintain two dates, one as a 32-bit offset and the other as a 64-bit offset from 1/1/1970. For the next 16 years, the new date would be identical to the old date, except that it would have 32 leading zero bits. In 2038, the old date would wrap around but the new date would keep on ticking.

I'm sure that 32 bits was selected because, back in 1970, on many small computers (which is where Unix started) the standard was a 16-bit word, with an option of a "long integer" of 32 bits. Now 64-bit words are standard, with some computers using a 128-bit word. Doing date arithmetic on 64-bit integers is a piece of cake.

I'm sure that the young turks at AT&T, who first wrote Unix, suffered the same hubris that I did in 1972, so that contributed to the problem as well.
...
Half the issue is gonna solve itself with pretty much all new hardware right now being 64-bit architecture. In turn, all the UNIX-like kernels are or will be 64-bit standard. In these kernels, the datatype (usually time_t) for the timestamp is gonna be a 64-bit integer. Software that runs on 32-bit environments can be ported to 64-bit and if it uses that datatype, it will automatically be extended to (what people today call) a long int. There might be problems with code that uses a default integer type (C "int" type) to store the timestamp, if the language standard still says default integer type is 32-bit.
I think that the bigger problem is all the time stamps that are stored in databases and other files in 32-bit form. That was one of the biggest issues in Y2K. How do you find and change all those stored dates?
I was a mainframe (IBM 370) systems programmer working for an insurance/financial institution at Y2K, though my title was Senior Technical Analyst (remember this, it becomes important), and as you say, a big problem was finding and changing all the 2-character dates, both in the progs and in the DBs.

Earlier in my career I had been on the other side of the house and written programs in COBOL and 360 Assembler language for many years. A massive effort was initiated to find and fix all the deficient code. (It always amuses me when folks say "Y2K was no big deal": that's only true because a whole bunch of peeps put in long hours working their collective asses off to "make it so".) As a sys prog, my group's task was less arduous than for those writing the business code, but our group, the "Tech Group", was assigned to 'help those folks' deal with the mess (by this time there were hardly any assembler-language programmers over there, but still a whole bunch of code), so the 'Tech Group' ended up putting in more time and effort than any other team as "Helpers".

Well, Y2K came and went and management was ecstatic. They decided to give a really nice bonus to all involved. But they only wanted the workers, and not the bosses, to benefit. Solution: the bonus went to anyone with the word "programmer" in their title!
User avatar
noblepa
Posts: 2452
Joined: Mon Feb 22, 2021 2:55 pm
Location: Bay Village, Ohio
Occupation: Retired IT Nerd

OH MY GAWD! NOT AGAIN!

#24

Post by noblepa »

qbawl wrote: Mon Nov 14, 2022 9:23 am
noblepa wrote: Mon Nov 14, 2022 8:34 am
New Turtle wrote: Mon Nov 14, 2022 5:53 am

Half the issue is gonna solve itself with pretty much all new hardware right now being 64-bit architecture. In turn, all the UNIX-like kernels are or will be 64-bit standard. In these kernels, the datatype (usually time_t) for the timestamp is gonna be a 64-bit integer. Software that runs on 32-bit environments can be ported to 64-bit and if it uses that datatype, it will automatically be extended to (what people today call) a long int. There might be problems with code that uses a default integer type (C "int" type) to store the timestamp, if the language standard still says default integer type is 32-bit.
I think that the bigger problem is all the time stamps that are stored in databases and other files in 32-bit form. That was one of the biggest issues in Y2K. How do you find and change all those stored dates?
I was a mainframe (IBM 370) systems programmer working for an insurance/financial institution at Y2K, though my title was Senior Technical Analyst (remember this, it becomes important), and as you say, a big problem was finding and changing all the 2-character dates, both in the progs and in the DBs.

Earlier in my career I had been on the other side of the house and written programs in COBOL and 360 Assembler language for many years. A massive effort was initiated to find and fix all the deficient code. (It always amuses me when folks say "Y2K was no big deal": that's only true because a whole bunch of peeps put in long hours working their collective asses off to "make it so".) As a sys prog, my group's task was less arduous than for those writing the business code, but our group, the "Tech Group", was assigned to 'help those folks' deal with the mess (by this time there were hardly any assembler-language programmers over there, but still a whole bunch of code), so the 'Tech Group' ended up putting in more time and effort than any other team as "Helpers".

Well, Y2K came and went and management was ecstatic. They decided to give a really nice bonus to all involved. But they only wanted the workers, and not the bosses, to benefit. Solution: the bonus went to anyone with the word "programmer" in their title!
Your background and Y2K experience sound a lot like mine. I began my career writing 360 assembler language programs. I still would love to get a job doing that, but nobody is hiring that particular skill anymore.

I never quite got the hang of the BXLE instruction though. (You are a real 360 assembler programmer if you get the reference).
User avatar
bill_g
Posts: 5516
Joined: Mon Feb 22, 2021 5:52 pm
Location: Portland OR
Occupation: Retired (kind of)
Verified: ✅ Checked Republic ✓ ᵛᵉʳᶦᶠᶦᵉᵈ

OH MY GAWD! NOT AGAIN!

#25

Post by bill_g »

My one and only foray into assembly was my parking garage counter based on the Motorola 6801 for my BSEE junior year lab project. Count the cars going in, count the cars going out, post the remaining number of spaces available, or post LOT FULL. It was a top to bottom project with bonus points if you sold it. Which I did. Two of them. Hand built. Nothing like it in the 1978 market. An absolute learning experience, and one that proved to me I did not want to be a programmer. I'd rather be an integrator.