2100 and 2400 will be a shitshow
Not as much as 2038
Yeah that’s a different shitshow but agreed it is likely to be worse - like y2k the effects are smeared out before and after the date.
Why?
Because of the Year 2038 problem.
32-bit systems will stop working. The Unix timestamp, which increases by 1 every second and started counting at the first second of 1970, will overflow the maximum value of a signed 32-bit integer. Bad things will follow.
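For the curious, a minimal sketch (not from the thread) of exactly when a signed 32-bit timestamp runs out:

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit time_t can hold
INT32_MAX = 2**31 - 1  # 2_147_483_647 seconds since the 1970 epoch

# The last instant representable before the counter overflows
rollover = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```

One second later, a 32-bit counter wraps to a large negative number, which naive code interprets as a date in December 1901.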
This has already been patched on all 64-bit OSes though - whatever 32-bit systems are still in existence in another 15 years will just roll their dates back 50 years and add another layer of duct tape to their jury-rigged existence.
2038 will certainly be a shit show
Yeah but I’ll be dead so not my problem lmao
Nah.
Same thing happened in 2000 and it was a mouse’s fart.
Because of months of preparation. I know, I was doing it.
And now that every time library has been updated, we’re safe until our grandchildren reimplement those bugs in a language that has not yet been invented.
I’ve already seen reimplementation of 2 digit dates here and there.
LOL fuck those guys.
Fortunately I will not be involved. Hopefully I can make something from 2038 though.
You’re not the only one foreseeing a nice consultant payday there.
I went to uni in the mid 90s when Y2K prep was all the rage, went back to do another degree 20 years later. It was interesting to see the graffiti in the CS toilets. Two digits up to about 1996, four digits for a decade, then back to two.
Why
2100 is not a leap year (divisible by 100 but not by 400). 2400 is a leap year (divisible by 400). Developing for dates is a minefield.
Now imagine working with a non-Gregorian calendar, and the year is 2060.
Because 2100 is divisible by 4, so the naive check `0 === year % 4` calls it a leap year when it isn’t.
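The modulo-4 check misses the century exceptions. A minimal sketch of the full Gregorian rule:

```python
def is_leap(year: int) -> bool:
    # Gregorian rule: divisible by 4, except century years,
    # except century years divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(2100))  # False - the naive year % 4 check says True
print(is_leap(2400))  # True
```

The 2100 bug is nasty precisely because the naive check was correct for every year from 1901 through 2099, so it survives testing for a whole career.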
Luckily, none of us will be there.
Won’t the computer’s clock reset every time you go to sleep and stop cranking the power generator?
Yeah, who knows if our computers will just be sticks by either date.