Submitted by vert1s t3_1057mop in singularity
EbolaFred t1_j3c5g33 wrote
Reply to comment by thetburg in ChatGPT Singularity Joke by vert1s
They are different problems.
Y2K happened because, back when memory was expensive, programmers decided to encode years with only two digits to save space. This seemed fine because humans normally write only two digits for years anyway. Most smart developers knew it was wrong, but figured their code wouldn't still be around by the time it became a problem, so why not save the memory.
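To make that concrete, here's a tiny hypothetical sketch (not from any real system) of how two-digit year math breaks at the century boundary:

    #include <stdio.h>

    int main(void) {
        /* Years stored as two digits to save space, the common pre-Y2K scheme. */
        int birth_year = 85;    /* meaning 1985 */
        int current_year = 99;  /* meaning 1999 */
        printf("age in 1999: %d\n", current_year - birth_year);  /* 14, correct */

        current_year = 0;       /* in 2000 the two-digit field wraps back to 00 */
        printf("age in 2000: %d\n", current_year - birth_year);  /* -85, nonsense */
        return 0;
    }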
Year 2038 is different. It's due to how Unix stores time using a 32-bit integer, which overflows in 2038.
Most modern OSs and databases have already switched to 64-bit, but, as usual, there's tons of legacy code to deal with. Not to mention embedded systems.
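To see what that overflow actually looks like, here's a small sketch assuming the classic signed 32-bit time_t (the wraparound shown is the typical two's-complement behavior):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Classic 32-bit time_t is a signed 32-bit count of seconds since Jan 1, 1970. */
        int32_t t = INT32_MAX;          /* 2,147,483,647 = 03:14:07 UTC, Jan 19, 2038 */
        printf("last valid second: %d\n", t);

        /* One more second no longer fits; on typical two's-complement systems the
           value wraps to a large negative number, i.e. a date back in December 1901. */
        int32_t wrapped = (int32_t)((int64_t)t + 1);
        printf("one second later:  %d\n", wrapped);  /* -2147483648 */
        return 0;
    }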
gangstasadvocate t1_j3d1brq wrote
I feel like for binary and bits and bytes, 2048 would make more sense. When does 64-bit overflow, 4096?
EbolaFred t1_j3d6pew wrote
Yeah, sorry, you're not thinking of it correctly.
Unix time is the number of seconds since Jan 1, 1970. In 2038, that count will exceed 2,147,483,647 seconds, which is the maximum a signed 32-bit integer can hold (2^31 - 1), hence the problem.
Switching to 64-bit keeps this timekeeping scheme working for almost 300 billion years.
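If you want to check the arithmetic, a quick back-of-the-envelope in plain C (nothing Unix-specific, just dividing the integer limits by seconds per year):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        const double seconds_per_year = 365.25 * 24 * 60 * 60;   /* ~31.6 million */

        /* Signed 32-bit limit: 2^31 - 1 seconds after Jan 1, 1970. */
        printf("32-bit horizon: ~%.1f years after 1970 (early 2038)\n",
               INT32_MAX / seconds_per_year);                    /* ~68.0 years */

        /* Signed 64-bit limit: 2^63 - 1 seconds. */
        printf("64-bit horizon: ~%.0f billion years\n",
               INT64_MAX / seconds_per_year / 1e9);              /* ~292 billion */
        return 0;
    }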
Note that this is just how Unix decided to keep time when it was being developed. There are obviously many newer implementations that get much more granular than "seconds since 1970" and last longer. The problem is that so much software has standardized on how Unix does it, so programs know what to expect when calling time().
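For reference, the standard interface everyone built on is just this; on a modern 64-bit system time_t is typically 8 bytes, so this program keeps working past 2038:

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);   /* seconds since 1970-01-01 00:00:00 UTC */
        printf("seconds since the Unix epoch: %lld\n", (long long)now);
        printf("time_t here is %zu bytes\n", sizeof(time_t));
        return 0;
    }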
gangstasadvocate t1_j3d7jkw wrote
Still don’t completely understand what you mean by the 2^31 - 1, but hell yeah, that’s way more like it. 300,000,000,000 more years, we’ll be long gone before then