Comments

Sashinii t1_j39a05c wrote

I laughed after the joke was explained.

27

heyimpro t1_j3afp5c wrote

Is that bit true about the 32-bit integer running out on January 19, 2038?

Interesting that it works out to roughly the same year most people seem to agree the singularity will happen.

6

AndromedaAnimated t1_j3atztv wrote

I think the first joke was wittier, though the second one isn't bad.

1

footurist t1_j3b3mkw wrote

Would have been cool if it had actually continued the calculation beyond the overflow and printed the past date. But the math part would probably be a problem.
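
For what it's worth, the wrapped date is easy to compute yourself. A minimal C sketch (assuming a libc whose gmtime accepts dates before 1970, as glibc does):

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* One second past INT32_MAX wraps a signed 32-bit counter to INT32_MIN. */
        time_t wrapped = (time_t)INT32_MIN;  /* -2,147,483,648 seconds from the epoch */

        /* gmtime interprets that as a date before 1970. */
        printf("%s", asctime(gmtime(&wrapped)));  /* Fri Dec 13 20:45:52 1901 */
        return 0;
    }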

4

Ortus14 t1_j3bcp55 wrote

I laughed at the first joke. AIs being afraid of newer AIs isn't something I normally think about, although it does come up sometimes in science fiction.

3

vert1s OP t1_j3c41r6 wrote

They are different, though similar, problems. Y2K had more to do with storing dates as two digits, whereas the 2038 problem has more to do with the space an epoch timestamp takes up in storage, particularly in statically typed languages (e.g. C/C++).

Since date functions usually come from libraries or are built into the language, newer versions almost always take this into account; the problem has been known about for a while. As with Y2K, the question becomes what legacy software (and in some cases hardware) is still around that will end up breaking.
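
It's easy to check what your own platform does, by the way. A tiny sketch (assuming a C toolchain is handy), where 8 bytes means a 64-bit time_t and no 2038 problem at the OS level:

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        /* 8 bytes = 64-bit time_t (2038-safe); 4 bytes = the old 32-bit layout. */
        printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
        return 0;
    }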

2

EbolaFred t1_j3c5g33 wrote

They are different problems.

Y2K happened because, back when memory was expensive, programmers decided to encode years with two digits to save space. This seemed OK because humans normally write years with two digits anyway. Most smart developers knew it was wrong but figured their code wouldn't still be in use long enough for it to matter, so why not save some memory.
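
To make that concrete, here's a toy sketch (the two-digit values are made up for illustration) of how two-digit year arithmetic falls over at the century boundary:

    #include <stdio.h>

    int main(void) {
        /* Y2K-style storage: only the last two digits of the year are kept. */
        int birth_yy = 85;  /* born 1985 */
        int now_yy   = 0;   /* the year 2000 */

        /* 00 - 85 = -85, instead of the correct age of 15. */
        printf("computed age: %d\n", now_yy - birth_yy);
        return 0;
    }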

Year 2038 is different. It's due to Unix storing time as a signed 32-bit integer, which overflows in 2038.

Most modern OSs and databases have already switched to 64-bit, but, as usual, there's tons of legacy code to deal with. Not to mention embedded systems.

6

EbolaFred t1_j3d6pew wrote

Yeah, sorry, you're not thinking of it correctly.

Unix time is the number of seconds since Jan 1, 1970, which on Jan 19, 2038 will exceed 2,147,483,647 seconds. That's the maximum a signed 32-bit integer can hold (2^31 - 1), hence the problem.

Switching to 64-bit lets the same timekeeping scheme run for roughly 292 billion years.

Note that this is just how Unix decided to keep time when it was being developed. There are obviously many newer implementations that are much more granular than "seconds since 1970" and last longer. The problem is that so much software has standardized on the Unix scheme, so programs know what to expect when calling time().
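
You can sanity-check both numbers with a short sketch (assuming a 64-bit time_t, as on any modern Linux): the last second a 32-bit counter can represent, and the rough span of a 64-bit one.

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* The last second a signed 32-bit time_t can represent. */
        time_t last = (time_t)INT32_MAX;       /* 2,147,483,647 */
        printf("%s", asctime(gmtime(&last)));  /* Tue Jan 19 03:14:07 2038 */

        /* Span of a signed 64-bit counter of seconds, in billions of years. */
        double years = (double)INT64_MAX / (365.25 * 24 * 60 * 60);
        printf("64-bit span: ~%.0f billion years\n", years / 1e9);  /* ~292 */
        return 0;
    }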

2