• SHITPOSTING_ACCOUNT@feddit.de · 1 year ago

      I get the joke, but for those seriously wondering:

      The epoch is Jan 1, 1970. Unix time uses a signed integer, so you can express up to 2^31 - 1 seconds with 32 bits or 2^63 - 1 with 64 bits.
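
      If you want to sanity-check that boundary yourself, here’s a minimal Python sketch (Python is my pick; no language is implied above) that decodes the largest signed 32-bit value into a date:

      ```python
      # Minimal sketch: the largest signed 32-bit timestamp, decoded as a UTC date.
      from datetime import datetime, timezone

      max_32 = 2**31 - 1  # largest value a signed 32-bit integer can hold
      print(datetime.fromtimestamp(max_32, tz=timezone.utc))
      # 2038-01-19 03:14:07+00:00 -- the Year 2038 rollover moment
      ```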

      A normal year has exactly 31536000 seconds (even in a leap-second year, since leap seconds are ignored in Unix time). 97 out of every 400 years are leap years, which adds an average of 0.2425 days, or 20952 seconds, per year, for an average year of 31556952 seconds.
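
      That average is a two-line check; a quick sketch of the same arithmetic:

      ```python
      # Reproduces the averaging above: 97 leap days per 400-year Gregorian cycle.
      SECONDS_PER_DAY = 86_400
      normal_year = 365 * SECONDS_PER_DAY       # 31,536,000 s
      extra = (97 / 400) * SECONDS_PER_DAY      # 0.2425 days = 20,952 s
      print(normal_year + extra)                # 31556952.0
      ```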

      That gives slightly over 68 years for 32-bit time, putting us at 1970 + 68 = 2038. For 64-bit time, it’s 292,277,024,627 years. However, some 64-bit time formats count milliseconds, microseconds, 100-nanosecond units, or nanoseconds, giving us “only” about 292 million years, 292,277 years, 29,228 years, or 292 years respectively. Assuming they use the same epoch, nanosecond-resolution 64-bit time values will become a problem sometime in 2262. Even if they use 1900, an end date in 2192 makes them a bad retirement plan for anyone currently alive.
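
      All of those horizons fall out of one division; here’s the same arithmetic as a sketch (the resolutions are the generic ones named above, not tied to any particular API):

      ```python
      # Sketch: how many average years a signed 64-bit tick counter lasts per resolution.
      AVG_YEAR = 31_556_952  # seconds in an average Gregorian year, from above

      for name, ticks_per_second in [
          ("seconds", 1),
          ("milliseconds", 10**3),
          ("microseconds", 10**6),
          ("100 ns units", 10**7),
          ("nanoseconds", 10**9),
      ]:
          years = 2**63 / (ticks_per_second * AVG_YEAR)
          print(f"{name}: ~{years:,.0f} years")
      # nanoseconds last ~292 years, so a 1970 epoch overflows during 2262.
      ```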

      Most importantly, though, these representations are reasonably rare, so I’d expect this to be a much smaller issue, even if we haven’t managed to replace ourselves with AI by then.

      • SCB@lemmy.world · 1 year ago

        an end date in 2192 makes them a bad retirement plan for anyone currently alive.

        I can’t wait to retire when I’m 208 years old.

        • SCB@lemmy.world · 1 year ago

          Butlarian crusade

          Butlerian Jihad, my dude. Hate to correct you, but the spice must flow.