How many years of millisecond timestamps can be represented by 41 bits?

I am reading an Instagram blog post about generating sharded IDs. The post describes how they create 64-bit IDs. Their scheme allocates 41 of the 64 bits to a millisecond timestamp, and they say:

  • 41 bits for time in milliseconds (gives us 41 years of IDs with a custom epoch)

Is this a typo? By my calculation you can store 69 years of millisecond timestamps in 41 bits. Here's how:

  • Maximum milliseconds representable in 41 bits: (2^41) - 1 = 2,199,023,255,551 ms
  • Divided by (1000 * 60 * 60 * 24 * 365) ms/year ≈ 69 years
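The arithmetic above can be checked directly; here is a quick sketch in Python (variable names are mine, not from the post):

```python
# Maximum value representable in 41 bits, in milliseconds
max_ms = (1 << 41) - 1                    # 2_199_023_255_551 ms

# Milliseconds in one 365-day (non-leap) year
ms_per_year = 1000 * 60 * 60 * 24 * 365   # 31_536_000_000 ms

years = max_ms / ms_per_year
print(max_ms)   # 2199023255551
print(years)    # roughly 69.7
```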

So where did I go wrong?


Your calculation is not wrong.

(2^41)-1 ms == 2199023255.551 s == 610839.7932086 hr == 25451.65805036 days == 69.6828 Julian years == 69.6843 Gregorian Years 

which closely matches your result (69 years).
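The unit conversions above can be reproduced step by step; a sketch assuming a Julian year of 365.25 days and a mean Gregorian year of 365.2425 days:

```python
ms = (1 << 41) - 1                   # max 41-bit value in milliseconds

seconds = ms / 1000                  # ~2_199_023_255.551 s
hours = seconds / 3600               # ~610_839.79 h
days = hours / 24                    # ~25_451.66 days

julian_years = days / 365.25         # ~69.68
gregorian_years = days / 365.2425    # ~69.68
```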

However, the post you link to says that 41 bits gives them

  41 years of IDs with a custom epoch

The "epoch" in this context probably refers to the start date of the timestamp range. Given that the article was published "3 years ago", i.e. in 2012, we can work out that their epoch begins around 2012 + 41 - 69 == 1984. That date was presumably chosen as a reference point.


Source: https://habr.com/ru/post/1215555/

