LP64, LLP64 and the IL32 transition

During the transition from 16 to 32 bits in the 80s, int was either 16 or 32 bits. Using today's 64-bit nomenclature, I understand that there was a fairly even spread of ILP32 and LP32 machines. At the time, I thought it was clear that int would always follow the register or pointer width of any given architecture, and that long would remain 32 bits.

Fast-forward 25 years: I see that LP64 is quite common, but until I actually came across 64-bit platforms [my discovery of desktop Linux in 2007 :)], I had always expected IP64 to be the next logical step.

  • Was LP64 the expected evolution for 64 bits?
  • How does the char <= short <= int <= long relation fit into this new scheme of committing the integer types to fixed widths, with each platform leaving a different type behind?
  • How do these transition schemes relate to the use of WORD / DWORD (in your choice of upper or lower case) across platforms?
  • Some areas of Windows still have int forms that are 16 bits. Will Windows grow out of LLP64, or is it too late?
  • Why was int chosen to be left behind this time, and not during the 32-bit transition?
+7
Tags: c, architecture, history, platform
2 answers

As far as I can see, Windows is the odd one out in the whole x64 transition. But putting that aside, C and C++ never defined fixed widths for the integral types. I find the whole int / long / pointer situation quite understandable if you look at it this way:

  • int: basically 32 bits everywhere (Linux, Mac and Windows)
  • long: 64 bits on Mac and Linux, 32 on Windows
  • long long: 64 bits on Mac, Linux and Windows x64
  • (u)intptr_t: exactly the width of a pointer (32 bits on 32-bit systems, 64 bits on 64-bit systems)
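
A quick way to see which of these widths a given compiler actually uses is to print them. This is a minimal sketch, assuming a hosted C99 compiler with <stdint.h>; the output naturally differs between data models (e.g. LP64 on Linux/Mac versus LLP64 on 64-bit Windows).

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* Widths in bits; char is 8 bits on all the platforms discussed here. */
        printf("int       : %zu bits\n", sizeof(int) * 8);
        printf("long      : %zu bits\n", sizeof(long) * 8);
        printf("long long : %zu bits\n", sizeof(long long) * 8);
        printf("void *    : %zu bits\n", sizeof(void *) * 8);
        printf("intptr_t  : %zu bits\n", sizeof(intptr_t) * 8);
        return 0;
    }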

I only use char in the context of strings, and I never use short, since int serves just as well on most desktop systems anyway.

WORD and DWORD are ugly and should be avoided. If an API forces them on you, replace DWORD with DWORD_PTR when you are dealing with ... well, pointers. It was never right to use (D)WORD there in the first place, IMHO.
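
To illustrate the difference, here is a minimal sketch (Windows-only, assuming <windows.h>) of the truncation that DWORD_PTR exists to avoid: DWORD is always 32 bits, while DWORD_PTR is pointer-sized.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        int value = 42;
        int *p = &value;

        DWORD     low_bits = (DWORD)(DWORD_PTR)p;  /* keeps only the low 32 bits on x64 */
        DWORD_PTR whole    = (DWORD_PTR)p;         /* always wide enough for a pointer  */

        printf("sizeof(DWORD)=%u, sizeof(DWORD_PTR)=%u\n",
               (unsigned)sizeof(DWORD), (unsigned)sizeof(DWORD_PTR));
        printf("low_bits=%#lx, whole=%#zx\n",
               (unsigned long)low_bits, (size_t)whole);
        return 0;
    }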

I do not think that Windows will ever change this decision. There are too many problems already.

Why was everything else left the way it is? Why does Venus rotate in the opposite direction? The answer to the first question can be found here (I think); the second one is a bit more complicated ;)

+5

Rather than looking at it as int being "left behind", I would say you should look at it from the point of view that you cannot leave behind any size of type that might be needed. I suppose compilers could define int32_t in terms of some internal extension type such as __int32_t, but with C99 still not widely adopted, it would have been a major pain for applications to work around missing int32_t definitions when their build systems could not find a 32-bit type among the standard types. And having a 32-bit type is essential no matter what your native word size is; for example, it is the only sane type for Unicode code point values.
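
As a rough illustration of the kind of fallback this would have forced on applications, here is a minimal sketch, assuming a build system that defines a hypothetical HAVE_STDINT_H flag: when int32_t is unavailable, the code has to hunt for a 32-bit type among the standard types itself.

    /* Hypothetical portability shim; the names my_int32 and HAVE_STDINT_H are
       illustrative, not taken from any real project. */
    #if defined(HAVE_STDINT_H)
    #  include <stdint.h>
    typedef int32_t my_int32;
    #else
    #  include <limits.h>
    #  if INT_MAX == 2147483647
    typedef int my_int32;        /* ILP32, LP64, LLP64: int is 32 bits  */
    #  elif LONG_MAX == 2147483647
    typedef long my_int32;       /* LP32: int is 16 bits, long is 32    */
    #  else
    #    error "no 32-bit type found among the standard integer types"
    #  endif
    #endif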

For the same reason, it would have been infeasible to make short 32 bits and int 64 bits: a 16-bit type is necessary for many kinds of processing. (Not to mention the ugly Windows / Java UTF-16 obsession..)

Indeed, I do not think the 16-to-32-bit and 32-to-64-bit transitions are comparable at all. Leaving 16 bits behind meant leaving behind a world in which most of the numbers encountered in ordinary everyday life did not fit in the basic type, and in which hacks such as "far" pointers had to be used to work with non-trivial data sets. By contrast, most applications have only a minimal need for 64-bit types. Large monetary amounts, sizes and offsets of multimedia files, disk capacities, high-end databases, memory-mapped access to large files, and so on are some of the specialized uses that come to mind, but there is no reason to think that a word processor will ever need billions of characters or that a web page will ever need billions of HTML elements. There are simply fundamental differences in how numerical magnitudes relate to the realities of the physical world, the human mind, and so on.
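
To put rough numbers behind that argument, here is a tiny sketch, assuming <stdint.h>, of the ranges involved: everyday quantities routinely exceed 16 bits but rarely come near the 32-bit limit, let alone 64 bits.

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* 32767 is smaller than a typical annual salary or the number of
           milliseconds in a minute; 2147483647 comfortably covers both. */
        printf("16-bit max: %d\n", (int)INT16_MAX);
        printf("32-bit max: %ld\n", (long)INT32_MAX);
        printf("64-bit max: %lld\n", (long long)INT64_MAX);
        return 0;
    }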

+3
