64-bit detection in C using size_t

Is sizeof(size_t)==8 equivalent to being a 64-bit platform? Conversely, is sizeof(size_t)==4 equivalent to being a 32-bit platform?

More importantly, is this test safe and reliable under any circumstances, bearing in mind the portability of the OS and compilers? Are there any weird corner cases, including potential situations where size_t might be missing?

I'm a little worried that size_t is only guaranteed in C99 environments.

+8
c portability 64bit
6 answers

size_t is a data type that can represent the size of any object.

64-bit usually refers to the fact that 64-bits are available for accessing virtual memory. In C, memory is addressed using pointers. Thus, sizeof(void*) seems more appropriate for testing a 64-bit environment.

However, even this is not guaranteed by standard C. There may well be obscure cases where no safe and reliable way to determine the hardware architecture from within C exists.

Since sizeof returns sizes as a multiple of the size of char, you can look at CHAR_BIT (defined in limits.h) to find out how many bits a char contains.

+5

In practice, yes, it is safe and reliable. The platforms you are likely to target now or in the future are all byte-addressed, with 8-bit bytes and size_t equal in width to the machine word. Many platforms guarantee that this will remain true indefinitely (POSIX guarantees it, for example).

Theoretically, no, it is not safe and reliable. Exotic systems like the Cray-1, the PDP-10, and various DSPs will trip you up. But keep the following in mind: what are the odds you will ever write software for a Cray-1, a machine that was obsolete before the junior engineer sitting next to you was born?

+8

More importantly, is this test safe and reliable under any circumstances, bearing in mind the portability of the OS and compilers?

There is no portable way to test this, because the C standard allows the implementation to define SIZE_MAX as large as it wants (as long as it is at least 65535). And standard C does not define what "32-bit" and "64-bit" platforms are.

However, on common memory models, size_t is 32 bits wide on 32-bit platforms and 64 bits wide on 64-bit platforms.

I'm a little worried that size_t is only guaranteed in C99 environments.

size_t exists in C89 as well. So as long as your environment conforms to the standard, it will define size_t .

+3

The real question is: what are you going to do if neither sizeof(size_t) == 8 nor sizeof(size_t) == 4 is true? You may also want to verify that CHAR_BIT == 8 - this avoids surprises when the platform is something other than "common, modern hardware", e.g. the Cray-1 and PDP-10 mentioned elsewhere.

At some point in the future I predict that 128-bit machines will exist, but I also think that is far off, because at present typical 64-bit machines only use 48 of the 64 available bits for memory addresses ( size_t is about memory addressing). The unused top 16 bits would give us 65536 times more address space than the current limit, so we have a long way to go before addresses run out. [Never mind the current cost of assembling a machine with 256 TB of memory, which is the current limit - most systems do not come close to that in disk space, let alone RAM. Rough calculation: 2^42 bytes is 4 TB, so 2^48 bytes is 64 times that, i.e. 256 TB.]

0

CHAR_BIT has already been mentioned, and in theory size_t may contain padding bits that do not contribute to the value. However, you can safely and portably determine the exact number of value bits of an unsigned integer type at run time:

    size_t s;
    int num_of_bits;
    /* Shift a 1 bit left until the value wraps to zero, counting the shifts. */
    for (s = 1, num_of_bits = 0; s != 0; s <<= 1, num_of_bits++)
        ;

If you need this information at compile time, it might be a good idea to #define SIZEOF_BITS 32 (or whatever the correct value is for your platform) and check in your unit tests that it matches the actual num_of_bits. You do have unit tests, right?

0

Checking the pointer size works, but not at run time! The value of sizeof is fixed at compile time. So if you distribute source code, the check works; but if you distribute a binary, it is not reliable, because it reports the platform the binary was compiled for, not the one it is running on.

-1
