Are there any pitfalls in using char * to write cross-platform code that performs raw memory access?
UPDATE: For example, before casting a char * to a pointer to a specific type (for example, int *) and dereferencing it, should you check whether the address is properly aligned for that type? Will some architectures produce strange results for an unaligned access?
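To make the update concrete, this is roughly the kind of check I have in mind (just a sketch; the helper names are mine, and _Alignof assumes C11):

    #include <stdint.h>   /* uintptr_t */
    #include <string.h>   /* memcpy */

    /* Hypothetical helper: is p suitably aligned to be read as an int? */
    static int aligned_for_int(const char *p)
    {
        return ((uintptr_t)p % _Alignof(int)) == 0;
    }

    /* Read an int out of a raw byte buffer. */
    static int read_int(const char *p)
    {
        int value;
        if (aligned_for_int(p)) {
            value = *(const int *)p;          /* cast-and-dereference path */
        } else {
            memcpy(&value, p, sizeof value);  /* works for any alignment */
        }
        return value;
    }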
I am working on a replacement memory allocator to better understand how to debug memory problems. I came to the conclusion that char * is preferable because it allows pointer arithmetic and can be cast to and from void *; is that true? Are the following assumptions acceptable on common platforms?
    sizeof(char) == 1
    sizeof(char *) == sizeof(void *)
    sizeof(char *) == sizeof(size_t)
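These are the properties I would like to rely on; a minimal sketch of how I would assert them at compile time (assuming the C11 static_assert macro is available):

    #include <assert.h>   /* static_assert macro (C11) */
    #include <stddef.h>   /* size_t */

    static_assert(sizeof(char) == 1, "guaranteed by the standard");
    static_assert(sizeof(char *) == sizeof(void *), "assumption: same size as void *");
    static_assert(sizeof(char *) == sizeof(size_t), "assumption: pointer fits in size_t");

As far as I can tell, sizeof(char) == 1 is guaranteed by the standard, so it is really the last two assumptions I am asking about.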
Ubermongoose