Is the subtraction behavior of two NULL pointers defined?

Is the difference of two pointer variables (in C99 and/or C++98) defined if they are both NULL?

For example, let's say I have a buffer structure that looks like this:

 struct buf { char *buf; char *pwrite; char *pread; } ex; 

Let's say ex.buf points to an array or some malloc'ed memory. If my code always guarantees that pwrite and pread point into this array or one element past its end, then I am confident that ex.pwrite - ex.pread is always defined. But what if pwrite and pread are both NULL? Can I just expect the subtraction of the two to be defined as (ptrdiff_t)0, or does strictly conforming code need to check the pointers for NULL? Note that the only case that interests me is when both pointers are NULL (which represents an uninitialized buffer). The reason is a fully conforming "available" function, given that the previous assumptions hold:

 size_t buf_avail(const struct buf *b) { return b->pwrite - b->pread; } 
+73
c c++ c99 c89
Nov 14 '11 at 21:11
4 answers

In C99, this is technically undefined behavior. C99 §6.5.6 states:

7) For the purposes of these operators, a pointer to an object that is not an element of an array behaves the same as a pointer to the first element of an array of length one with the type of the object as its element type.

[...]

9) When two pointers are subtracted, both shall point to elements of the same array object, or one past the last element of the array object; the result is the difference of the subscripts of the two array elements. [...]

And in §6.3.2.3 / 3 it says:

An integer constant expression with the value 0, or such an expression cast to type void *, is called a null pointer constant. 55) If a null pointer constant is converted to a pointer type, the resulting pointer, called a null pointer, is guaranteed to compare unequal to a pointer to any object or function.

Since a null pointer compares unequal to a pointer to any object, it violates the precondition of 6.5.6/9, so the behavior is undefined. But in practical terms, I'd bet that almost every compiler will return 0 without any harmful side effects.

In C89, this is also undefined behavior, although the wording of the standard is slightly different.

C++03, on the other hand, does have defined behavior in this case. The standard makes a special exception for subtracting two null pointers. C++03 §5.7/7 states:

If the value 0 is added to or subtracted from a pointer value, the result compares equal to the original pointer value. If two pointers point to the same object or both point one past the end of the same array or both are null, and the two pointers are subtracted, the result compares equal to the value 0 converted to the type ptrdiff_t.

C++11 (as well as the latest C++14 draft, n3690) has the same wording as C++03, with the slight change of std::ptrdiff_t in place of ptrdiff_t.

+93
Nov 14 '11 at

I found this in the C ++ standard (5.7 [expr.add] / 7):

If two pointers [...] both are null, and the two pointers are subtracted, the result compares equal to the value 0 converted to the type std::ptrdiff_t

As others have said, C99 only permits addition and subtraction between pointers into the same array object. NULL does not point to a valid object, so you cannot use it in a subtraction.

+35
Nov 14 '11 at 21:22

Edit: this answer is only valid for C; I did not see the C++ tag when I answered.

No. Pointer arithmetic is allowed only on pointers that point into the same object. Since, by the C standard's definition, a null pointer does not point to any object, the behavior is undefined.

(Although I would guess that any reasonable compiler will just return 0, who knows.)

+23
Nov 14 '11 at

The C Standard imposes no requirements on the behavior, but many implementations do specify the behavior of pointer arithmetic in many cases beyond the minimum the Standard requires, including the subtraction of one null pointer from another. On very few architectures would there be any reason for such a subtraction to do anything other than yield zero, but, unfortunately, it is now fashionable for compilers, in the name of "optimization", to require programmers to manually write code for corner cases that platforms previously handled for them. For example, if code that should output n characters starting at address p is written as:

 void out_characters(unsigned char *p, int n)
 {
     unsigned char *end = p+n;
     while(p < end)
         out_byte(*p++);
 }

older compilers would generate code that reliably outputs nothing, with no side effects, when p == NULL and n == 0, without any need to special-case n == 0. Newer compilers, however, would require the extra code:

 void out_characters(unsigned char *p, int n)
 {
     if (n) {
         unsigned char *end = p+n;
         while(p < end)
             out_byte(*p++);
     }
 }

which the optimizer may or may not be able to get rid of. Failing to include the extra code may cause some compilers to conclude that, since p "cannot be null", any subsequent null-pointer checks can be omitted, causing the code to break at a spot unrelated to the real problem.

+1
Apr 12 '16 at 7:25


