Why / how can NSUInteger return a NEGATIVE number?

What is the point of having a separate unsigned type, aka NSUInteger , if there is no guarantee (indeed, it seems unlikely) that you can bet your bottom dollar on what the name implies: an inherently non-negative result?

 NSUInteger normal = 5;
 NSUInteger freaky = normal - 55;
 NSLog(@"%ld, %ld", normal, freaky);

NSLog output: 5, -50

Of course, I can bend over backwards trying to get zero or some normalized value ...

 NSUInteger nonNeg = (((normal - 55) >= 0) ? (normal - 55) : 0); 

Parallel-universe log output: 5, -50

But here the compiler rightly complains that the comparison of an unsigned expression >= 0 is always true, and there it is: an answer I did not want or expect. Someone slap my face, hand me a drink, tell me what year it is ... or better ... tell me how to do this; or, you know, why I shouldn't.

+4
2 answers

%ld tells NSLog to print it as a signed integer. Try %lu instead.

See Two's complement on Wikipedia for an explanation of what happens at the bit level.

What happens here is that the subtraction wraps around: an unsigned integer cannot represent a negative value, so the result is reduced modulo 2^64. To protect against this, you need to check before performing the subtraction.

 NSUInteger x = 5;
 NSUInteger y = 55;

 // If 0 makes sense in your case
 NSUInteger result = (x >= y) ? (x - y) : 0;

 // If it should be an error
 if (x < y) {
     // Report error
 }
+7

The real answer is about how the bits are interpreted. It is certainly important to understand how everything works internally via two's complement, but the confusion seems to be more that you see a negative number when using %ld and think that this is somehow different from the positive number you see when using %lu . Both cases use the same bits; it is just a question of how they are interpreted. You would get different results again if you tried to interpret the same bits as a sequence of characters.

0
