I want alwaysPositive to be assigned a positive number for all possible values of largeValue1 and largeValue2 (they are each at least 1).
The following statement can cause an integer overflow:
int alwaysPositive = (largeValue1 + largeValue2) / 2;
I know that I can prevent it by subtracting and adding:
int alwaysPositive = largeValue1 + ((largeValue2 - largeValue1) / 2);
But in other programming languages, I can use an unsigned bit shift to do the trick:
int alwaysPositive3 = (largeValue1 + largeValue2) >>> 1;
How can I do this in C#?
The answers below all solve the problem. There are probably many ways to do this, but they all (including my solutions) have one thing in common: they all look confusing.
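For reference, one common approach (a sketch under my own assumptions, not taken from any particular answer) is to cast the operands to uint, so that the right shift operates on an unsigned value and behaves like Java's >>> for this case:

```csharp
using System;

class MidpointDemo
{
    // Computes (a + b) / 2 without intermediate signed overflow,
    // assuming both a and b are non-negative ints: the uint sum
    // of two values <= int.MaxValue always fits in 32 unsigned bits.
    public static int Midpoint(int a, int b)
    {
        return (int)(((uint)a + (uint)b) >> 1);
    }

    static void Main()
    {
        // Would overflow with plain (a + b) / 2:
        Console.WriteLine(Midpoint(int.MaxValue, int.MaxValue)); // 2147483647
        Console.WriteLine(Midpoint(1, int.MaxValue));            // 1073741824
    }
}
```

The cast-shift-cast sequence is arguably just as confusing as the subtract-and-add form, but it avoids the signed overflow for the same reason the >>> version does: the addition is performed in an unsigned type wide enough to hold the sum.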