Check this answer here
The problem is that you are using const.
When the compiler sees a constant, it treats it just like a literal, as if you had written the number directly into the code. Since the value 1 fits in the uint32 range, the constant is implicitly converted to uint, and the subtraction is performed as uint - uint. 1u - 2u then wraps around to +4,294,967,295 (0xFFFFFFFF).
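For example, a minimal sketch of the situation being described (the variable names are hypothetical, assuming a C# setup like the one in the question):

    using System;

    const int x = 1;  // compile-time constant
    uint y = 2;
    var z = x - y;    // x converts implicitly to uint, so this is uint - uint

    Console.WriteLine(z);           // 4294967295
    Console.WriteLine(z.GetType()); // System.UInt32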
The compiler is allowed to treat literals (and constants) differently from other variables. Because a const can never change, the compiler can make guarantees it otherwise could not. In this case it can guarantee that 1 is within the uint range, so it can apply the implicit conversion to uint. Under normal conditions (without const), it cannot make that guarantee, so no implicit conversion to uint is available.
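For contrast, here is a sketch of the same expression without const (again with hypothetical names): since no constant conversion to uint is available, overload resolution falls back to the long - long operator, and you get the signed result you probably expected:

    using System;

    int x = 1;        // ordinary variable, not a constant
    uint y = 2;
    var z = x - y;    // both operands are promoted to long

    Console.WriteLine(z);           // -1
    Console.WriteLine(z.GetType()); // System.Int64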
A signed int ranges from -2,147,483,648 (0x80000000) to +2,147,483,647 (0x7FFFFFFF).
An unsigned int ranges from 0 (0x00000000) to +4,294,967,295 (0xFFFFFFFF).
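You can confirm these limits from the framework's own constants (a minimal check using the standard int and uint members):

    using System;

    Console.WriteLine($"int:  {int.MinValue} .. {int.MaxValue}");   // -2147483648 .. 2147483647
    Console.WriteLine($"uint: {uint.MinValue} .. {uint.MaxValue}"); // 0 .. 4294967295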
The moral of the story: be careful when mixing const and var; you can get results you do not expect.
Jared Wadsworth