I am a little confused by how `long` works in C.
If I print the maximum `long` value in Java, I get a number in the quintillions. If I ask for it in C, signed or unsigned, it's only in the billions.
Java is built on C ... so where does the difference come from?
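
For reference, here is roughly how I checked the C side (a reconstruction of what I ran; the output in the comments is from a platform where `long` is 32 bits):

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* On a platform where long is 32 bits, both of these
       come out in the billions, not the quintillions. */
    printf("LONG_MAX  = %ld\n", LONG_MAX);   /* 2147483647 here */
    printf("ULONG_MAX = %lu\n", ULONG_MAX);  /* 4294967295 here */
    return 0;
}
```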
I also tried representing literals as `long long`, `unsigned long` / `signed long`, and `long int` values. None of them seems to handle multi-billion numbers. Why is that? Am I mistaken?
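
Here is roughly what the literal experiments looked like (reconstructed from memory; the 5-billion test value is just an example, the exact literals I tried may have differed):

```c
#include <stdio.h>

int main(void) {
    /* the same multi-billion literal assigned to each type */
    long long a = 5000000000LL;    /* long long, with LL suffix */
    unsigned long b = 5000000000;  /* unsigned long, no suffix */
    long int c = 5000000000;       /* long int, no suffix */
    printf("a = %lld\nb = %lu\nc = %ld\n", a, b, c);
    return 0;
}
```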