While doing some debugging in the Immediate window, I came across the following behavior, which I have simplified for this question. Executing this command:
?20000*2
raises an overflow error. I suppose this is because we did not declare a data type, so the VBE treats both literals as Integer, and the result goes beyond the bounds of a signed Integer, hence the overflow.
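For what it's worth, forcing one operand to Long makes the error go away (the & suffix below is VBA's Long type-declaration character, and CLng is the built-in Long conversion function):

?20000& * 2
?CLng(20000) * 2

Both print 40000 without complaint.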
However, if I run:
?39999+1
I get 40000 as expected.
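Checking with the built-in TypeName function, which reports the type VBA assigns to an expression, seems to confirm that the two starting literals are typed differently:

?TypeName(20000)
Integer
?TypeName(39999)
Long
?TypeName(39999+1)
Long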
Is it because in the second case I started with a Long instead of an Integer (i.e. 39,999 vs 20,000)? And why is the memory allocated based on the original input rather than on the result of the calculation?
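For completeness, the same behavior reproduces outside the Immediate window in a standard module (the procedure and variable names are mine, just for illustration):

Sub OverflowDemo()
    Dim result As Long
    ' result = 20000 * 2    ' still raises run-time error 6: the right-hand
    '                       ' side is evaluated in Integer arithmetic before
    '                       ' the assignment to the Long variable happens
    result = 20000& * 2     ' Long arithmetic from the start
    Debug.Print result      ' prints 40000
End Sub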