Why does C# treat 0 as an int, and not as short / decimal, etc.?

In C#, when I write:

var s = 0;

what will the type of s be? It turns out to be Int32. It seems that var never infers short or similar types (?).

+5
4 answers

According to the C# language specification, specifically §2.4.4.2 on integer literals:

The type of an integer literal is defined as follows:

  • If the literal has no suffix, it has the first of these types in which its value can be represented: int, uint, long, ulong.
  • If the literal is suffixed by U or u, it has the first of these types in which its value can be represented: uint, ulong.
  • If the literal is suffixed by L or l, it has the first of these types in which its value can be represented: long, ulong.
  • If the literal is suffixed by UL, Ul, uL, ul, LU, Lu, lU, or lu, it has the type ulong.

0 has no suffix, so it takes the first type in that list that can represent its value: int (Int32 in BCL terms), and that is what var infers.

You can use a suffix to get a different type: for example, 0u is inferred as uint. There is no suffix for short; instead you cast: (short)0 is a short, as in the sketch below.
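As a quick sanity check (a minimal sketch, not part of the original answer), printing the runtime types of a few literals confirms the rule:

using System;

class LiteralTypes
{
    static void Main()
    {
        Console.WriteLine(0.GetType());          // System.Int32  - no suffix, fits in int
        Console.WriteLine(0u.GetType());         // System.UInt32 - u suffix
        Console.WriteLine(0L.GetType());         // System.Int64  - L suffix
        Console.WriteLine(0UL.GetType());        // System.UInt64 - UL suffix
        Console.WriteLine(((short)0).GetType()); // System.Int16  - explicit cast; there is no short suffix
    }
}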

+5

This has also been asked on StackOverflow. The literal suffix determines which type the compiler infers:

var i = 0;   // int
var d = 0d;  // double
var f = 0f;  // float
var l = 0L;  // long
var m = 0m;  // decimal
+5

From the C# specification (§2.4.4.2):

If the literal has no suffix, it has the first of these types in which its value can be represented: int, uint, long, ulong.

There is no literal suffix for short, so you have to cast explicitly:

var s = (short)0;
+3

C# infers Int32 because an unsuffixed integer literal defaults to int. There are no literal suffixes for short or byte, so you have to cast or declare the type explicitly, as in the sketch below.
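A minimal sketch of that (the variable names are illustrative only):

using System;

class ShortAndByte
{
    static void Main()
    {
        var s = (short)0;  // the cast makes var infer short (Int16)
        var b = (byte)0;   // the cast makes var infer byte (Byte)
        short s2 = 0;      // declaring the type explicitly also works: the constant 0 converts implicitly
        Console.WriteLine($"{s.GetType()} {b.GetType()} {s2.GetType()}");
    }
}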

If you want the compiler to infer decimal, you can write:

var s = 0M;
+2
