Why doesn't C# require an explicit cast to convert long to double?

First, sorry for my poor English. I have a piece of code:

 long x = 9223372036854775807L;
 double f = x;
 Console.WriteLine(x);
 Console.WriteLine(f);

Output:

 9223372036854775807
 9,22337203685478E+18

The code compiles and runs without errors, yet precision is lost when converting long to double. Why doesn't C# require an explicit cast in this case?

Thanks to everyone.

Tags: c#, casting
3 answers

The language has an implicit conversion built into it.

The documentation provides the following table of implicit numeric conversions, so you are allowed to assign the value without an explicit cast:

 From     To
 ===============================================================================
 sbyte    short, int, long, float, double, or decimal
 byte     short, ushort, int, uint, long, ulong, float, double, or decimal
 short    int, long, float, double, or decimal
 ushort   int, uint, long, ulong, float, double, or decimal
 int      long, float, double, or decimal
 uint     long, ulong, float, double, or decimal
 long     float, double, or decimal
 char     ushort, int, uint, long, ulong, float, double, or decimal
 float    double
 ulong    float, double, or decimal

And the documentation says (emphasis mine):

Precision, but not magnitude, can be lost in conversions from int, uint, long, or ulong to float, and from long or ulong to double.
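To make that quoted sentence concrete: near long.MaxValue, representable double values are spaced 1024 apart, so distinct long values collapse to the same double while the overall magnitude survives. A minimal sketch (the chosen offset of 100 is just an illustration):

```csharp
using System;

class PrecisionDemo
{
    static void Main()
    {
        long a = long.MaxValue;        // 9223372036854775807
        long b = long.MaxValue - 100;  // a different long value

        double da = a;  // implicit conversion; rounds to the nearest double
        double db = b;

        // Precision is lost: two distinct longs become the same double,
        // because a double's 52-bit mantissa cannot hold all 19 digits.
        Console.WriteLine(da == db);   // True

        // Magnitude is not lost: the value is still about 9.2E+18.
        Console.WriteLine(da);
    }
}
```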


Try this instead:

 long x = 9223372036854775807L;
 decimal f = x;
 Console.WriteLine(x);
 Console.WriteLine(f);
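Some context this answer leaves unstated: decimal has a 96-bit integer significand (roughly 28–29 decimal digits), so every long value fits exactly, and the long-to-decimal conversion is also implicit. A quick round-trip check:

```csharp
using System;

class DecimalDemo
{
    static void Main()
    {
        long x = 9223372036854775807L;   // long.MaxValue, 19 digits
        decimal f = x;                   // implicit and exact: decimal's
                                         // 96-bit significand holds all 19 digits

        Console.WriteLine(f);            // 9223372036854775807
        Console.WriteLine((long)f == x); // True: the value round-trips exactly
    }
}
```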

Console.WriteLine is overloaded; you are calling two different static overloads of the Console class:

  • Console.WriteLine(System.Int64 value);
  • Console.WriteLine(System.Double value);

It has nothing to do with casting (explicit or implicit).
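To illustrate the overload-resolution point this answer is making, here is a sketch with a hypothetical helper (Describe is not a real Console member, just a stand-in with the same two parameter types as the overloads above):

```csharp
using System;

class OverloadDemo
{
    // Hypothetical stand-ins for the two Console.WriteLine overloads above.
    public static string Describe(long value)   => "Int64 overload";
    public static string Describe(double value) => "Double overload";

    static void Main()
    {
        long x = 9223372036854775807L;
        double f = x;   // the implicit long-to-double conversion happens here

        Console.WriteLine(Describe(x)); // Int64 overload: exact match wins
        Console.WriteLine(Describe(f)); // Double overload
    }
}
```

Note that the precision loss in the question still comes from the assignment `double f = x;`, before overload resolution ever runs.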


