I'm having a problem with DecimalConverter and Int32Converter, which seem to return conflicting results, as evidenced by the following simple console program:
using System;
using System.ComponentModel;

class App
{
    static void Main()
    {
        var decConverter = TypeDescriptor.GetConverter(typeof(decimal));
        Console.WriteLine("Converter: {0}", decConverter.GetType().FullName);
        Console.WriteLine("CanConvert from int to decimal: {0}", decConverter.CanConvertFrom(typeof(int)));
        Console.WriteLine("CanConvert to int from decimal: {0}", decConverter.CanConvertTo(typeof(int)));
        Console.WriteLine();

        var intConverter = TypeDescriptor.GetConverter(typeof(int));
        Console.WriteLine("Converter: {0}", intConverter.GetType().FullName);
        Console.WriteLine("CanConvert from int to decimal: {0}", intConverter.CanConvertTo(typeof(decimal)));
        Console.WriteLine("CanConvert to int from decimal: {0}", intConverter.CanConvertFrom(typeof(decimal)));
    }
}
The output of this program is as follows:
Converter: System.ComponentModel.DecimalConverter
CanConvert from int to decimal: False
CanConvert to int from decimal: True

Converter: System.ComponentModel.Int32Converter
CanConvert from int to decimal: False
CanConvert to int from decimal: False
Unless I misunderstand TypeConverters, the following should hold:
TypeDescriptor.GetConverter(typeof(TypeA)).CanConvertFrom(typeof(TypeB))
should give the same result as
TypeDescriptor.GetConverter(typeof(TypeB)).CanConvertTo(typeof(TypeA))
At least in the case of System.Int32 and System.Decimal they do not.
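One possible explanation for the asymmetry (an assumption on my part, based on the observed results): the built-in numeric converters appear to gate their conversions on whether the other type is a CLR primitive, and `decimal`, unlike `int`, is not a primitive type. A minimal sketch demonstrating that difference:

```csharp
using System;

class PrimitiveCheck
{
    static void Main()
    {
        // int is a CLR primitive type, but decimal is not.
        // This would explain why DecimalConverter.CanConvertTo(typeof(int))
        // returns True while Int32Converter.CanConvertTo(typeof(decimal))
        // returns False.
        Console.WriteLine(typeof(int).IsPrimitive);     // True
        Console.WriteLine(typeof(decimal).IsPrimitive); // False
    }
}
```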
My question is: does anyone know if this is by design? Or are the TypeConverters for the built-in numeric types in C# really broken?
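For what it's worth, a practical workaround that sidesteps TypeConverter entirely: both `int` and `decimal` implement `IConvertible`, so `System.Convert` handles these numeric conversions in both directions. A sketch:

```csharp
using System;

class ConvertWorkaround
{
    static void Main()
    {
        // Convert.ChangeType goes through IConvertible, which both
        // int and decimal implement, so both directions succeed even
        // though the TypeConverters report they cannot convert.
        decimal d = (decimal)Convert.ChangeType(42, typeof(decimal));
        int i = (int)Convert.ChangeType(42.0m, typeof(int));
        Console.WriteLine("{0} {1}", d, i); // 42 42
    }
}
```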
rossipedia