Why doesn't this abuse of generics cause an ambiguity / compiler error?

I was feeling my way around the limits of the C# compiler with regard to inherited instantiations of generic classes.

Anyway, this is my test case:

class Program
{
    static void Main(string[] args)
    {
        var x = new InClass();
        Console.WriteLine(x.Test(10)); // prints "foo"
        Console.ReadLine();
    }
}

class BaseClass<Foo, Bar>
{
    public virtual Foo Test(Bar b) { return default(Foo); }
    public virtual string Test(int b) { return "foo"; }
}

class InClass : BaseClass<string, int>
{
    /*public override string Test(int b) { return "bar"; }*/
}

I would think that this InClass declaration would cause a compiler error, as it makes Test ambiguous. It also makes it impossible to call the non-generic Test from within InClass. Notice that InClass also contains some commented-out code; if I uncomment that code, I get a compiler error.

Is there any mention of this behavior in the C# spec, or is it an edge case that was never considered?

+4
3 answers

I would think that this InClass declaration would cause a compiler error, as this makes Test ambiguous.

Nope. The specification calls this out explicitly in section 7.5.3.6:

While signatures as declared must be unique, it is possible that substitution of type arguments results in identical signatures. When this happens, the tie-breaking rules of overload resolution above will pick the most specific member.

The following examples show overloads that are valid and invalid according to this rule.

(Examples follow, obviously.)

So the language designers did consider this, but presumably the alternatives seemed worse. (It would be awkward not to be able to create a class like InClass at all, even if you never wanted to call Test, for example.)
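As a quick way to see that tie-break in action, here is a self-contained sketch (the Helper class and the CallGenericOverload name are illustrative additions, not from the question or the spec) showing that the "losing" Test(Bar) overload is still reachable from a context where Bar is an open type parameter:

using System;

class BaseClass<Foo, Bar>
{
    public virtual Foo Test(Bar b) { return default(Foo); }
    public virtual string Test(int b) { return "foo"; }
}

class InClass : BaseClass<string, int> { }

static class Helper
{
    // Inside this method Bar is still an open type parameter, so overload
    // resolution can only pick Test(Bar): Test(int) is not applicable because
    // an unconstrained Bar has no implicit conversion to int.
    public static Foo CallGenericOverload<Foo, Bar>(BaseClass<Foo, Bar> c, Bar b)
    {
        return c.Test(b);
    }
}

class Demo
{
    static void Main()
    {
        var x = new InClass();
        Console.WriteLine(x.Test(10) ?? "<null>");                        // "foo": Test(int) wins the tie-break
        Console.WriteLine(Helper.CallGenericOverload(x, 10) ?? "<null>"); // "<null>": Test(Bar) returns default(string)
    }
}

In other words, the more specific Test(int) wins whenever both overloads are applicable, but nothing stops the generic Test(Bar) from being bound in a context where Bar is not yet known to be int.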

+7

See this question:

Why are there no generic type inheritance / hierarchy restrictions?

Eric Lippert examines the potential consequences in detail.

Inherited generic instantiations like this are not flagged by the compiler because, in the designers' view, that rabbit hole was not worth going down.

+1

I assume that a method with an (int b) signature is a better match for an int argument than one whose parameter is the generic type Bar, even when Bar happens to be int in this particular case.
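A compact way to check that intuition is a stripped-down sketch (the Pair<T> class and Describe names are hypothetical, not from the question):

using System;

class Pair<T>
{
    public string Describe(T value)   { return "T overload"; }   // parameter declared as the type parameter
    public string Describe(int value) { return "int overload"; } // parameter declared as a concrete int
}

class TieBreakDemo
{
    static void Main()
    {
        var p = new Pair<int>();
        // After substituting T = int both overloads have the signature Describe(int),
        // but the tie-breaking rules treat the member declared with int as more
        // specific than the one declared with the type parameter T.
        Console.WriteLine(p.Describe(5)); // prints "int overload"
    }
}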

0
