Scala type inference for both a generic type and its type parameter - why doesn't it work?

If I had to name one of the most annoying things in Scala, it would be the following code:

trait G[+T]
class H[+T] extends G[T]

def f[A <: G[X], X <: Int](g: A)

val g: H[Int]
f(g)

the compiler infers the type parameters of the last call as f[H[Int], Nothing] and then complains to me about its own choice.

Knowing Scala, though, it presumably really does know better than I do. What is the reason for this? Since both G and H are covariant in T, S <: G[X] with H[_] <=> S <: H[X] for any type S. This one drawback forces me to design everything around avoiding the need to specify types explicitly. It may not look like much here, but when the names become "real", the signatures get long, and almost every method is generic, often working on two generic types, it turns out that most of the code is type declarations.
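
For illustration, a sketch assuming the declarations above: the call does compile once both type arguments are written out, which is exactly the boilerplate I want to avoid:

f[H[Int], Int](g) // compiles, but only because A and X are both given explicitly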

EDIT: The case described above was solved by Noah below, but what about when the derived class does not take the same type parameters as the base class, as shown below?

trait G[+X]
class H[+X, Y] extends G[X]
class F extends G[Int]
def f[A <: G[X], X <: Int](g: A) = g

val h: H[Int, String] = ???
val g: F = ???
f(g)
f(h)
1 answer

If you make A take a type parameter, A[_], I think you can get the Scala compiler to agree with you and not just infer Nothing for everything:

def f[A[_] <: G[_], X <: Int](g: A[X]) 
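
For reference, a minimal sketch of the same signature applied to the original single-parameter hierarchy from the question (same G and H as above); both type arguments should now be inferred at the call site:

trait G[+T]
class H[+T] extends G[T]

def f[A[_] <: G[_], X <: Int](g: A[X]): A[X] = g

val g: H[Int] = new H[Int]
f(g) // A is inferred as H and X as Int, so no explicit type arguments are needed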

As a side note, I usually look at the scalaz source whenever I have a type problem like this, since they have usually run into it already and solved it about as well as it can be solved.

UPDATE

The method above still works with the additional constraints:

  trait G[+X]
  class H[+X, Y] extends G[X]
  class F extends G[Int]
  class I extends G[String]

  def f[A[_] <: G[_], X <: Int](g: A[X]) = g

  val h: H[Int, String] = new H[Int, String]
  val g: F = new F
  val i: I = new I
  f(g) // works
  f(h) // works
  f(i) // should fail and does fail
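
A possible simplification, sketched here under the same declarations as above: if the concrete subtype A does not need to be preserved in the result, a plain upper-bounded G[X] parameter also lets X be inferred, because G is covariant:

  def f2[X <: Int](g: G[X]): G[X] = g // sketch: returns G[X] instead of the concrete A

  f2(g) // X inferred as Int
  f2(h) // X inferred as Int, because H[Int, String] <: G[Int]
  f2(i) // should fail: X would have to be String, which is not <: Int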