Why did Haskell infer a particular type (apparently) inconsistently?

I am in the process of writing a library of vector toys for physics, and I am running into a problem: GHC insists my functions must have Integer in their type. I want vectors to multiply vectors as well as scalars (just using * ), and although this worked by making Vector an instance of Num , I now get type errors:

 Couldn't match expected type `Integer' with actual type `Double' 

After playing with the code to catch the problem, I got this:

 {-# LANGUAGE DeriveFunctor #-}

 data V a = V a a a deriving (Show, Eq, Functor)

 scale a (V i j k) = V (a*i) (a*j) (a*k)

 (<.>) = scale

Now, if we ask GHCi for the types, we get:

 > :t scale
 scale :: Num a => a -> V a -> V a
 > :t (<.>)
 (<.>) :: Integer -> V Integer -> V Integer

We, of course, do not want <.> to act only on Integer s. Although this can be fixed by giving <.> an explicit type signature, I would like to know what is actually happening.
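For reference, the signature-based fix mentioned above looks like this: a minimal sketch where the explicit polymorphic signature keeps the binding from being monomorphised.

```haskell
{-# LANGUAGE DeriveFunctor #-}

data V a = V a a a deriving (Show, Eq, Functor)

scale :: Num a => a -> V a -> V a
scale a (V i j k) = V (a*i) (a*j) (a*k)

-- The explicit signature overrides the monomorphism restriction,
-- so (<.>) stays polymorphic even as a bare pattern binding.
(<.>) :: Num a => a -> V a -> V a
(<.>) = scale

main :: IO ()
main = print (2.5 <.> V 1 2 3 :: V Double)
```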

1 answer

You have run into the infamous monomorphism restriction. Because (<.>) = scale is a bare pattern binding with no arguments and no type signature, the restriction forces it to be monomorphic, and GHCi's defaulting then resolves the Num constraint to Integer. One solution is to make the arguments explicit, which turns it into a function binding that the restriction does not apply to:

 a <.> v = scale a v

Or enable the NoMonomorphismRestriction extension (via the -XNoMonomorphismRestriction flag or a {-# LANGUAGE NoMonomorphismRestriction #-} pragma).
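Putting the eta-expanded fix together, a minimal runnable sketch showing that the binding now stays polymorphic and works at more than one numeric type:

```haskell
{-# LANGUAGE DeriveFunctor #-}

data V a = V a a a deriving (Show, Eq, Functor)

scale :: Num a => a -> V a -> V a
scale a (V i j k) = V (a*i) (a*j) (a*k)

-- With an explicit argument this is a function binding, so the
-- monomorphism restriction no longer applies and the inferred
-- type is the polymorphic Num a => a -> V a -> V a.
a <.> v = scale a v

main :: IO ()
main = do
  print (2 <.> V 1 2 3 :: V Int)      -- scaling an Int vector
  print (0.5 <.> V 1 2 3 :: V Double) -- scaling a Double vector
```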

