A change in priority that seemed like an optimization turned out to be a pessimization in Haskell

I wrote two functions to check if a number is prime in Haskell:

    prime :: Int -> Bool
    prime 0 = False
    prime 1 = False
    prime 2 = True
    prime n | even n = False
    prime n = all (\x -> n `rem` x /= 0) [3,5..intSqrt]
        where intSqrt = (floor . sqrt . fromIntegral) n

    prime2 :: Int -> Bool
    prime2 0 = False
    prime2 1 = False
    prime2 n = all (\x -> n `rem` x /= 0) [2..intSqrt]
        where intSqrt = (floor . sqrt . fromIntegral) n

The first version should, on average, do about half the work of the second, because the even candidates are skipped, yet it turns out that the second version, which looks slower, is almost twice as fast! The terminal sessions with timings are shown below.

The first version (prime):

    $ ghc -O2 prime.hs
    [1 of 1] Compiling Main             ( prime.hs, prime.o )
    Linking prime ...
    $ time ./prime
    142913828922

    real    0m4.617s
    user    0m4.572s
    sys     0m0.040s

Now, after changing the program to use the prime2 version:

    $ ghc -O2 prime.hs
    [1 of 1] Compiling Main             ( prime.hs, prime.o )
    Linking prime ...
    $ time ./prime
    142913828922

    real    0m2.288s
    user    0m2.268s
    sys     0m0.020s
    $

The code in main is simple:

    main :: IO ()
    main = print $ sum $ filter prime2 [1..2000000]

Why is the second version faster if it does twice the number of modulus operations?
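To make the "half the calculations" intuition checkable, one could count the rem operations each version actually performs. The following is only a rough sketch of such a count, not part of the original program; the names countTrials, divisionsV1 and divisionsV2 are invented for this illustration, and it assumes the same intSqrt bound as the code above.

    intSqrt :: Int -> Int
    intSqrt = floor . sqrt . fromIntegral

    -- `all` stops at the first divisor it finds, so count the candidates
    -- tried up to and including that divisor.
    countTrials :: Int -> [Int] -> Int
    countTrials n xs = case break (\x -> n `rem` x == 0) xs of
        (tried, [])  -> length tried        -- no divisor found: every candidate was tested
        (tried, _:_) -> length tried + 1    -- stopped at the first divisor

    -- rem operations done by the first version (the `even n` guard hides one rem).
    divisionsV1 :: Int -> Int
    divisionsV1 n
        | n <= 2    = 0
        | even n    = 1
        | otherwise = 1 + countTrials n [3,5..intSqrt n]

    -- rem operations done by the second version.
    divisionsV2 :: Int -> Int
    divisionsV2 n
        | n <= 1    = 0
        | otherwise = countTrials n [2..intSqrt n]

    main :: IO ()
    main = do
        putStrLn ("version 1: " ++ show (sum (map divisionsV1 [1..2000000])))
        putStrLn ("version 2: " ++ show (sum (map divisionsV2 [1..2000000])))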

1 answer

As Daniel noted in the comments, the `even n` guard performs a modulus (rem) operation, just like the divisibility checks in the second version. Moreover, division by 2 is exactly the first case that gets hit in the second version as well. So the second version has fewer case branches to pass through while doing the same modulus work. Remember that Haskell is a non-strict language, so none of the other rem operations are forced once the first test already returns False: for an even n, n `rem` 2 /= 0 is False and all stops right there.
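The laziness point can be made concrete. In GHC's standard library, even is itself defined in terms of rem (roughly even n = n `rem` 2 == 0), and all returns False as soon as one test fails. Here is a minimal sketch of that short-circuiting; firstTrialOnly is an invented name for the illustration:

    -- `all` returns False as soon as one predicate application is False, so for
    -- an even n only n `rem` 2 is computed; the rest of the candidate list is
    -- never forced (here the tail is deliberately an error thunk).
    firstTrialOnly :: Int -> Bool
    firstTrialOnly n = all (\x -> n `rem` x /= 0) (2 : error "tail never forced for even n")

    main :: IO ()
    main = print (firstTrialOnly 10)   -- prints False without touching the error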

