Memory footprint of existentially quantified types and related optimization techniques

Consider the following existential data model:

    {-# LANGUAGE ExistentialQuantification #-}

    import Data.Map (Map)
    import Data.Typeable (TypeRep)

    data Node a = Node a (Map TypeRep AnyNode)

    data AnyNode = forall a. Show a => AnyNode a

The memory footprint rules for ordinary data types have been explained previously. Now, what are the rules for existential types like AnyNode?

Are there any optimization techniques, for example workarounds using unsafeCoerce to escape the existential wrapper? I ask because a type like Node will sit at the core of a memory-intensive library, so the overall memory footprint matters and even the dirtiest hacks are welcome.

1 answer

The ghc-datasize package can be useful here:

    {-# LANGUAGE RankNTypes, GADTs #-}

    import GHC.DataSize

    data Node = forall a. Show a => Node a

    main :: IO ()
    main = do
      s <- closureSize $ Node 0
      print s -- 24 bytes on my 64-bit system

So it seems that Node takes up one extra word compared to a plain unary data constructor, presumably for the pointer to the Show dictionary. I also tried adding further class constraints to Node, and each one adds one more word.
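As a sanity check, here is a minimal sketch of that word counting (the Plain, OneDict and TwoDicts types are made up for illustration, and the expected sizes follow the measurements above on a 64-bit system):

    {-# LANGUAGE ExistentialQuantification #-}

    import GHC.DataSize (closureSize)

    data Plain    = Plain Integer                          -- header + field: 2 words
    data OneDict  = forall a. Show a         => OneDict a  -- + Show dictionary: 3 words
    data TwoDicts = forall a. (Show a, Eq a) => TwoDicts a -- + Eq dictionary: 4 words

    main :: IO ()
    main = do
      closureSize (Plain 0)    >>= print  -- expected: 16 bytes on a 64-bit system
      closureSize (OneDict 0)  >>= print  -- expected: 24
      closureSize (TwoDicts 0) >>= print  -- expected: 32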

I don't know for sure whether it's possible to get rid of the dictionary pointer under some circumstances. I don't think it is possible if you want to keep the type existential.
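To illustrate that tradeoff: if the only thing you ever do with the wrapped value is show it, one hypothetical workaround is to apply the method up front and give up the existential entirely. The ShownNode type below is an assumption for illustration, not part of the question's code:

    -- Hypothetical alternative: precompute the only method we need.
    -- A newtype has no runtime representation, so a ShownNode is just
    -- a pointer to the String: no constructor header, no dictionary
    -- word. The cost is that the original value is no longer
    -- recoverable.
    newtype ShownNode = ShownNode String

    mkShownNode :: Show a => a -> ShownNode
    mkShownNode = ShownNode . show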

