Haskell Ad Package

I want to use the ad automatic differentiation package to optimize the weights of a neural network in Haskell. I found a function that seems to do what I need, but I can't figure out what it expects as its first parameter. It should be the function being optimized, but I don't know in what form. It has this signature:

gradientDescent :: (Traversable f, Fractional a, Ord a) => (forall s. Mode s => f (AD s a) -> AD s a) -> f a -> [f a]

I found that forall s. means something called an existential quantifier, but nothing more. My question is: how can I pass my cost function, which has a signature like cost :: [Double] -> Double (it takes a list of weights), to this library?

1 answer

So, the first argument is a function from a traversable container of AD values to a single AD value. For the traversable container we can use a list, to keep things simple. The function must be polymorphic in the mode s, so just ignore s and don't do anything mode-specific. This function is the thing being optimized. The next argument is the initial guess we pass in; here it will also be a list. And the result is a list of successively better guesses, each one improving on the last with respect to our objective.

Note that AD s a is an instance of Num and Fractional for every mode s whenever a is an instance of Num and Fractional. So just write a polymorphic function from a list of numbers to a single number, pass it along with an initial state, and the library will optimize it for you.
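For example, a cost function written in this style might look like the following. The quadratic cost here is a made-up toy example, not from the question; the point is only the polymorphic type, which mentions the Floating class rather than Double:

```haskell
-- A cost function written polymorphically: it only requires a
-- Floating instance, never Double specifically, so the library can
-- instantiate `a` at its internal AD type and differentiate it.
cost :: Floating a => [a] -> a
cost ws = sum (map (^ 2) ws)
```

When evaluated at plain Doubles it behaves like any ordinary function, e.g. cost [3, 4] is 25; when evaluated at AD values, gradients come for free.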

I.e. don't give your cost function a type fixed to Double; give it a type polymorphic over any Num and Fractional, and let the library take care of the rest!
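Putting it together, a minimal sketch of a full call might look like this (the quadratic cost is again a toy stand-in for a real network's loss; gradientDescent itself is from Numeric.AD in the ad package):

```haskell
import Numeric.AD (gradientDescent)

-- Toy cost: sum of squared weights, polymorphic as described above.
cost :: Floating a => [a] -> a
cost ws = sum (map (^ 2) ws)

main :: IO ()
main = do
  -- Start from the guess [3, 4]. The result is a stream of
  -- successively better weight vectors; take one far along it.
  let iterates = gradientDescent cost [3, 4 :: Double]
  print (last (take 100 iterates))
```

The weights should approach [0, 0], the minimizer of this cost; note that the result list may be lazy and long, so you decide when to stop consuming it.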

You may prefer to get used to this style by first trying out other, more basic functions, such as diff.
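For instance, diff takes the same kind of polymorphic argument and returns the derivative of a one-argument function at a point. A small sketch, assuming the Numeric.AD module from the ad package:

```haskell
import Numeric.AD (diff)

main :: IO ()
main =
  -- Derivative of x^2 + 3x at x = 2 is 2x + 3 = 7.
  print (diff (\x -> x ^ 2 + 3 * x) (2 :: Double))  -- prints 7.0
```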
