How to return the cost and gradient as a tuple for scipy's fmin_cg function

How can I force scipy's fmin_cg to use a single function that returns both the cost and the gradient as a tuple? The problem with passing a separate f for the cost and fprime for the gradient is that I may have to perform a very expensive computation twice, since the cost and gradient calculations share most of their work. Sharing intermediate variables between the two functions is also awkward.

In Matlab, by contrast, fminunc works with a single function that returns the cost and gradient as a tuple. I do not understand why scipy's fmin_cg cannot provide the same convenience.

Thanks in advance...


You can use scipy.optimize.minimize with jac=True. If for some reason that is not an option, you can look at how scipy itself handles this situation internally:
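To make the jac=True route concrete, here is a minimal sketch. The quadratic objective and starting point are illustrative assumptions, not part of the question; the key point is that the single callable returns a (cost, gradient) tuple and minimize evaluates it only once per point.

```python
import numpy as np
from scipy.optimize import minimize

def cost_and_grad(x):
    # One pass computes both: f(x) = sum(x**2), grad = 2*x.
    return np.sum(x**2), 2.0 * x

# jac=True tells minimize that the objective returns (cost, grad).
res = minimize(cost_and_grad, x0=np.array([3.0, -4.0]), jac=True, method="CG")
print(res.x)  # converges toward [0, 0]
```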

```python
import numpy


class MemoizeJac(object):
    """Decorator that caches the value and gradient of a function
    each time it is called."""

    def __init__(self, fun):
        self.fun = fun   # fun(x) returns a (value, gradient) tuple
        self.jac = None
        self.x = None

    def __call__(self, x, *args):
        # Evaluate the function, remembering x and the gradient.
        self.x = numpy.asarray(x).copy()
        fg = self.fun(x, *args)
        self.jac = fg[1]
        return fg[0]

    def derivative(self, x, *args):
        # Reuse the cached gradient if x is unchanged since the last call.
        if self.jac is not None and numpy.all(x == self.x):
            return self.jac
        else:
            self(x, *args)
            return self.jac
```

This class wraps a function that returns both the function value and the gradient, keeping a one-element cache and checking whether it already knows the result for a given x. Usage:

```python
fmemo = MemoizeJac(f)  # f returns a (cost, grad) tuple
xopt = fmin_cg(fmemo, x0, fmemo.derivative)
```

The odd thing about this code is that it assumes f is always called before fprime (though not every call to f need be followed by a call to fprime). I'm not sure scipy.optimize actually guarantees that, but the code is easily adapted to drop the assumption. A more robust version (untested):

```python
import numpy


class MemoizeJac(object):
    """Cache both the value and the gradient, whichever is asked for first."""

    def __init__(self, fun):
        self.fun = fun   # fun(x) returns a (value, gradient) tuple
        self.value, self.jac = None, None
        self.x = None

    def _compute(self, x, *args):
        # Single evaluation fills both caches.
        self.x = numpy.asarray(x).copy()
        self.value, self.jac = self.fun(x, *args)

    def __call__(self, x, *args):
        if self.value is not None and numpy.all(x == self.x):
            return self.value
        else:
            self._compute(x, *args)
            return self.value

    def derivative(self, x, *args):
        if self.jac is not None and numpy.all(x == self.x):
            return self.jac
        else:
            self._compute(x, *args)
            return self.jac
```
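A quick end-to-end check of the robust wrapper with fmin_cg. The shifted quadratic objective is a hypothetical example chosen so the minimizer is known; the class below is the robust version restated so the snippet runs standalone.

```python
import numpy
from scipy.optimize import fmin_cg


class MemoizeJac(object):
    """Cache both the value and the gradient, whichever is asked for first."""

    def __init__(self, fun):
        self.fun = fun
        self.value, self.jac = None, None
        self.x = None

    def _compute(self, x, *args):
        self.x = numpy.asarray(x).copy()
        self.value, self.jac = self.fun(x, *args)

    def __call__(self, x, *args):
        if self.value is not None and numpy.all(x == self.x):
            return self.value
        self._compute(x, *args)
        return self.value

    def derivative(self, x, *args):
        if self.jac is not None and numpy.all(x == self.x):
            return self.jac
        self._compute(x, *args)
        return self.jac


def f(x):
    # Minimum at x = [1, 1]; cost and gradient computed in one pass.
    return numpy.sum((x - 1.0) ** 2), 2.0 * (x - 1.0)


fmemo = MemoizeJac(f)
xopt = fmin_cg(fmemo, numpy.zeros(2), fmemo.derivative, disp=0)
print(xopt)  # converges toward [1, 1]
```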