How can I numerically approximate the Jacobian and Hessian of a function?

I have a function in Python:

    def f(x):
        return x[0]**3 + x[1]**2 + 7  # actually more than this; no analytical expression available

This is a scalar-valued function of a vector argument.

How can I numerically approximate the Jacobian and Hessian of this function using numpy or scipy?

2 answers

(Updated at the end of 2017, since there have been many developments in this space.)

The best option is most likely automatic differentiation. There are now many packages available for this, since it is a standard approach in deep learning:

  • Autograd works transparently with most numpy code. It is pure Python, requires almost no code changes for typical functions, and is reasonably fast (see the sketch after this list).
  • There are many deep-learning-oriented libraries that can do this. Some of the most popular are TensorFlow, PyTorch, Theano, Chainer, and MXNet. Each of them requires you to rewrite your function in its own numpy-like but needlessly different API, and in return gives you GPU support and a bunch of deep learning features that you may or may not need.
  • FuncDesigner is an older package that I have not used; its website is currently unavailable.
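
For example, with Autograd the Jacobian of a scalar-valued function reduces to its gradient, and the Hessian is one more call. A minimal sketch, with the toy f from the question standing in for the real function:

    import autograd.numpy as np           # drop-in numpy wrapper that records operations
    from autograd import jacobian, hessian

    def f(x):
        return x[0]**3 + x[1]**2 + 7      # placeholder for the real, more complex function

    jac_f = jacobian(f)                   # for scalar-valued f this is just the gradient, shape (2,)
    hess_f = hessian(f)                   # matrix of second derivatives, shape (2, 2)

    x = np.array([1.0, 2.0])
    print(jac_f(x))                       # [3*x0**2, 2*x1] -> [3., 4.]
    print(hess_f(x))                      # [[6*x0, 0], [0, 2]] -> [[6., 0.], [0., 2.]]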

Another option is to approximate the derivatives using finite differences, basically just evaluating (f(x + eps) - f(x - eps)) / (2 * eps) (but obviously with more care put into it than that). This is likely to be slower and less accurate than the other approaches, especially in moderately high dimensions, but it is completely general and requires no code changes. numdifftools seems to be the standard Python package for this.
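
To make the finite-difference idea concrete, here is a bare-bones sketch of the technique in plain numpy (this is not how numdifftools implements it; the step sizes eps are hand-picked, and a serious implementation chooses them much more carefully):

    import numpy as np

    def f(x):
        return x[0]**3 + x[1]**2 + 7      # placeholder for the real function

    def approx_gradient(f, x, eps=1e-6):
        """Central-difference gradient of a scalar-valued f at x."""
        x = np.asarray(x, dtype=float)
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = eps
            g[i] = (f(x + e) - f(x - e)) / (2 * eps)
        return g

    def approx_hessian(f, x, eps=1e-4):
        """Central-difference Hessian, obtained by differencing the gradient."""
        x = np.asarray(x, dtype=float)
        n = x.size
        H = np.zeros((n, n))
        for j in range(n):
            e = np.zeros_like(x)
            e[j] = eps
            H[:, j] = (approx_gradient(f, x + e, eps) - approx_gradient(f, x - e, eps)) / (2 * eps)
        return 0.5 * (H + H.T)            # symmetrize to reduce rounding noise

    x = np.array([1.0, 2.0])
    print(approx_gradient(f, x))          # ~ [3., 4.]
    print(approx_hessian(f, x))           # ~ [[6., 0.], [0., 2.]]

With numdifftools you should get the same results without writing any of this yourself, along the lines of nd.Gradient(f)(x) and nd.Hessian(f)(x).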

You can also try to find fully symbolic derivatives with SymPy, but this will be a relatively manual process.
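
If the real f can ever be written as a closed-form expression (which the question says it cannot), the symbolic route looks roughly like this for the toy example:

    import sympy as sp

    x0, x1 = sp.symbols('x0 x1')
    expr = x0**3 + x1**2 + 7                            # symbolic stand-in for f

    grad_expr = [sp.diff(expr, v) for v in (x0, x1)]    # symbolic gradient: [3*x0**2, 2*x1]
    hess_expr = sp.hessian(expr, (x0, x1))              # symbolic Hessian: [[6*x0, 0], [0, 2]]

    # lambdify turns the symbolic results into ordinary numerical functions.
    grad_f = sp.lambdify((x0, x1), grad_expr, 'numpy')
    hess_f = sp.lambdify((x0, x1), hess_expr, 'numpy')
    print(grad_f(1.0, 2.0))                             # [3.0, 4.0]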


Limited to just SciPy, the most convenient way I found is scipy.misc.derivative, called inside the appropriate loops, with lambdas to reduce the function of interest to a single variable.
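
A minimal sketch of that approach (scipy.misc.derivative differentiates a function of a single variable, so each partial derivative needs a small wrapper that holds the other coordinates fixed; note that scipy.misc.derivative is deprecated in newer SciPy releases):

    import numpy as np
    from scipy.misc import derivative      # deprecated in newer SciPy releases

    def f(x):
        return x[0]**3 + x[1]**2 + 7       # placeholder for the real function

    def partial_derivative(f, x, i, dx=1e-6):
        """Partial derivative of f with respect to x[i] at the point x."""
        x = np.asarray(x, dtype=float)
        # Wrap f so that only coordinate i varies; the others stay fixed at x.
        f_i = lambda xi: f(np.concatenate([x[:i], [xi], x[i + 1:]]))
        return derivative(f_i, x[i], dx=dx)

    x = np.array([1.0, 2.0])
    gradient = np.array([partial_derivative(f, x, i) for i in range(x.size)])
    print(gradient)                        # ~ [3., 4.]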

