Problem with scipy.optimize.fmin_slsqp when using very large or very small numbers

Has anyone encountered problems with fmin_slsqp (or something else in scipy.optimize) only when using very large or very small numbers?

I am working on some Python code that takes a grayscale image and a mask, builds a histogram, and then fits that histogram with several Gaussians. To develop the code I used a small sample image, and after some work it worked brilliantly. However, when I normalize the histogram first, producing bin values < 1, or when I histogram huge images, producing bin values in the hundreds of thousands, fmin_slsqp() starts failing sporadically. It finishes after only ~5 iterations, usually just returning a slightly modified version of the initial guess I gave it, and reports exit mode 8, which means "Positive directional derivative for linesearch." If I check the magnitude of the bin counts at the start and scale them into the neighborhood of ~100-1000, fmin_slsqp() works as usual; I just undo the scaling before returning the results. I guess I could leave it that way, but it feels like a hack.

I looked around and found people talking about the epsilon value, which is basically the step size used to approximate derivatives, but tweaking it did not help. Beyond that I haven't found anything useful yet. Any ideas would be greatly appreciated. Thanks in advance.
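For reference, epsilon can be passed straight to fmin_slsqp; a toy example (the one-dimensional objective here is hypothetical):

```python
from scipy.optimize import fmin_slsqp

# Toy objective (hypothetical); epsilon is the finite-difference step that
# fmin_slsqp uses to approximate derivatives when no gradient is supplied.
def f(p):
    return (p[0] - 2.0) ** 2

x_opt = fmin_slsqp(f, [0.0], epsilon=1.0e-4, iprint=0)
```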

James

3 answers

I have run into similar problems with optimize.leastsq. The data I deal with are often very small, on the order of 1e-18, and I noticed that leastsq does not converge to the best-fit parameters in those cases. Only when I scale the data into a more ordinary range (hundreds or thousands, say) does leastsq converge to something reasonable.

I have also tried playing with the tolerance parameters, without much success...

Could this be something deeper in scipy.optimize? I never fully tracked it down... it would be great to hear what ends up working for the OP.
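A minimal sketch of that rescaling approach for leastsq, assuming a hypothetical linear fit with y-values near 1e-18:

```python
import numpy as np
from scipy.optimize import leastsq

# Hypothetical linear fit with tiny y-values (~1e-18), the regime described above.
def residuals(p, x, y):
    m, b = p
    return y - (m * x + b)

x = np.linspace(0.0, 1.0, 20)
y_tiny = 3e-18 * x + 1e-18

# Rescale into an ordinary range, fit, then undo the scaling.
s = 1e18
p_scaled, ier = leastsq(residuals, [1.0, 0.0], args=(x, y_tiny * s))
m, b = p_scaled / s
```

Because the model is linear in the parameters, dividing the fitted parameters by the same factor used on the data recovers the original-scale slope and intercept exactly.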


( "x0" ), ? , , . , .


I ran into this problem as well and solved it in my own project, though I am not sure it is a general solution.

The reason is that scipy.optimize.fmin_slsqp approximates the gradient by finite differences when the jac argument is False or left at its default. The approximated gradient inherits the scale of the objective values (huge or tiny), and that scale throws off the line search. That is why it exits with Positive directional derivative for linesearch.

You can try implementing the closed form of the Jacobian of the objective function and passing it via the jac argument. More importantly, you should rescale the values of the Jacobian (for example, normalize them) so that they do not throw off the line search.

Best.
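A hedged sketch of supplying a closed-form gradient: with scipy.optimize.minimize(method='SLSQP') it goes in via jac (the older fmin_slsqp wrapper takes it as fprime); the quadratic objective is a hypothetical stand-in:

```python
import numpy as np
from scipy.optimize import minimize

target = np.array([3.0, -1.0])

def f(p):
    return float(np.sum((p - target) ** 2))

def jac_f(p):
    return 2.0 * (p - target)     # closed-form gradient of f

res = minimize(f, np.zeros(2), jac=jac_f, method='SLSQP')
```

With an analytic gradient the line search no longer depends on finite-difference estimates, which is where the scale sensitivity enters in the first place.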

