How to make scipy.optimize functions take larger steps?

I have a compare_images(k, a, b) function that compares two 2d arrays a and b

Inside the function, I apply a gaussian_filter with sigma=k to a. My idea is to estimate how much I should smooth image a so that it looks like image b.
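In outline the function does something like this (the exact difference metric is not important for the question; a sum of squared differences stands in for it here):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def compare_images(k, a, b):
        # k may arrive as a length-1 array from the optimizer; use it as a scalar sigma
        sigma = float(np.asarray(k).ravel()[0])
        a_smooth = gaussian_filter(a, sigma=sigma)
        # compare the smoothed a with b
        return np.sum((a_smooth - b) ** 2)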

The problem is that my compare_images function only returns different values if the change in k is greater than 0.5, and if I do fmin(compare_images, init_guess, (a, b)), it usually stays stuck at the value of init_guess.

I believe that fmin (and minimize) tends to start with very small steps, which in my case will produce the exact same return value for compare_images, so the method decides it has already found the minimum. It only tries a couple of times before giving up.

Is there a way to force fmin, or any other minimization function from scipy, to take larger steps? Or is there a method better suited to my problem?

EDIT: I found a workaround. First, as recommended, I used xtol=0.5 or higher as an argument to fmin. Even then, I still had problems, and a few times fmin returned init_guess. I then created a simple loop: whenever fmin returned init_guess, I generated another random init_guess and tried again.
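Roughly, the retry loop looks like this (the search range, tolerance and number of retries below are placeholders, not my exact values):

    import numpy as np
    from scipy.optimize import fmin

    def fit_sigma(a, b, xtol=0.5, max_tries=20, sigma_range=(0.5, 20.0)):
        # Keep restarting fmin from a new random guess until the result
        # actually moves away from the initial guess.
        for _ in range(max_tries):
            init_guess = np.random.uniform(*sigma_range)
            result = fmin(compare_images, init_guess, args=(a, b),
                          xtol=xtol, disp=False)
            if not np.isclose(result[0], init_guess):
                return result[0]
        return result[0]  # last attempt, even if it never moved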

This is pretty slow of course, but now I got it to run. It will take 20 hours or so to run it for all my data, but I will not need to do it again.

In any case, to better explain the problem to those who are still interested in finding the best solution:

  • I have 2 images, a and b, containing some scientific data.
  • a looks like a set of points with varying values (it is a matrix in which each meaningful point represents an event and its intensity).
  • b looks like a smoothed heat map (it is the observed density of occurrences).
  • b looks like what you would get by applying a Gaussian filter to a, plus a small amount of semi-random noise.
  • We approximate b by applying a Gaussian filter with a constant sigma to a. This sigma was chosen visually, but it only works for a certain class of images.
  • I want to get the optimal sigma for each image, so that later I can look for relationships between sigma and the class of events shown in each image.

Anyway, thanks for the help!

+6
2 answers

Quick check: did you perhaps mean fmin(compare_images, init_guess, args=(a, b))?

If gaussian_filter behaves the way you say, your objective function is piecewise constant, which means that optimizers relying on derivatives (i.e. most of them) will not work. You can try a global optimizer like anneal, or a brute-force search over a reasonable range of k.
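For example, a plain grid scan over k (which is what scipy.optimize.brute automates for you) could look like this, using your compare_images and a made-up search range:

    import numpy as np

    def best_sigma_brute(a, b, k_min=0.5, k_max=20.0, step=0.25):
        # Evaluate compare_images on a grid of candidate sigmas and keep the best one.
        k_grid = np.arange(k_min, k_max, step)
        scores = [compare_images(k, a, b) for k in k_grid]
        return k_grid[np.argmin(scores)]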

However, as you describe the problem, in general there will only be a clear global minimum of compare_images if b is a smoothed version of a. Your approach makes sense if you want to determine the amount of smoothing of a that makes the two images most similar.

If the question is "how similar are the images", then I think a pixel-by-pixel comparison (maybe with a bit of smoothing) is the way to go. Depending on what kind of images we are talking about, you may need to align the images first (e.g. to compare photographs). Please clarify :-)

Edit: Another idea that might help: rewrite compare_images so that it computes two smoothed versions of a, one with sigma = floor(k) and one with sigma = ceil(k) (i.e. k rounded down and up to the nearest integer). Then compute a_smooth = a_floor*(1-kfrac) + a_ceil*kfrac, with kfrac being the fractional part of k. That way the comparison function becomes continuous with respect to k.
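A rough sketch of that idea, again with a sum of squared differences standing in for the real comparison:

    import math
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from scipy.optimize import fmin

    def compare_images_continuous(k, a, b):
        # Unwrap k in case the optimizer passes it as a length-1 array.
        k = float(np.asarray(k).ravel()[0])
        k_lo, k_hi = math.floor(k), math.ceil(k)
        kfrac = k - k_lo
        # sigma=0 means "no smoothing", so fall back to a itself in that case.
        a_floor = gaussian_filter(a, sigma=k_lo) if k_lo > 0 else a
        a_ceil = gaussian_filter(a, sigma=k_hi)
        # Blend the two integer-sigma results so the objective varies
        # continuously as k moves between them.
        a_smooth = a_floor * (1 - kfrac) + a_ceil * kfrac
        return np.sum((a_smooth - b) ** 2)

    # e.g. best_k = fmin(compare_images_continuous, 5.0, args=(a, b))[0]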

Good luck

+3

basinhopping may do a little better, as it has a high probability of continuing anyway when it gets stuck on a plateau.

I found that for this example function it works reasonably well at low temperatures:

    >>> import scipy.optimize as opt
    >>> opt.basinhopping(lambda xy: int(0.1*xy[0]**2 + 0.1*xy[1]**2), (5, -5), T=.1)
        nfev: 409
         fun: 0
           x: array([ 1.73267813, -2.54527514])
     message: ['requested number of basinhopping iterations completed successfully']
        njev: 102
         nit: 100
+1
