Scipy differential evolution

I am trying to build a fitting procedure for my purposes and want to use SciPy's differential evolution algorithm to get a rough estimate of the initial values, which will then be refined with the LM algorithm for a better fit. The function I want to minimize with DE is the least-squares difference between an analytically determined non-linear function and some experimental values. The point I'm stuck at is the design of that function. As stated in the SciPy reference, the objective "must be in the form f(x, *args), where x is the argument in the form of a 1-D array and args is a tuple of any additional fixed parameters needed to completely specify the function".
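To check my reading of that requirement, the smallest call I can picture looks something like this (a toy example with a made-up linear model, not my real code):

import numpy as np
from scipy.optimize import differential_evolution

def f(p, *args):
    # p is the 1-D array of parameters being optimized
    # whatever is passed through the `args` keyword arrives here as extra positional arguments
    xdata, ydata = args
    return np.sum((p[0] * xdata - ydata) ** 2)  # some scalar cost

toy = differential_evolution(f, bounds=[(0, 2)], args=(np.arange(5), np.arange(5.0)))
print(toy.x)  # should end up near 1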

Here is an ugly code example that I wrote for illustrative purposes only:

def func(x, *args):
    """args[0] = x
       args[1] = y"""
    result = 0
    for i in range(len(args[0][0])):
        result += (x[0]*(args[0][0][i]**2) + x[1]*(args[0][0][i]) + x[2] - args[0][1][i])**2
    return result**0.5

if __name__ == '__main__':
    bounds = [(1.5, 0.5), (-0.3, 0.3), (0.1, -0.1)]
    x = [0, 1, 2, 3, 4]
    y = [i**2 for i in x]
    args = (x, y)
    result = differential_evolution(func, bounds, args=args)
    print(func(bounds, args))

I wanted to pass the raw data into the function as a tuple, but apparently that is not how it is meant to work, because the interpreter complains about this function. The problem is probably easy to solve, but I'm quite stuck, so any advice would be greatly appreciated.

1 answer

Here is a fairly direct solution that shows the idea. The code isn't very pythonic, but for simplicity I think it is good enough. As an example, say we want to fit an equation of the form y = a*x^2 + b*x + c to data generated from y = x^2. Obviously, the parameter a should come out as 1, and b, c should be 0. Since the differential evolution algorithm finds the minimum of a function, we minimize the root of the summed squared deviations (again, for simplicity) between the general model y = a*x^2 + b*x + c evaluated with the trial parameters and the "experimental" data. So, to the code:

from scipy.optimize import differential_evolution

def func(parameters, *data):
    # we have 3 parameters, which will be passed as `parameters`, and
    # the "experimental" x, y, which will be passed as `data`
    a, b, c = parameters
    x, y = data
    result = 0
    for i in range(len(x)):
        result += (a*x[i]**2 + b*x[i] + c - y[i])**2
    return result**0.5

if __name__ == '__main__':
    # search ranges for the parameters, as (min, max) pairs
    #          a           b            c
    bounds = [(0.5, 1.5), (-0.3, 0.3), (-0.1, 0.1)]
    # producing "experimental" data
    x = [i for i in range(6)]
    y = [xi**2 for xi in x]
    # packing "experimental" data into args
    args = (x, y)
    result = differential_evolution(func, bounds, args=args)
    print(result.x)
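Since the question mentions feeding the DE estimate into an LM fit afterwards, one possible hand-off (a sketch, not the only way to do it) is to pass result.x as the starting point to scipy.optimize.least_squares with method='lm'; note that least_squares expects a function returning the vector of residuals rather than a scalar cost:

from scipy.optimize import differential_evolution, least_squares

def de_cost(parameters, *data):
    # scalar cost for DE: root of the summed squared residuals
    a, b, c = parameters
    x, y = data
    return sum((a*xi**2 + b*xi + c - yi)**2 for xi, yi in zip(x, y))**0.5

def residuals(parameters, x, y):
    # residual vector for the Levenberg-Marquardt refinement
    a, b, c = parameters
    return [a*xi**2 + b*xi + c - yi for xi, yi in zip(x, y)]

if __name__ == '__main__':
    x = list(range(6))
    y = [xi**2 for xi in x]
    bounds = [(0.5, 1.5), (-0.3, 0.3), (-0.1, 0.1)]

    de_result = differential_evolution(de_cost, bounds, args=(x, y))
    lm_result = least_squares(residuals, de_result.x, args=(x, y), method='lm')
    print(lm_result.x)  # should come out close to [1, 0, 0]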
