Optimization with Python (scipy.optimize)

I am trying to maximize the following function using Python's scipy.optimize. However, after many attempts, I cannot get it to work. The function and my code are below. Thanks for the help!

Problem

Maximize [sum (x_i / y_i)**gamma]**(1/gamma) subject to the constraint sum x_i = 1; x_i is in the interval (0,1). 

x is a vector of selection variables; y is a vector of parameters; gamma is a parameter. The elements of x must sum to one, and each x_i should lie in the interval (0,1).

The code

    import math
    import numpy as np
    from scipy.optimize import minimize

    def objective_function(x, y):
        sum_contributions = 0
        gamma = 0.2
        for count in range(len(x)):
            sum_contributions += (x[count] / y[count]) ** gamma
        value = math.pow(sum_contributions, 1 / gamma)
        return -value  # negate: minimizing -f maximizes f

    cons = ({'type': 'eq', 'fun': lambda x: np.array([sum(x) - 1])})
    y = [0.5, 0.3, 0.2]
    initial_x = [0.2, 0.3, 0.5]
    opt = minimize(objective_function, initial_x, args=(y,), method='SLSQP',
                   constraints=cons, bounds=[(0, 1)] * len(initial_x))
optimization python scipy mathematical-optimization
2 answers

Sometimes a numerical optimizer fails for one reason or another. If we parameterize the problem slightly differently, it will work (and may even converge faster).

For example, for the bounds (0,1) we can use a transformation function, so that values in (-inf, +inf) are mapped into (0,1) after the transformation.
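As a sketch of this idea (the sigmoid here is an illustrative choice; the actual code further down uses x**2/(1+x**2) instead):

```python
import numpy as np

def to_unit_interval(u):
    # Sigmoid: maps any real number in (-inf, +inf) into (0, 1)
    return 1.0 / (1.0 + np.exp(-u))

u = np.array([-5.0, 0.0, 5.0])   # unconstrained optimizer variables
x = to_unit_interval(u)          # always feasible for the bounds (0, 1)
```

With such a transform the optimizer can search over all of R without ever producing an infeasible point.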

We can do a similar trick with the equality constraint. For example, we can reduce the dimension from 3 to 2, since the last element of x is pinned to 1 minus the sum of the others.
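A minimal sketch of this dimension reduction (the helper name is hypothetical):

```python
import numpy as np

def expand(x_free):
    # Rebuild the full 3-vector from 2 free components; the last entry
    # is forced by the equality constraint sum(x) == 1.
    return np.append(x_free, 1.0 - np.sum(x_free))

x_full = expand(np.array([0.2, 0.3]))
# sum(x_full) == 1 by construction; note the reconstructed last entry
# is not guaranteed to stay positive, which is the caveat raised in
# the second answer below.
```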

If it still does not work, we can switch to an optimizer that does not require derivative information, such as Nelder-Mead.

And there is also the Lagrange multiplier approach.

    In [111]: def trans_x(x):
                  x1 = x**2 / (1 + x**2)
                  z = np.hstack((x1, 1 - sum(x1)))
                  return z

              def F(x, y, gamma=0.2):
                  z = trans_x(x)
                  return -(((z / y)**gamma).sum())**(1. / gamma)

    In [112]: opt = minimize(F, np.array([0., 1.]), args=(np.array(y),),
                             method='Nelder-Mead')
              opt
    Out[112]:  status: 0
                 nfev: 96
              success: True
                  fun: -265.27701747828007
                    x: array([ 0.6463264,  0.7094782])
              message: 'Optimization terminated successfully.'
                  nit: 52

Result:

    In [113]: trans_x(opt.x)
    Out[113]: array([ 0.29465097,  0.33482303,  0.37052601])

And we can visualize it using:

    In [114]: x1 = np.linspace(0, 1)
              y1 = np.linspace(0, 1)
              X, Y = np.meshgrid(x1, y1)
              Z = np.array([F(item, y) for item in
                            np.vstack((X.ravel(), Y.ravel())).T]
                           ).reshape((len(x1), -1), order='F')
              Z = np.fliplr(Z)
              Z = np.flipud(Z)
              plt.contourf(X, Y, Z, 50)
              plt.colorbar()

[Contour plot of F over the transformed (x1, x2) plane]


Even though this thread is a bit outdated, I wanted to add an alternative solution that might be useful to others who stumble upon this question in the future.

It turns out your problem is solvable analytically. You can start by writing down the Lagrangian of the equality-constrained optimization problem (since u -> u^(1/gamma) is monotone for gamma > 0, it suffices to maximize the inner sum):

 L = \sum_i (x_i/y_i)^\gamma - \lambda (\sum x_i - 1) 

The optimal solution can be found by setting the first derivative of this Lagrangian to zero:

    0 = \partial L / \partial x_i = \gamma x_i^{\gamma-1} / y_i^\gamma - \lambda
      => x_i \propto y_i^{\gamma/(\gamma - 1)}

Using this information, the optimization problem can be solved simply and efficiently (note that y must be a NumPy array here for the elementwise power to work):

    In [4]: def analytical(y, gamma=0.2):
                x = y**(gamma / (gamma - 1.0))
                x /= np.sum(x)
                return x

            xanalytical = analytical(y)
            xanalytical, objective_function(xanalytical, y)
    Out[4]: (array([ 0.29466774,  0.33480719,  0.37052507]), -265.27701765929692)

CT Zhu's solution is elegant, but it may violate the positivity constraint on the third coordinate. For gamma = 0.2 this does not seem to be a problem in practice, but for a different gamma you easily run into trouble:

    In [5]: y = [0.2, 0.1, 0.8]
            opt = minimize(F, np.array([0., 1.]), args=(np.array(y), 2.0),
                           method='Nelder-Mead')
            trans_x(opt.x), opt.fun
    Out[5]: (array([ 1.,  1., -1.]), -11.249999999999998)

For other optimization problems with the same probability simplex constraints as your problem, but for which there is no analytical solution, it might be worth looking into projected gradient methods or similar. These methods exploit the fact that there is a fast algorithm for projecting an arbitrary point onto this set, see https://en.wikipedia.org/wiki/Simplex#Projection_onto_the_standard_simplex .
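For illustration, a sketch of that sort-based projection (the function name is hypothetical; see the linked Wikipedia section for the derivation):

```python
import numpy as np

def project_to_simplex(v):
    # Euclidean projection of v onto {x : x_i >= 0, sum(x) == 1},
    # using the standard O(n log n) sort-based algorithm.
    u = np.sort(v)[::-1]                 # sort in decreasing order
    css = np.cumsum(u)
    ks = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / ks > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + theta, 0.0)

p = project_to_simplex(np.array([0.9, 0.8, -0.1]))
# p = [0.55, 0.45, 0.0]: nonnegative and sums to one
```

A projected gradient method alternates an unconstrained gradient step with this projection, so every iterate stays feasible.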

(To see the full code and a nicer rendering of the equations, have a look at the Jupyter notebook at http://nbviewer.jupyter.org/github/andim/pysnippets/blob/master/optimization-simplex-constraints.ipynb )

