Nonlinear regression with e^(-x) using scipy, python, numpy

The following code gives me a flat line for the line of best fit, instead of a curve along the e^(-x) model that follows the data. Can someone show me how to fix the code below so that it fits my data?

    import numpy as np
    import matplotlib.pyplot as plt
    import scipy.optimize

    def _eNegX_(p, x):
        x0, y0, c, k = p
        y = (c * np.exp(-k * (x - x0))) + y0
        return y

    def _eNegX_residuals(p, x, y):
        return y - _eNegX_(p, x)

    def Get_eNegX_Coefficients(x, y):
        print('x is: ', x)
        print('y is: ', y)

        # Calculate p_guess for the vectors x,y. Note that p_guess is the
        # starting estimate for the minimization.
        p_guess = (np.median(x), np.min(y), np.max(y), .01)

        # Call leastsq(), which calls the residuals function with an initial
        # guess for the parameters and with the x and y vectors. The residuals
        # function itself calls _eNegX_. This returns the parameters p that
        # minimize the least squares error of the _eNegX_ function with respect
        # to the original x and y coordinate vectors that are sent to it.
        p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
            _eNegX_residuals, p_guess, args=(x, y), full_output=1)

        # Unpack the optimal values for each element of p returned by leastsq().
        x0, y0, c, k = p
        print('''Reference data:
        x0 = {x0}
        y0 = {y0}
        c = {c}
        k = {k}
        '''.format(x0=x0, y0=y0, c=c, k=k))

        print('x.min() is: ', x.min())
        print('x.max() is: ', x.max())

        # Create a numpy array of x-values for plotting the fitted curve
        numPoints = int(np.floor((x.max() - x.min()) * 100))
        xp = np.linspace(x.min(), x.max(), numPoints)
        print('numPoints is: ', numPoints)
        print('xp is: ', xp)
        print('p is: ', p)
        pxp = _eNegX_(p, xp)
        print('pxp is: ', pxp)

        # Plot the results
        plt.plot(x, y, '>', xp, pxp, 'g-')
        plt.xlabel('BPM%Rest')
        plt.ylabel('LVET/BPM', rotation='vertical')
        plt.xlim(0, 3)
        plt.ylim(0, 4)
        plt.grid(True)
        plt.show()
        return p

    # Declare raw data for use in creating the regression equation
    x = np.array([1, 1.425, 1.736, 2.178, 2.518], dtype='float')
    y = np.array([3.489, 2.256, 1.640, 1.043, 0.853], dtype='float')
    p = Get_eNegX_Coefficients(x, y)
1 answer

This seems to be a problem with your initial guesses; something like (1, 1, 1, 1) works fine.

[plot: the fit from p_guess = (1, 1, 1, 1), which follows the data closely]
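For concreteness, here is a minimal, self-contained sketch of that fix, with the model, residuals function, and data copied from the question; the only change is the starting estimate passed to leastsq():

    import numpy as np
    import scipy.optimize

    def _eNegX_(p, x):
        x0, y0, c, k = p
        return c * np.exp(-k * (x - x0)) + y0

    def _eNegX_residuals(p, x, y):
        return y - _eNegX_(p, x)

    x = np.array([1, 1.425, 1.736, 2.178, 2.518])
    y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])

    # The only change from the question: a neutral starting estimate
    # instead of the data-derived one.
    p_guess = (1, 1, 1, 1)
    p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
        _eNegX_residuals, p_guess, args=(x, y), full_output=1)
    print('fitted (x0, y0, c, k):', p)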
You have

    p_guess = (np.median(x), np.min(y), np.max(y), .01)

for the function

    def _eNegX_(p, x):
        x0, y0, c, k = p
        y = (c * np.exp(-k * (x - x0))) + y0
        return y

so your starting estimate is test_data_max * e^(-.01 * (x - test_data_median)) + test_data_min.
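To see what that starting curve looks like, evaluate it at the question's x values; with k = .01 the exponential term barely changes over the data range, so the curve is nearly flat (a quick check using the arrays from the question):

    import numpy as np

    x = np.array([1, 1.425, 1.736, 2.178, 2.518])
    y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])

    x0, y0, c, k = np.median(x), np.min(y), np.max(y), .01
    start = c * np.exp(-k * (x - x0)) + y0
    print(start)  # roughly [4.37 4.35 4.34 4.33 4.31] -- nearly constant,
                  # and well above all of the y data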

I am not very versed in the art of choosing good initial parameters, but I can say a few things. leastsq is finding a local minimum here; the key in choosing these values is to find the right mountain to climb, not to try to cut down on the work the minimization algorithm has to do. Your initial guess is this (green):

    (1.736, 0.853, 3.489, 0.01)

which results in your flat line (blue):

    (-59.20295956, 1.8562, 1.03477144, 0.69483784)

[plot: the starting curve (green) and the resulting flat fit (blue) against the data]

From that starting point, bigger gains are made by adjusting the height of the line than by increasing k, which is why the fit settles into a flat line. If you know you'll be fitting this kind of data, use a larger initial k. If you don't, you could try to find a decent k value by sampling the data, or by working back from the slope between the averages of the first and second halves, though I'm not sure of the best way to go about that; one possible reading is sketched below.
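A rough sketch of that half-and-half idea, as a heuristic only; subtracting a crude baseline before taking logs is my own assumption, not something specified above:

    import numpy as np

    def rough_k_guess(x, y):
        # Heuristic seed for k: slope of log(y - baseline) between the
        # averages of the first and second halves of the data. Assumes y
        # decays toward roughly min(y); meant only to seed leastsq.
        n = len(x) // 2
        baseline = 0.9 * y.min()  # crude floor so the log stays defined
        x1, x2 = x[:n].mean(), x[n:].mean()
        y1, y2 = y[:n].mean(), y[n:].mean()
        return -(np.log(y2 - baseline) - np.log(y1 - baseline)) / (x2 - x1)

    x = np.array([1, 1.425, 1.736, 2.178, 2.518])
    y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])
    print(rough_k_guess(x, y))  # about 1.75 for this data -- a far better
                                # starting k than .01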

Edit: You could also start with several guesses, run the minimization from each, and take the line with the smallest residuals.
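A minimal sketch of that multi-start approach, again reusing the question's model and data; the particular list of guesses here is made up for illustration:

    import numpy as np
    import scipy.optimize

    def _eNegX_(p, x):
        x0, y0, c, k = p
        return c * np.exp(-k * (x - x0)) + y0

    def _eNegX_residuals(p, x, y):
        return y - _eNegX_(p, x)

    x = np.array([1, 1.425, 1.736, 2.178, 2.518])
    y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])

    # Try several starting points and keep the fit with the smallest
    # sum of squared residuals.
    guesses = [(1, 1, 1, 1),
               (np.median(x), np.min(y), np.max(y), .01),
               (0, 0, np.max(y), 1)]
    best_p, best_err = None, np.inf
    for p_guess in guesses:
        p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
            _eNegX_residuals, p_guess, args=(x, y), full_output=1)
        err = np.sum(_eNegX_residuals(p, x, y) ** 2)
        if ier in (1, 2, 3, 4) and err < best_err:  # ier 1-4 means success
            best_p, best_err = p, err
    print('best parameters:', best_p, 'residual:', best_err)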
