This seems to be a problem with your initial guesses; something like (1, 1, 1, 1) works fine: 
You have

p_guess = (np.median(x), np.min(y), np.max(y), .01)

as the initial guess for the function

def _eNegX_(p, x):
    x0, y0, c, k = p
    y = (c * np.exp(-k * (x - x0))) + y0
    return y

so your starting curve is test_data_max * e^(-.01 * (x - test_data_median)) + test_data_min.
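For reference, here is a minimal, self-contained sketch of the kind of call I mean. The data below is synthetic (I don't have your test_data arrays); only _eNegX_ and the (1, 1, 1, 1) guess come from the discussion above:

import numpy as np
from scipy.optimize import leastsq

def _eNegX_(p, x):
    x0, y0, c, k = p
    return (c * np.exp(-k * (x - x0))) + y0

def residuals(p, x, y):
    return y - _eNegX_(p, x)

# Synthetic data shaped like a noisy exponential approach to an asymptote.
x = np.linspace(0, 4, 50)
y = _eNegX_((0.0, 1.0, 3.0, 1.5), x) + 0.05 * np.random.randn(x.size)

p_guess = (1, 1, 1, 1)                      # the guess suggested above
p_fit, ier = leastsq(residuals, p_guess, args=(x, y))
print(p_fit)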
I am not very well versed in the art of choosing good initial parameters, but I can say a few things. leastsq is finding a local minimum here; the key in choosing these values is to find the right mountain to climb, not to try to cut down on the work the minimization algorithm has to do. Your initial guess (the green curve in the plot) was: (1.736, 0.85299999999999998, 3.4889999999999999, 0.01)
which results in your flat blue line, with fitted parameters: (-59.20295956, 1.8562, 1.03477144, 0.69483784)
The minimizer could make bigger gains by adjusting the height of the line than by increasing k, so it settled for the flat line. If you know you will be fitting this kind of data, use a larger k in your guess. If you don't, I suppose you could try to come up with a decent k from the test data, for example from the slope between the average of the first half of the points and the average of the second half, but I wouldn't know exactly how to go about it.
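If it helps, here is one possible reading of that half-slope idea. This is my own rough interpretation (the rough_k helper and the scaling by the vertical span are made up, intended only as an order-of-magnitude starting value, not a tested recipe):

import numpy as np

def rough_k(x, y):
    # Sort by x, split at the midpoint, and take each half's mean point.
    order = np.argsort(x)
    x, y = x[order], y[order]
    half = len(x) // 2
    x1, y1 = x[:half].mean(), y[:half].mean()
    x2, y2 = x[half:].mean(), y[half:].mean()
    slope = (y2 - y1) / (x2 - x1)
    # For y = c*exp(-k*(x - x0)) + y0 the local slope is -k*(y - y0),
    # so dividing by the vertical span gives a crude magnitude for k.
    span = max(y.max() - y.min(), 1e-12)    # guard against flat data
    return abs(slope) / span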
Edit: you could also start with a few different guesses, run the minimization from each of them, and take the line with the smallest residuals.
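A sketch of what that multi-start idea could look like, assuming x and y are your data arrays; the guess list here is invented and would need tuning for your own data:

import numpy as np
from scipy.optimize import leastsq

def residuals(p, x, y):
    x0, y0, c, k = p
    return y - ((c * np.exp(-k * (x - x0))) + y0)

def best_fit(x, y, guesses):
    # Try every guess and keep the parameters with the smallest sum of squared residuals.
    best_p, best_cost = None, np.inf
    for g in guesses:
        p, ier = leastsq(residuals, g, args=(x, y))
        cost = np.sum(residuals(p, x, y) ** 2)
        if ier in (1, 2, 3, 4) and cost < best_cost:    # ier 1-4 means leastsq converged
            best_p, best_cost = p, cost
    return best_p

# Invented guess list spanning a few line heights and k values.
guesses = [(1, 1, 1, 1), (1, 0, 3, 0.1), (1, 0, 3, 1.0), (2, 1, 3, 5.0)]
p_best = best_fit(x, y, guesses)            # x, y: your data arrays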