I must be misunderstanding something. Your published data does not look like
f(x,a,b) = np.exp(a*np.log(x)+np.log(b))
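Note that this model is algebraically just a power law: exp(a*log(x) + log(b)) = b * x**a, so the data would have to fall on a straight line in a log-log plot. A minimal check of that identity (the sample values here are arbitrary):

import numpy as np

def func(x, a, b):
    return np.exp(a * np.log(x) + np.log(b))

# the model is identical to the power law b * x**a
x = np.linspace(0.5, 10.0, 50)
assert np.allclose(func(x, 0.5, 2.0), 2.0 * x**0.5)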
The red line is the result of scipy.optimize.curve_fit, the green line is the result of scidavis.
I assume that neither algorithm converges to a good fit, so it is not surprising that the results do not match.
I can't explain how scidavis finds its parameters, but, by the definitions as I understand them, scipy finds parameters with a smaller sum of squared residuals than scidavis:
import numpy as np
import matplotlib.pyplot as plt
import scipy.optimize as optimize

def func(x, a, b):
    # the model: exp(a*log(x) + log(b)) == b * x**a
    return np.exp(a * np.log(x) + np.log(b))

def sum_square(residuals):
    return (residuals**2).sum()

def residuals(p, x, y, sigma):
    # weighted residuals, as used by curve_fit when sigma is given
    return 1.0 / sigma * (y - func(x, *p))

# columns of test.dat: x, y, y-error
data = np.loadtxt('test.dat').reshape((-1, 3))
x, y, yerr = np.rollaxis(data, axis=1)
sigma = yerr

popt, pcov = optimize.curve_fit(func, x, y, sigma=sigma, maxfev=10000)
print('popt: {p}'.format(p=popt))

scidavis = (0.14154, 7.38213)
print('scidavis: {p}'.format(p=scidavis))

print('''\
sum of squares for scipy: {sp}
sum of squares for scidavis: {d}
'''.format(
    sp=sum_square(residuals(popt, x=x, y=y, sigma=sigma)),
    d=sum_square(residuals(scidavis, x=x, y=y, sigma=sigma))))

plt.plot(x, y, 'bo', x, func(x, *popt), 'r-', x, func(x, *scidavis), 'g-')
plt.errorbar(x, y, yerr)
plt.show()
gives
popt: [ 0.86051258 3.38081125]
scidavis: (0.14154, 7.38213)
sum of squares for scipy: 53249.9915654
sum of squares for scidavis: 239654.84276
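One guess, purely an assumption on my part and not something I can verify from the scidavis source: scidavis may fit the linearized model log(y) = a*log(x) + log(b) by ordinary least squares, which minimizes a different objective than the weighted sum of squares above. A sketch of that cross-check, assuming the same three-column test.dat:

import numpy as np

# Hypothetical cross-check: ordinary least squares on the linearized model
# log(y) = a*log(x) + log(b). If scidavis linearizes before fitting, its
# parameters would minimize this log-space objective, not the weighted
# sum of squares used by curve_fit above.
data = np.loadtxt('test.dat').reshape((-1, 3))
x, y, yerr = np.rollaxis(data, axis=1)

a, log_b = np.polyfit(np.log(x), np.log(y), deg=1)
print('log-space fit: a = {a:.5f}, b = {b:.5f}'.format(a=a, b=np.exp(log_b)))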
