I am trying to fit a power law to noisy data generated from a power law, displayed on a log-log scale:
The fit with scipy curve_fit is the orange line and the red line is the noiseless power law.
Here is the code:
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
def powerFit(x, A, B):
    return A * x**(-B)
x = np.logspace(np.log10(1), np.log10(1e5), 30)
A = 1
B = 2
np.random.seed(1)
y_noise = np.random.normal(1, 0.2, size=x.size)
y = powerFit(x, A, B) * y_noise
plt.plot(x, y, 'o')
plt.xscale('log')
plt.yscale('log')
popt, pcov = curve_fit(powerFit, x, y, p0 = [A, B])
perr = np.sqrt(np.diag(pcov))
residuals = y - powerFit(x, *popt)
ss_res = np.sum(residuals**2)
ss_tot = np.sum((y-np.mean(y))**2)
r_squared = 1 - (ss_res / ss_tot)
plt.plot(x, powerFit(x, *popt),
         label='fit: A*x**(-B)\n A=%.1e +- %.0e\n B=%.1e +- %.0e\n R2 = %.2f'
               % (popt[0], perr[0], popt[1], perr[1], r_squared))
plt.plot(x, powerFit(x, A, B), 'r--', label = 'A = %d, B = %d'%(A,B))
plt.legend()
plt.savefig("fig.png", dpi = 300)
I do not understand what is happening with the fitted power law. Why does it look so wrong on the log-log plot? How could I fix this?
Note: I know you could also fit the power law by plotting log(y) vs log(x). But according to this answer, it seems curve_fit should also manage to do it right directly. So my question is whether it is possible to do a power-law fit on a log-log scale without log-transforming the data. I am interested in avoiding the log-log transformation because it cannot be applied to every fit (consider, for example, a fit to y = A*x**(-B*x)).
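For reference, here is one variant I experimented with, without the log transform: since the noise in my data is multiplicative (proportional to y), I tried passing sigma=y to curve_fit so that residuals are weighted relatively rather than absolutely. I am not sure this is the canonical approach, but it seems to keep the small-x points from dominating the least-squares cost:

```python
import numpy as np
from scipy.optimize import curve_fit

def powerFit(x, A, B):
    return A * x**(-B)

# Same synthetic data as above: power law with multiplicative noise
x = np.logspace(0, 5, 30)
A, B = 1, 2
np.random.seed(1)
y = powerFit(x, A, B) * np.random.normal(1, 0.2, size=x.size)

# Unweighted least squares minimizes absolute residuals, so the few
# points with large y (small x) dominate. Passing sigma=y makes the
# cost relative: sum(((y - model) / y)**2), which spreads the weight
# evenly across the decades of the log-log plot.
popt, pcov = curve_fit(powerFit, x, y, p0=[A, B],
                       sigma=y, absolute_sigma=False)
print(popt)  # should be close to [1, 2]
```

This avoids the log transform entirely, so in principle it should also work for forms like A*x**(-B*x) where taking logs does not linearize the model.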