Fast Kernel Regression in Python?

I am running repeated simulations in the context of 1D non-parametric regression in order to evaluate the performance of some non-parametric methods, one of which is the Nadaraya-Watson kernel estimator. However, I am struggling a bit with KernelReg, which is too slow for my purpose. At first I thought the slowdown came from the adaptive selection of the bandwidth parameter, so, since the regularity of the signal is fixed in my simulations, I tried specifying the bandwidth myself (taking the one that theoretically gives the optimal convergence rate), but this was even slower, and I am not sure why; a sketch of what I tried is below.
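
For reference, this is roughly how I passed a fixed bandwidth (the value of h below is only a placeholder for the theoretical choice, and reg_type='lc' selects the local-constant, i.e. Nadaraya-Watson, estimator rather than the default local linear one):

import numpy as np
from statsmodels.nonparametric.kernel_regression import KernelReg

n = 1000
l = np.arange(n + 1) / n                            ## equispaced design points in [0, 1]
x = np.cos(2 * np.pi * l) + np.random.randn(n + 1)  ## noisy observations of the signal

h = 0.1  ## placeholder; in the simulations this would be the theoretically optimal bandwidth
kde_fixed = KernelReg(x, l, var_type='c', reg_type='lc', bw=[h])  ## user-specified bandwidth
f_NW_fixed = kde_fixed.fit(l)[0]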

Is there a way to make the computation faster, or another package that provides a faster kernel regression estimate?

Here is a simplified example of what I do:

import numpy as np
from numpy import linalg as LA
from statsmodels.nonparametric.kernel_regression import KernelReg

## define the true signal
def f(x):
  return np.cos(x * (2 * np.pi))

L = [250, 500, 750, 1000, 1250, 1500, 1750, 2000]  ## increasing numbers of observations
l_NW = []  ## errors of all simulations (structured as a len(L) by 100 array)
for m in L:
  S_NW = []  ## errors of all simulations for a fixed number of observations
  for j in range(100):  ## repeat 100 times for each number of observations
    x = np.zeros(m + 1)
    l = np.zeros(m + 1)
    for i in range(m + 1):
      x[i] = f(i / m) + np.random.randn()  ## noisy observation at design point i/m
      l[i] = i / m                         ## equispaced design points in [0, 1]
    ## compute the Nadaraya-Watson estimate
    ## (reg_type='lc' is the local-constant, i.e. Nadaraya-Watson, estimator;
    ##  KernelReg's default reg_type='ll' is local linear)
    kde = KernelReg(x, l, var_type='c', reg_type='lc')
    f_NW = kde.fit(l)[0]
    err = LA.norm(f_NW - f(l))  ## estimation error against the true signal on the grid
    S_NW.append(err)
  l_NW.append(S_NW)
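
For comparison, I assume that with reg_type='lc' and a fixed bandwidth KernelReg computes the same thing as a direct Nadaraya-Watson estimate with a Gaussian kernel, which only takes a few lines of NumPy; a vectorized sketch of that fallback (the names and bandwidth value here are just illustrative) would be:

import numpy as np

def nw_estimate(x_obs, t_obs, t_eval, h):
  ## Nadaraya-Watson estimate with a Gaussian kernel and fixed bandwidth h:
  ## weighted average of x_obs with weights K((t_eval - t_obs) / h)
  W = np.exp(-0.5 * ((t_eval[:, None] - t_obs[None, :]) / h) ** 2)
  return (W @ x_obs) / W.sum(axis=1)

## example on one simulated data set
m = 1000
l = np.arange(m + 1) / m
x = np.cos(2 * np.pi * l) + np.random.randn(m + 1)
f_NW = nw_estimate(x, l, l, h=0.05)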