How do I speed up lmfit as much as possible?


I'm using lmfit to fit a skewed gaussian function to a large number of individual datasets (10,000). I get very good results, but the time it takes to fit 10,000 pixels is quite long, so every millisecond I can shave off the fit time would be helpful. This is the code I'm using, where x and y are the data I want to fit. The parameter guesses work really well for me but were produced mostly through trial and error.

import numpy as np
import lmfit as lm
from lmfit import Model
from lmfit.models import GaussianModel, ConstantModel, ExponentialGaussianModel, SkewedGaussianModel
from lmfit import Parameters

def LM_skewedgauss(x, y):
    # Fit a constant + skewed gaussian to one pixel's data and return the best-fit parameters
    supermodel = ConstantModel() + SkewedGaussianModel()

    # Guesses
    a_peak = np.max(y)
    # 16 is a needed constant from the way the data is produced
    t_peak = np.where(y == a_peak)[0][0] * 16
    avg = np.mean(y)
    gamma = 1.5
    sigma = 31

    params = supermodel.make_params(amplitude=a_peak * sigma * np.sqrt(2 * np.pi),
                                    center=t_peak,
                                    sigma=sigma,
                                    gamma=gamma,
                                    c=3)
    result = supermodel.fit(y, params=params, x=x)
    # result.plot()

    bestparam = result.params
    return bestparam

There is 1 answer below


As with Python itself, lmfit is designed to optimize developer (or, probably more often, "scientist") time. With that view in mind, getting a roughly 10x speedup on 10,000 fits is actually pretty simple (and inexpensive): get a machine with 10+ cores (or compute threads) and split the problem into 10 scripts doing 1,000 fits each, as sketched below.
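A minimal sketch of that split, done inside a single script with Python's multiprocessing rather than 10 separate scripts. The synthetic datasets, the fit_one helper, and the pool size of 10 are illustrative assumptions; the model and starting guesses are taken from the question.

import numpy as np
from multiprocessing import Pool

from lmfit.models import ConstantModel, SkewedGaussianModel


def fit_one(dataset):
    # Fit one (x, y) pair and return the best-fit parameters.
    x, y = dataset
    model = ConstantModel() + SkewedGaussianModel()
    sigma, gamma = 31, 1.5
    a_peak = np.max(y)
    t_peak = np.where(y == a_peak)[0][0] * 16  # same guesses as in the question
    params = model.make_params(amplitude=a_peak * sigma * np.sqrt(2 * np.pi),
                               center=t_peak, sigma=sigma, gamma=gamma, c=3)
    result = model.fit(y, params=params, x=x)
    return result.params


if __name__ == "__main__":
    # Synthetic stand-in for the real 10,000 pixels -- replace with your data.
    rng = np.random.default_rng(0)
    x = np.arange(200) * 16.0
    datasets = [(x, 3 + np.exp(-0.5 * ((x - 1600) / 120) ** 2)
                 + 0.01 * rng.normal(size=x.size)) for _ in range(100)]

    # 10 worker processes roughly match the "10 scripts of 1,000 fits each" idea.
    with Pool(processes=10) as pool:
        all_params = pool.map(fit_one, datasets)

The returned lmfit Parameters objects can be pickled, so they come back from the worker processes without extra work; if each individual fit is very fast, passing a larger chunksize to pool.map cuts down the inter-process overhead.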