How can I loop in a symbolic regression training?


I'm trying to obtain an equation that describes the convergence of a physics quantity. I have data for many series that all represent the same underlying relation, just with different parameters, and I would like to find a single expression that captures this common behavior. I've tried looping over the series and running a gplearn symbolic-regression fit on each one, but, as expected, each iteration returns a different equation. How can I loop over all these series and find one equation that fits all of them reasonably well? Or how can I "merge" the per-series output equations to pick the best one?

import numpy as np
from gplearn.genetic import SymbolicRegressor
from sympy import sympify, sin, cos

# Each series occupies two adjacent columns: X in column 2*i, y in column 2*i + 1.
expressions = []
for i in range(min(300, data_frame.shape[1] // 2)):  # first 600 columns -> 300 series
    new_df = data_frame.iloc[:, [2 * i, 2 * i + 1]].dropna()
    X = new_df.iloc[:, 0].values.reshape(-1, 1)
    y = new_df.iloc[:, 1].values.ravel()  # gplearn expects a 1-D target

    function_set = ['add', 'sub', 'mul', 'div', 'cos', 'sin', 'neg', 'inv']
    est_gp = SymbolicRegressor(population_size=5000, function_set=function_set,
                               generations=40, stopping_criteria=0.01,
                               p_crossover=0.7, p_subtree_mutation=0.1,
                               p_hoist_mutation=0.05, p_point_mutation=0.1,
                               max_samples=0.9, verbose=1,
                               parsimony_coefficient=0.01, random_state=0)
    est_gp.fit(X, y)

    # Map gplearn's function names to SymPy so the program string parses.
    converter = {
        'sub': lambda x, y: x - y,
        'div': lambda x, y: x / y,
        'mul': lambda x, y: x * y,
        'add': lambda x, y: x + y,
        'neg': lambda x: -x,
        'pow': lambda x, y: x ** y,
        'sin': lambda x: sin(x),
        'cos': lambda x: cos(x),
        'inv': lambda x: 1 / x,
        'sqrt': lambda x: x ** 0.5,
        'pow3': lambda x: x ** 3,
    }
    next_e = sympify(str(est_gp._program), locals=converter)
    expressions.append(next_e)
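One direction I've been considering for getting a single equation: since every series is supposed to obey the same law, pool all the (X, y) column pairs into one long dataset and run one symbolic-regression fit over it, instead of one fit per series. A minimal sketch of the pooling step (the `stack_series` helper and the synthetic series below are my own illustration, not gplearn API; in practice the pairs would come from the DataFrame columns as above):

```python
import numpy as np

def stack_series(pairs):
    """Pool (X, y) column pairs into one dataset for a single fit.

    pairs: iterable of (X, y) 1-D array-likes.
    Returns a 2-D X of shape (n_total, 1) and a 1-D y of length n_total.
    """
    X_all = np.concatenate([np.asarray(X, dtype=float).reshape(-1, 1)
                            for X, _ in pairs])
    y_all = np.concatenate([np.asarray(y, dtype=float).ravel()
                            for _, y in pairs])
    return X_all, y_all

# Illustrative stand-in for the real column pairs: three noisy samples of
# the same underlying relation y = sin(x).
rng = np.random.default_rng(0)
pairs = [(x, np.sin(x) + 0.01 * rng.normal(size=x.size))
         for x in (rng.uniform(-1, 1, 200) for _ in range(3))]

X_all, y_all = stack_series(pairs)
print(X_all.shape, y_all.shape)  # (600, 1) (600,)

# A single fit then sees all series at once:
# est_gp.fit(X_all, y_all)   # est_gp configured as in the loop above
```

This replaces 300 independent fits with one fit whose fitness is measured across all series, so the winning program has to explain them jointly.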
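For the "merge" idea, the other option I can think of is to keep the per-series fits and then cross-score every candidate expression on *all* series, keeping the one with the best pooled R². A sketch, assuming `expressions` holds one SymPy expression per series (as collected in the loop above) and `pairs` the (X, y) data; `best_shared_expression` is my own hypothetical helper, and `X0` is the name gplearn gives its first feature:

```python
import numpy as np
import sympy as sp

X0 = sp.Symbol('X0')  # gplearn names the first input feature X0

def best_shared_expression(expressions, pairs):
    """Return the candidate expression with the highest R^2 pooled over all series."""
    best, best_r2 = None, -np.inf
    for expr in expressions:
        f = sp.lambdify(X0, expr, modules='numpy')
        sse = tot = 0.0
        for X, y in pairs:
            X = np.asarray(X, dtype=float).ravel()
            y = np.asarray(y, dtype=float).ravel()
            pred = np.broadcast_to(f(X), y.shape)  # constant exprs return scalars
            sse += np.sum((y - pred) ** 2)
            tot += np.sum((y - y.mean()) ** 2)
        r2 = 1.0 - sse / tot
        if r2 > best_r2:
            best, best_r2 = expr, r2
    return best, best_r2

# Illustrative data: two series sampled from the same relation y = sin(x).
rng = np.random.default_rng(1)
pairs = [(x, np.sin(x)) for x in (rng.uniform(-2, 2, 100) for _ in range(2))]
candidates = [sp.sin(X0), sp.cos(X0), X0]

best, r2 = best_shared_expression(candidates, pairs)
print(best, round(r2, 3))  # sin(X0) 1.0
```

Scoring each candidate against the pooled data seems like a fair tie-breaker, since a program that only fits its own series well is penalised on the others.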