I am trying to understand why the model's predictions are independent of changes to the covariates. Given any target series `y_train_trans` and past-covariate series `X_train_trans`, I find that `pred1` and `pred2` always end up exactly the same no matter how I change `X_train_trans`.

from darts.models import XGBModel

mymodel = XGBModel(lags=3, lags_past_covariates=2,
                   output_chunk_length=16, max_depth=5)

mymodel2 = XGBModel(lags=3, lags_past_covariates=2,
                    output_chunk_length=16, max_depth=5)

mymodel.fit(y_train_trans, past_covariates=X_train_trans)
mymodel2.fit(y_train_trans, past_covariates=X_train_trans * 5)

pred1 = mymodel.predict(5, past_covariates=X_train_trans)
pred2 = mymodel2.predict(5, past_covariates=X_train_trans * 5)

print(pred1 - pred2)  # always gives 0

There is 1 best solution below

Julien Herzen (BEST ANSWER)

This is probably because XGBoost is invariant to feature scaling here: tree-based models split each feature against a threshold, so multiplying a covariate by a constant simply scales the learned split thresholds by the same constant and leaves the fitted trees, and therefore the predictions, unchanged. Try changing the actual shape of the covariate series (rather than simply scaling it) and the results should be different.