I'm attempting to create a TemporalFusionTransformer with my own data, following this example.

I get this error when instantiating a TemporalFusionTransformer or using optimize_hyperparameters:

Error:

/usr/local/lib/python3.8/dist-packages/pytorch_forecasting/models/base_model.py in __init__(self, log_interval, log_val_interval, learning_rate, log_gradient_flow, loss, logging_metrics, reduce_on_plateau_patience, reduce_on_plateau_reduction, reduce_on_plateau_min_lr, weight_decay, optimizer_params, monotone_constaints, output_transformer, optimizer)
    260         init_args = get_init_args(frame)
    261         self.save_hyperparameters(
--> 262             {name: val for name, val in init_args.items() if name not in self.hparams and name not in ["self"]}
    263         )
    264 

AttributeError: 'tuple' object has no attribute 'items'
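The failing line is a dict comprehension over init_args, so whatever get_init_args(frame) returned here is a tuple rather than the dict pytorch-forecasting expects; newer pytorch-lightning releases appear to have changed that helper. A minimal sketch of the failure mode, with hypothetical values:

# hypothetical stand-in for what get_init_args(frame) now returns:
init_args = ("self_ref", {"learning_rate": 0.03})  # a tuple, not a dict

# mirrors line 262 of base_model.py and reproduces the error:
{name: val for name, val in init_args.items() if name not in ["self"]}
# AttributeError: 'tuple' object has no attribute 'items'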

My Code:

from pytorch_forecasting.models.temporal_fusion_transformer.tuning import optimize_hyperparameters

study = optimize_hyperparameters(
    train_dataloader,
    val_dataloader,
    model_path="optuna_test",
    n_trials=200,
    max_epochs=50,
    gradient_clip_val_range=(0.01, 1.0),
    hidden_size_range=(8, 128),
    hidden_continuous_size_range=(8, 128),
    attention_head_size_range=(1, 4),
    learning_rate_range=(0.001, 0.1),
    dropout_range=(0.1, 0.3),
    trainer_kwargs=dict(limit_train_batches=30),
    reduce_on_plateau_patience=4,
    use_learning_rate_finder=False,  # use Optuna to find ideal learning rate or use in-built learning rate finder
)
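For reference, optimize_hyperparameters returns an Optuna Study, so once a run completes the best parameters can be read with the standard Optuna API:

# inspect the best hyperparameters once the study completes
print(study.best_trial.params)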
from pytorch_forecasting import TemporalFusionTransformer
from pytorch_forecasting.metrics import QuantileLoss

tft = TemporalFusionTransformer.from_dataset(
    training,
    # not meaningful for finding the learning rate but otherwise very important
    # learning_rate=0.03,
    # hidden_size=16,  # most important hyperparameter apart from learning rate
    # attention_head_size=1,  # number of attention heads; set up to 4 for large datasets
    # dropout=0.1,  # between 0.1 and 0.3 are good values
    # hidden_continuous_size=8,  # set to <= hidden_size
    # output_size=7,  # 7 quantiles by default
    # loss=QuantileLoss(),
    # reduce learning rate if no improvement in validation loss after x epochs
    # reduce_on_plateau_patience=4,
)

I'm assuming something is wrong in my TimeSeriesDataSet instantiation, but I can't figure out what:

# Create a TimeSeriesDataSet
from pytorch_forecasting import TimeSeriesDataSet

max_prediction_length = 7 * 4 * 6  # 24 weeks
training_cutoff = data["time_indx"].max() - max_prediction_length

training = TimeSeriesDataSet(
    data[lambda x: x.time_indx <= training_cutoff],
    group_ids=["group_id"],
    target="Close",
    time_idx="time_indx",
    min_encoder_length=3,
    max_encoder_length=30,
    min_prediction_length=5,
    max_prediction_length=max_prediction_length,
    time_varying_known_reals=['Date', 'Open', 'High', 'Low', 'Volume'],
    time_varying_unknown_reals=["Close"]
)
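For completeness, the train_dataloader and val_dataloader passed to optimize_hyperparameters above would typically be built from this dataset as below; the calls are the standard pytorch-forecasting ones, and batch_size=64 is an arbitrary choice:

# validation set reuses the normalization/encoders fitted on the training set
validation = TimeSeriesDataSet.from_dataset(training, data, predict=True, stop_randomization=True)

# wrap both datasets in dataloaders
train_dataloader = training.to_dataloader(train=True, batch_size=64, num_workers=0)
val_dataloader = validation.to_dataloader(train=False, batch_size=64, num_workers=0)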

I've tried uninstalling and reinstalling pytorch-lightning and pytorch-forecasting, but that didn't help.

Any ideas?

Answer:

I faced the same issue: previously working code suddenly started failing with this exact error. I was able to fix it by explicitly pinning pytorch-lightning to version 1.8.6 (the most current one is 1.9.2 IIRC, which probably breaks something in pytorch-forecasting). Let me know if this was useful to you.
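A minimal sketch of that workaround, assuming the 1.8.6 pin is what resolves it for you as well:

# check which version is currently installed
import pytorch_lightning
print(pytorch_lightning.__version__)  # e.g. '1.9.2' on an affected install

# if it is newer than 1.8.6, pin it from the shell and restart the runtime:
#   pip install "pytorch-lightning==1.8.6"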