Is there a way to plot the training losses from a model trained with the time series library "Darts"? I am working with RNNs from the Darts library for time series forecasting, and I am wondering whether there is a way to plot the training loss over the epochs.
With Keras I do something like this:
from keras.models import Sequential
from keras.layers import LSTM, Dense
import matplotlib.pyplot as plt

model = Sequential()
model.add(LSTM(100, activation='relu', input_shape=(n_input, n_features)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')
# ... after calling model.fit(...), the per-epoch loss is stored on the history:
loss_per_epoch = model.history.history['loss']
plt.plot(range(len(loss_per_epoch)), loss_per_epoch)
plt.show()
Maybe there is a way via TensorBoard?
It looks like you can wrap your data in a darts.TimeSeries object and then access the plotting from there. E.g.: