Assigning time as the learning termination criterion of a Keras Sequential neural network


I have developed the NN model below, which has two hidden layers of 19 nodes each. The initial activation is "tanh".

As you can see, I have two initial callbacks. The first one (es) is EarlyStopping, which stops training if the validation accuracy does not improve for 10 consecutive epochs, and the second one (time_callback) records the time spent in each epoch.

After running this model, I have the total time elapsed until the final epoch (call this "t").

Now the question is: I have to run the model again, but this time with, for example, "relu" as the activation. However, I have to remove the "es" callback and the other callback, and instead set that measured time "t" as the stopping criterion. In other words, the second model with the "relu" function should continue to learn only until time "t" and then stop!

Does anyone know how I can set this up?

import time
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

class TimeHistory(keras.callbacks.Callback):
    """Records the wall-clock duration of each epoch."""

    def on_train_begin(self, logs=None):
        self.times = []

    def on_epoch_begin(self, epoch, logs=None):
        self.epoch_time_start = time.time()

    def on_epoch_end(self, epoch, logs=None):
        self.times.append(time.time() - self.epoch_time_start)
        

model = Sequential()
model.add(Dense(19, input_shape=[X_train.shape[1]], activation='tanh', name="layer1"))
model.add(Dense(19, activation='tanh', name="layer2"))
model.add(Dense(1, activation='sigmoid', name="Output"))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

time_callback = TimeHistory()

es = EarlyStopping(monitor='val_accuracy', mode='max', verbose=1, patience=10)

# fit network
history = model.fit(X_train, y_train,
                    validation_split=0.3,
                    epochs=1000,
                    batch_size=10,
                    shuffle=False,
                    verbose=1,
                    callbacks=[es,time_callback])

times = time_callback.times
t = sum(times)  # total training time of the first model

#Then we move to second model with 'relu' function
model2 = Sequential()
model2.add(Dense(19, input_shape=[X_train.shape[1]], activation='relu', name="layer1"))
model2.add(Dense(19, activation='relu', name="layer2"))
model2.add(Dense(1, activation='sigmoid', name="Output"))
model2.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
history2 = model2.fit(X_train, y_train,
                     validation_split=0.3,
                     epochs=100,
                     batch_size=10,
                     shuffle=False,
                     verbose=1,
                     callbacks=????????????????)  # <-- what goes here to stop after time "t"?
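One possible approach (a sketch I am adding here, not part of the original post) is a custom callback that records the training start time and sets the model's `stop_training` flag once the elapsed wall-clock time reaches a budget. The class name `TimeLimit` and the `max_seconds` parameter are my own invention; the fallback base class only exists so the timing logic can run without TensorFlow installed.

```python
import time

# Sketch of a time-budget stopping callback. Assumes tf.keras; if
# TensorFlow is unavailable, a minimal stand-in base class is used so
# the timing logic itself can still be exercised.
try:
    from tensorflow import keras
    _Base = keras.callbacks.Callback
except ImportError:
    class _Base:
        def set_model(self, model):
            self.model = model

class TimeLimit(_Base):
    """Stop training once cumulative wall-clock time exceeds max_seconds."""

    def __init__(self, max_seconds):
        super().__init__()
        self.max_seconds = max_seconds

    def on_train_begin(self, logs=None):
        self.start_time = time.time()

    def on_epoch_end(self, epoch, logs=None):
        if time.time() - self.start_time >= self.max_seconds:
            # Keras checks this flag after each epoch and halts the fit loop.
            self.model.stop_training = True
```

With this, the second model could be fitted with `callbacks=[TimeLimit(max_seconds=t)]`, where `t = sum(time_callback.times)` from the first run. Note this stops at epoch boundaries, so the second model may slightly overshoot "t" by up to one epoch's duration.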