How can I reduce the batch size after a certain epoch, or implement an adaptive batch size, in deep learning?


I've implemented a BiLSTM model for classifying texts as positive or negative sentiment. In this code the batch size is constant at 256 and the number of epochs is 60, but I want to reduce the batch size every 10 epochs. For example, after 10 epochs the batch size should drop from 256 to 128, after another 10 epochs to 64, and so on. Is it possible to do that? If so, how?

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Embedding, Bidirectional, LSTM,
                                     Dropout, Dense, BatchNormalization)
from tensorflow.keras.initializers import Constant

BATCH_SIZE = 256
NUM_EPOCHS = 60

modelBiLstm = Sequential()
modelBiLstm.add(Embedding(vocab_sz, EMBED_SIZE, input_length=maxlen,
                          embeddings_initializer=Constant(embed_matrix)))
modelBiLstm.add(Bidirectional(LSTM(256, return_sequences=True)))
modelBiLstm.add(Dropout(0.2))
modelBiLstm.add(Dense(128, activation='relu'))
modelBiLstm.add(Dropout(0.2))
modelBiLstm.add(BatchNormalization())
modelBiLstm.add(Dense(2, activation='softmax'))

# 'categorical_crossentropy' matches the 2-unit softmax output
# (with one-hot labels); 'binary_crossentropy' would only be correct
# for a single sigmoid unit.
modelBiLstm.compile(loss='categorical_crossentropy', optimizer='adam',
                    metrics=['accuracy'])

history = modelBiLstm.fit(Xtrain, Ytrain, batch_size=BATCH_SIZE,
                          epochs=NUM_EPOCHS,
                          validation_data=(Xtest, Ytest),
                          callbacks=[lr_finder])
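One possible approach, since Keras fixes `batch_size` for the duration of a single `fit()` call: split training into stages and call `fit()` once per stage with a smaller batch size, passing `initial_epoch` so the epoch numbering (and the `History` logs) continue where the previous stage left off. This is a sketch, not the only way to do it; the helper `batch_size_for_epoch` and its parameters (`interval`, `min_bs`) are names I've made up for illustration.

```python
def batch_size_for_epoch(epoch, initial_bs=256, interval=10, min_bs=32):
    """Halve the batch size every `interval` epochs, never going below `min_bs`.

    Hypothetical helper: epoch 0-9 -> 256, 10-19 -> 128, 20-29 -> 64, ...
    """
    return max(min_bs, initial_bs // (2 ** (epoch // interval)))


NUM_EPOCHS = 60
STAGE_LEN = 10  # shrink the batch size every 10 epochs

start = 0
while start < NUM_EPOCHS:
    bs = batch_size_for_epoch(start)
    end = min(start + STAGE_LEN, NUM_EPOCHS)
    # In the actual training script this would be (model/data as in the question):
    # modelBiLstm.fit(Xtrain, Ytrain, batch_size=bs,
    #                 epochs=end, initial_epoch=start,
    #                 validation_data=(Xtest, Ytest), callbacks=[lr_finder])
    print(f"epochs {start}-{end - 1}: batch_size={bs}")
    start = end
```

Each `fit()` call resumes from the model's current weights, so the stages together behave like one 60-epoch run with a decaying batch size.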
