Importing data to tensorflow autoencoders through ImageDataGenerator


When I train the autoencoder by importing the images as NumPy arrays, training proceeds quickly, the training loss is already close to 0 at the first epoch, and the results are decent.

But when I import the same data through ImageDataGenerator, the starting loss is around 32000; it decreases very slowly as training proceeds and saturates at around 31000 after 50 epochs. I used MSE as the loss function with the Adam optimizer. I tried different loss functions, but the problem persists: a very high value at the start that quickly saturates at a still-high value. Any suggestions are welcome. Thanks.
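A quick sanity check (a sketch with synthetic data, not the actual images) shows why the two pipelines report such different loss scales: if the NumPy-array pipeline fed pixels already scaled to [0, 1] while flow_from_directory yields raw [0, 255] pixels, the MSE is inflated by exactly a factor of 255² = 65025, which is enough to explain a starting loss in the tens of thousands.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for a batch of 64x64 grayscale targets and reconstructions
target = rng.uniform(0, 255, size=(4, 64, 64, 1))
recon = rng.uniform(0, 255, size=(4, 64, 64, 1))

mse_raw = np.mean((target - recon) ** 2)                  # pixels in [0, 255]
mse_scaled = np.mean((target / 255 - recon / 255) ** 2)   # pixels in [0, 1]

print(mse_raw / mse_scaled)  # exactly 255**2 = 65025 (up to float error)
```

The ratio is exact because ((a - b) / 255)² = (a - b)² / 255² for every pixel pair.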

Following is my code.

from convautoencoder import ConvAutoencoder
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import ModelCheckpoint
from tensorflow.config import experimental

import config  # project module (not shown) holding dataset paths such as TRAIN_PATH

# let GPU memory grow on demand instead of being allocated all up front
devices = experimental.list_physical_devices('GPU')
experimental.set_memory_growth(devices[0], True)

EPOCHS = 5000
BS = 4

trainAug = ImageDataGenerator()
valAug = ImageDataGenerator()

# initialize the training generator
trainGen = trainAug.flow_from_directory(
    config.TRAIN_PATH,
    class_mode="input",
    classes=None,
    target_size=(64, 64),
    color_mode="grayscale",
    shuffle=True,
    batch_size=BS)

# initialize the validation generator
valGen = valAug.flow_from_directory(
    config.TRAIN_PATH,
    class_mode="input",
    classes=None,
    target_size=(64, 64),
    color_mode="grayscale",
    shuffle=False,
    batch_size=BS)

# initialize the testing generator
testGen = valAug.flow_from_directory(
    config.TRAIN_PATH,
    class_mode="input",
    classes=None,
    target_size=(64, 64),
    color_mode="grayscale",
    shuffle=False,
    batch_size=BS)

mc = ModelCheckpoint('best_model_1.h5', monitor='val_loss', mode='min', save_best_only=True)

print("[INFO] building autoencoder...")
(encoder, decoder, autoencoder) = ConvAutoencoder.build(64, 64, 1)
opt = Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-04, amsgrad=False)
# hinge was one of the losses tried; mse shows the same behaviour
autoencoder.compile(loss="hinge", optimizer=opt)

# batch_size comes from the generator, so it is not passed to fit()
H = autoencoder.fit(trainGen, validation_data=valGen, epochs=EPOCHS, callbacks=[mc])

1 answer


OK, this was a silly mistake.

Adding the rescale factor rescale=1. / 255 to the ImageDataGenerator solved the problem.
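For reference, a minimal sketch of what the fix does (assuming the same generator setup as in the question, i.e. trainAug = ImageDataGenerator(rescale=1. / 255)): with rescale set, the generator multiplies every pixel by the rescale factor before yielding the batch, so inputs and targets land in [0, 1], matching the NumPy-array pipeline.

```python
import numpy as np

# What rescale=1. / 255 does inside ImageDataGenerator: each batch is
# multiplied by the rescale factor before it is yielded.
rescale = 1. / 255

batch = np.array([[0., 64., 128., 255.]])  # raw grayscale pixel values
scaled = batch * rescale                   # what the generator yields

print(scaled.min(), scaled.max())  # 0.0 1.0
```

With both inputs and reconstruction targets in [0, 1], MSE values return to the small magnitudes seen with the NumPy-array pipeline.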