I have spent two days trying to use Neural Structured Learning to add adversarial regularization to my CNN model. I use ImageDataGenerator and flow_from_directory, and when I call model.fit_generator I get the following error:
ValueError: When passing input data as arrays, do not specify steps_per_epoch/steps argument. Please use batch_size instead.
I am using Keras 2.3.1 with TensorFlow 2.0 as the backend.
This is a snippet of my code:
import neural_structured_learning as nsl
from keras.models import Sequential
from keras.preprocessing.image import ImageDataGenerator

num_classes = 4
img_rows, img_cols = 224, 224
batch_size = 16
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=30,
    width_shift_range=0.3,
    height_shift_range=0.3,
    horizontal_flip=True,
    fill_mode='nearest')
validation_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size=(img_rows, img_cols),
    batch_size=batch_size,
    shuffle=True,
    class_mode='categorical')
validation_generator = validation_datagen.flow_from_directory(
    validation_data_dir,
    target_size=(img_rows, img_cols),
    batch_size=batch_size,
    shuffle=True,
    class_mode='categorical')
def vgg():
    model1 = Sequential([])  # layers omitted from this snippet
    return model1
base_model = vgg()
I adapt the data generated in (x, y) format into the dictionary format that NSL expects:
def convert_training_data_generator():
    for x, y in train_generator:
        return {'feature': x, 'label': y}

def convert_testing_data_generator():
    for x, y in validation_generator:
        return {'feature': x, 'label': y}
adv_config = nsl.configs.make_adv_reg_config(multiplier=0.2, adv_step_size=0.05)
model = nsl.keras.AdversarialRegularization(base_model, adv_config=adv_config)
train = convert_training_data_generator()
test = convert_testing_data_generator()
history = model.fit_generator(
    train,
    steps_per_epoch=nb_train_samples // batch_size,
    epochs=epochs,
    callbacks=callbacks,
    validation_data=test,
    validation_steps=nb_validation_samples // batch_size)
I think the same error is happening here. Maybe you should consider using the model.fit() function instead. In that case you should pass your train inputs, your train labels, and the batch_size. To figure out the difference between fit and fit_generator, you can follow this link.
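To make that concrete, here is a minimal sketch of how the training call could look once the generators are adapted for model.fit(). It assumes the tf.keras API of TensorFlow 2.x (needed for model.fit to accept a Python generator) and reuses base_model, train_generator, validation_generator, nb_train_samples, nb_validation_samples, batch_size, epochs, and callbacks from the question; the helper name to_dict_generator and the compile settings (Adam optimizer, categorical cross-entropy) are illustrative assumptions, not taken from the original code.

import neural_structured_learning as nsl

# Hypothetical helper: yields every batch from a Keras directory iterator
# as the {'feature': ..., 'label': ...} dict that AdversarialRegularization
# expects. It yields indefinitely (flow_from_directory loops forever),
# unlike convert_training_data_generator(), which returns only the first batch.
def to_dict_generator(keras_iterator):
    for x, y in keras_iterator:
        yield {'feature': x, 'label': y}

adv_config = nsl.configs.make_adv_reg_config(multiplier=0.2, adv_step_size=0.05)
adv_model = nsl.keras.AdversarialRegularization(base_model, adv_config=adv_config)

# Compile settings are assumptions; use whatever your base model was compiled with.
adv_model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

# Because the generators never end, steps_per_epoch and validation_steps
# define how many dict batches make up one epoch.
history = adv_model.fit(
    to_dict_generator(train_generator),
    steps_per_epoch=nb_train_samples // batch_size,
    epochs=epochs,
    callbacks=callbacks,
    validation_data=to_dict_generator(validation_generator),
    validation_steps=nb_validation_samples // batch_size)

Using 'feature' and 'label' as the dict keys matches what you already do, and 'label' is the default label key that nsl.keras.AdversarialRegularization strips out before forwarding the remaining features to the base model.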