Meta-learning to find the optimal model from pre-trained models in TensorFlow


I have many pre-trained models with different numbers of layers (the models are not Sequential). The training data for these models had shape (1, 1, 103), and the output was a class label between 0 and 9.

I loaded these saved models and set all their layers as non-trainable; a sketch of that step is below.
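The loading/freezing step, as a minimal sketch (the file names here are hypothetical; the real models were saved elsewhere):

from tensorflow import keras

# Load the saved models and freeze them so only the new layers train.
model_1 = keras.models.load_model("model_1.h5")
model_2 = keras.models.load_model("model_2.h5")
model_3 = keras.models.load_model("model_3.h5")
model_4 = keras.models.load_model("model_4.h5")
for m in (model_1, model_2, model_3, model_4):
    m.trainable = False

I then used these frozen models in a new architecture as follows: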

inp = keras.layers.Input(shape=(1, 1, 103), name="new_input")

out_1 = model_1(inp)  # model_1 is the variable holding the first loaded model
out_2 = model_2(inp)
out_3 = model_3(inp)
out_4 = model_4(inp)

x = keras.layers.concatenate([out_1, out_2, out_3, out_4])
out = keras.layers.Dense(1)(x)

model = keras.models.Model(inputs=inp, outputs=out, name="meta_model")

I compiled this model with optimizer="sgd" and loss="mse".

I didn't get any error up to this point, but when I run model.fit(), I get this error:

TypeError: int() argument must be a string, a bytes-like object or a number, not 'NoneType'
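The compile/fit step above, as a short sketch (the x_train/y_train arrays here are random stand-ins with the shapes described at the top of the question, not the real data):

import numpy as np

x_train = np.random.rand(256, 1, 1, 103).astype("float32")  # stand-in inputs
y_train = np.random.randint(0, 10, size=(256, 1)).astype("float32")  # stand-in labels

model.compile(optimizer="sgd", loss="mse")
model.fit(x_train, y_train, epochs=5)  # the reported TypeError surfaces here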

I'm not sure where I'm going wrong.

The previous models were trained with the "adam" optimizer and "sparse_categorical_crossentropy" loss, and the dataset had 10 classes.

The objective of this model was to train it on the same data and find out which of the previously trained models was optimal.

Any other solution/suggestion for finding an optimal number of layers using meta-learning would also be appreciated. I can find the optimal number of layers manually by trial and error, but I want the meta-model to find it based on the dataset.

For example, training on dataset1 showed no significant increase in accuracy after 7 layers, whereas dataset2 reached its peak at 4 layers and adding more layers was useless.


There is 1 answer below:


For hyperparameter tuning I can recommend Ray Tune. I use it and like this framework very much.

https://docs.ray.io/en/latest/tune/examples/tune_mnist_keras.html
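For the specific goal in the question (searching over the number of layers), a rough sketch of what that could look like with the older tune.run API is below; every name is illustrative, the data is random stand-in data, and newer Ray versions use tune.Tuner instead:

import numpy as np
from ray import tune
from tensorflow import keras

def train_model(config):
    # Random stand-in data with the shapes from the question.
    x = np.random.rand(256, 1, 1, 103).astype("float32")
    y = np.random.randint(0, 10, size=(256,))

    # The number of hidden layers is the hyperparameter being searched.
    inp = keras.layers.Input(shape=(1, 1, 103))
    h = keras.layers.Flatten()(inp)
    for _ in range(config["num_layers"]):
        h = keras.layers.Dense(64, activation="relu")(h)
    out = keras.layers.Dense(10, activation="softmax")(h)

    model = keras.models.Model(inputs=inp, outputs=out)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x, y, validation_split=0.2, epochs=10, verbose=0)
    # Report the final validation accuracy back to Tune.
    tune.report(val_accuracy=history.history["val_accuracy"][-1])

analysis = tune.run(
    train_model,
    config={"num_layers": tune.grid_search([2, 4, 6, 8])},
)
print(analysis.get_best_config(metric="val_accuracy", mode="max"))

grid_search just tries each depth once; for larger searches, Ray Tune's schedulers (e.g. ASHAScheduler) can stop unpromising depths early.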