I am building a DNN with a custom loss function and training it with GradientTape in TensorFlow/Keras. The code runs without any errors, but as far as I can tell the weights of the DNN are not being updated at all. I followed exactly what the TensorFlow website recommends and searched for answers, but I still don't understand the reason. Here is my code:
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, LeakyReLU, Concatenate
from tensorflow.keras.models import Model
from tensorflow.keras import backend as K
from tensorflow.keras import optimizers
# Generate a random train data
c0_train = np.array([30 * np.random.uniform() for i in range(10000)])
# Build a simple DNN
c0_input = Input(shape=(1,), name='c0')
hidden_1 = Dense(100)(c0_input)
activation_1 = LeakyReLU(alpha=0.1)(hidden_1)
hidden_2 = Dense(100)(activation_1)
activation_2 = LeakyReLU(alpha=0.1)(hidden_2)
hidden_3 = Dense(100)(activation_2)
activation_3 = LeakyReLU(alpha=0.1)(hidden_3)
x0_output = Dense(1, name='x0')(activation_3)
model = Model(inputs=c0_input, outputs=x0_output)
# Calculating the loss function
def cal_loss(c0_input):
    x0_output = model(c0_input)
    loss = tf.reduce_mean(
        tf.multiply(c0_input, tf.square(tf.subtract(x0_output, c0_input))))
    return loss
# Compute the gradient calculation
@tf.function
def compute_loss_grads(c0_input):
    with tf.GradientTape() as tape:
        loss = cal_loss(c0_input)
    grads = tape.gradient(loss, model.trainable_variables)
    return loss, grads
# Optimizer
opt = optimizers.Adam(learning_rate=0.01)
# Start looping
for epoch in range(50):
    print('Epoch = ', epoch)
    # Compute the loss and gradients
    loss, grads = compute_loss_grads(tf.cast(c0_train, tf.float32))
    # Adjust the weights of the model
    opt.apply_gradients(zip(grads, model.trainable_variables))
I have checked the weights of the model using model.get_weights() and they look exactly the same before and after running the loop. So what is the problem here? And one more question: how can I print out the loss for every epoch?
The weights do change. You can check as follows: after building the model, save its weights (these are the initial weights).
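A minimal sketch of that step (the file name 'init_weights.h5' is just an example):

model.save_weights('init_weights.h5')   # snapshot of the initial weights on disk
w_before = model.get_weights()          # or keep an in-memory copy for comparison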
Now run your training loop. After training finishes, get the new weights as follows and compare them with the initial ones.
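For example (a sketch, assuming the in-memory copy w_before from above):

w_after = model.get_weights()
# True as soon as at least one weight array differs from its initial value
changed = any(not np.array_equal(b, a) for b, a in zip(w_before, w_after))
print('Weights changed:', changed)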
And this is how you can print your loss score in every epoch.
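One way is to use the loss value that compute_loss_grads already returns (a sketch of the modified loop):

for epoch in range(50):
    loss, grads = compute_loss_grads(tf.cast(c0_train, tf.float32))
    opt.apply_gradients(zip(grads, model.trainable_variables))
    # .numpy() turns the scalar loss tensor into a plain float for printing
    print('Epoch =', epoch, 'loss =', loss.numpy())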