How to define a Laplacian loss function in Keras?


I have created a deep learning model for image edge detection and I want to optimize its parameters using a Laplacian loss function in the Keras framework. The Laplacian loss is defined by the equation Lap_loss = MSE(Lap_gt, Lap_pred), where MSE is the mean squared error, Lap_gt is the Laplacian of the ground-truth image, and Lap_pred is the Laplacian of the predicted image.

I have defined the Laplacian loss using OpenCV's Laplacian() function, but I get errors I don't understand when fitting the model. The loss function is defined as follows:

import cv2
from tensorflow import keras

def lap_loss(y_true, y_pred):
  # Laplacian of the ground-truth and predicted images (OpenCV, 3x3 kernel)
  lap_true = cv2.Laplacian(y_true, ddepth=cv2.CV_64F, ksize=3)
  lap_pred = cv2.Laplacian(y_pred, ddepth=cv2.CV_64F, ksize=3)
  # MSE between the two Laplacian maps
  loss = keras.losses.MeanSquaredError(lap_true, lap_pred)
  return loss
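
For context, cv2.Laplacian() operates on NumPy arrays, while a Keras loss function receives TensorFlow tensors and has to be built from TensorFlow (or Keras backend) operations so that gradients can flow through it; calling OpenCV on y_true/y_pred is presumably what triggers the errors. Below is a minimal sketch of a purely tensor-based version that applies the Laplacian as a fixed 3x3 convolution. It assumes single-channel images in NHWC layout, and the 4-neighbour kernel [[0, 1, 0], [1, -4, 1], [0, 1, 0]] is an assumption that may not match the exact aperture OpenCV uses with ksize=3.

import tensorflow as tf

# Fixed 3x3 Laplacian kernel, shaped [height, width, in_channels, out_channels].
# Assumes single-channel (grayscale) images; this 4-neighbour kernel may differ
# in scale from OpenCV's ksize=3 aperture.
_LAP_KERNEL = tf.reshape(
    tf.constant([[0., 1., 0.],
                 [1., -4., 1.],
                 [0., 1., 0.]], dtype=tf.float32),
    [3, 3, 1, 1])

def lap_loss(y_true, y_pred):
  y_true = tf.cast(y_true, tf.float32)
  y_pred = tf.cast(y_pred, tf.float32)
  # Apply the Laplacian as a convolution so the operation stays differentiable.
  lap_true = tf.nn.conv2d(y_true, _LAP_KERNEL, strides=1, padding='SAME')
  lap_pred = tf.nn.conv2d(y_pred, _LAP_KERNEL, strides=1, padding='SAME')
  # MSE between the two Laplacian maps.
  return tf.reduce_mean(tf.square(lap_true - lap_pred))

A function like this can then be passed directly to compile, e.g. model.compile(optimizer='adam', loss=lap_loss).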