I've run into a problem implementing a custom objective function for LightGBM's LGBMClassifier.
I'm trying to reproduce the log loss function (I've read that it's also called
cross-entropy; please correct me if I've got that wrong, I'm still learning)
and use it as my objective function. From everything I've read, it's already
the default loss for binary classification, so there's really no need to write
my own; I'm doing it mainly as a learning exercise.
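For reference, the formula I'm trying to reproduce (as I understand it, with y the true label in {0, 1} and p the predicted probability of the positive class) is:

$$ L(y, p) = -\bigl[\, y \log(p) + (1 - y) \log(1 - p) \,\bigr] $$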
This is my code:
import numpy as np
from scipy.misc import derivative  # note: scipy.misc.derivative is deprecated in newer SciPy releases

def my_naive_log_loss_function(y_true, y_pred, eps=1e-15):
    # element-wise binary log loss; clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -y_true * np.log(y_pred) - (1 - y_true) * np.log(1 - y_pred)

def my_naive_log_loss_metric(y_true, y_pred):
    # custom eval metric for LightGBM: returns (name, value, is_higher_better)
    loss = np.mean(my_naive_log_loss_function(y_true, y_pred))
    return "my_naive_log_loss_metric", loss, False

def my_naive_log_loss_objective(y_true, y_pred):
    # first and second numerical derivatives of the loss w.r.t. the predictions
    func = lambda z: my_naive_log_loss_function(y_true, z)
    gradient = derivative(func, y_pred, n=1, dx=1e-6)
    hessian = derivative(func, y_pred, n=2, dx=1e-6)
    return gradient, hessian
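To keep this reproducible, here's the kind of data setup I'm using; make_classification is just a hypothetical stand-in for my real dataset:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# placeholder binary classification data (my actual dataset looks similar)
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)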
Then I plug the objective into the classifier and fit:

from lightgbm import LGBMClassifier

custom_loss_ens = LGBMClassifier(objective=my_naive_log_loss_objective, boosting_type="dart",
                                 n_estimators=100, max_depth=10, random_state=42)
custom_loss_ens.fit(X_train, y_train)
When I call custom_loss_ens.predict(X_val), it doesn't raise an exception, but LightGBM logs this warning:

Cannot compute class probabilities or labels due to the usage of customized objective function. Returning raw scores instead.

and the returned array is all zeros.
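In case it matters, this is exactly how I'm checking the output (preds is just a throwaway name for this post):

preds = custom_loss_ens.predict(X_val)
print(preds[:10])  # every entry is 0.0 for me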
How can I do this properly, so that predict gives me class labels (or probabilities) again?