How to use tanh instead of sigmoid in sklearn logistic regression


Is there a way to run sklearn's logistic regression with tanh?

I have read that tanh is better when the labels are {-1, 1} and sigmoid is better when the labels are {0, 1}.

If I can't use tanh, would converting the labels from {-1, 1} to {0, 1} improve the performance of logistic regression with the sigmoid activation function?

2 Answers

BEST ANSWER

There is no such thing as "tanh is better when labels are {-1, 1} and sigmoid is better when they are {0, 1}".

In the end, the model has no idea about the labels or their meaning. It just learns a probability distribution for binary classification. tanh(x) maps the input to the interval (-1, 1) and sigmoid(x) maps the input to the interval (0, 1). What you basically do is treat this output as a probability and say: if the output is larger than 0.5, the sample belongs to class 1, otherwise to class 0 (in the case of sigmoid).
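A quick numerical sanity check (plain NumPy, not sklearn internals): tanh is just a shifted and rescaled sigmoid, tanh(z) = 2·sigmoid(2z) − 1, so thresholding the sigmoid at 0.5 makes exactly the same decisions as thresholding tanh at 0. The `sigmoid` helper below is defined here for illustration.

```python
import numpy as np

def sigmoid(z):
    # standard logistic function, maps R -> (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-5, 5, 11)

# tanh(z) equals 2*sigmoid(2z) - 1 everywhere
lhs = np.tanh(z)
rhs = 2.0 * sigmoid(2.0 * z) - 1.0

# thresholding sigmoid at 0.5 == thresholding tanh at 0
same_decisions = np.array_equal(sigmoid(z) > 0.5, np.tanh(z) > 0)
```

So switching the activation only rescales the output; it cannot change which side of the decision boundary a point falls on.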

So yes, you can convert your labels from {-1, 1} to {0, 1}, or even to {9, 10}; it makes no difference to the fit.
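This is easy to verify directly with sklearn: fitting `LogisticRegression` on the same data with {-1, 1} labels and with remapped {0, 1} labels gives the same coefficients and the same predictions (only the class names differ). The synthetic data here is purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y_pm = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # labels in {-1, 1}
y_01 = (y_pm + 1) // 2                          # same labels mapped to {0, 1}

clf_pm = LogisticRegression().fit(X, y_pm)
clf_01 = LogisticRegression().fit(X, y_01)

# map the {-1, 1} predictions onto {0, 1} to compare class by class
pred_pm = (clf_pm.predict(X) + 1) // 2
pred_01 = clf_01.predict(X)
agree = np.array_equal(pred_pm, pred_01)
```

The label values act only as class names; the optimization problem sklearn solves is identical in both cases.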


Think again about what logistic regression is doing: it models a probability, and a probability only ranges from 0 to 1.

Logistic regression is a little different from other ML classifiers such as SVMs or tree-based models. Those try to find the decision boundary directly, while logistic regression models a probability and then applies a threshold, which can be any number between 0 and 1, to make the final classification.

Actually, you can replace the sigmoid with any mathematical function that maps a real number into the range (0, 1). Examples are the CDF of the standard normal distribution (the probit link) or the complementary log-log link.
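A small sketch of those alternative link functions (the helper names `probit` and `cloglog_inv` are my own; only `scipy.stats.norm.cdf` is a real library call). Each one maps the real line into (0, 1), so each could in principle replace the sigmoid as the probability model:

```python
import numpy as np
from scipy.stats import norm

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def probit(z):
    # CDF of the standard normal distribution
    return norm.cdf(z)

def cloglog_inv(z):
    # inverse of the complementary log-log link
    return 1.0 - np.exp(-np.exp(z))

z = np.linspace(-3, 3, 7)
p_sigmoid = sigmoid(z)
p_probit = probit(z)
p_cloglog = cloglog_inv(z)
```

Note that sklearn's `LogisticRegression` hard-codes the sigmoid; to fit a probit or cloglog model you would use a GLM library such as statsmodels instead.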

If you replace the sigmoid with tanh, the model is no longer a "logistic regression", because you are no longer modelling a probability.