Is there a way to run sklearn's logistic regression with tanh?
I've read that tanh is better when the labels are {-1, 1} and sigmoid is better when the labels are {0, 1}.
If I can't implement logistic regression with tanh, would converting the labels from {-1, 1} to {0, 1} improve the performance of logistic regression with the sigmoid activation function?
There is no such rule that tanh is better when the labels are {-1, 1} and sigmoid is better when they are {0, 1}.
In the end, the model has no idea about the label values or their meaning; it just learns a probability distribution for binary classification.
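To illustrate this with a minimal sketch (the data here is made up for demonstration): sklearn's LogisticRegression accepts {-1, 1} labels as-is, so you don't even have to convert them before fitting.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # labels in {-1, 1}

clf = LogisticRegression().fit(X, y)
print(clf.classes_)        # [-1  1] -- the original label set is kept
print(clf.predict(X[:5]))  # predictions come back as -1 or 1
```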
tanh(x) maps the input to the interval [-1, 1] and sigmoid(x) maps the input to the interval [0, 1]. What you do, basically, is treat this output as a probability and say: if the output is larger than 0.5, the sample belongs to class 1, otherwise to class 0 (in the sigmoid case). So yes, you can convert your labels from {-1, 1} to {0, 1}, or even to {9, 10}.
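A quick sketch of why the relabeling changes nothing (again with hypothetical data): the fitted decision boundary and the class assignments are identical whichever label set you use, and the two activations encode the same decision rule because tanh(z) = 2 * sigmoid(2z) - 1.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y_pm1 = np.where(X[:, 0] - 0.5 * X[:, 1] > 0, 1, -1)  # labels in {-1, 1}
y_01 = (y_pm1 + 1) // 2                               # same labels mapped to {0, 1}

clf_pm1 = LogisticRegression().fit(X, y_pm1)
clf_01 = LogisticRegression().fit(X, y_01)

print(np.allclose(clf_pm1.coef_, clf_01.coef_))    # True: identical boundary
print(np.array_equal(clf_pm1.predict(X) == 1,
                     clf_01.predict(X) == 1))      # True: same class assignments

# tanh(z) = 2 * sigmoid(2z) - 1, so tanh(z) > 0 exactly when sigmoid(2z) > 0.5.
z = np.linspace(-3, 3, 7)
print(np.allclose(np.tanh(z), 2 / (1 + np.exp(-2 * z)) - 1))  # True
```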