Information gain vs minimizing entropy


In what scenario is maximizing information gain not equivalent to minimizing entropy? More broadly, why do we need the concept of information gain at all? Is it not sufficient to work with entropy alone when choosing the next attribute to split on in a decision tree?

Maximizing information gain (also known as the mutual information between the attribute and the class label) gives the same result as minimizing entropy. By definition,

IG(S, A) = H(S) − Σ_v (|S_v| / |S|) · H(S_v),

where H(S) is the entropy of the parent node S and the sum runs over the subsets S_v produced by splitting on attribute A. When you are choosing the next attribute at a fixed node, H(S) is the same for every candidate attribute, so maximizing IG is exactly equivalent to minimizing the weighted average entropy of the children: the two criteria always pick the same attribute.

Information gain is still a useful concept in its own right because it expresses split quality as an absolute reduction in uncertainty, which makes splits comparable across nodes and is the quantity that refinements such as the gain ratio are built on.
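To make the equivalence concrete, here is a minimal sketch (the toy dataset and helper names are my own, not from the question) that computes both criteria and checks that they select the same attribute:

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def split_entropy(rows, labels, attr):
    """Weighted average entropy of the child nodes after splitting on attr."""
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def information_gain(rows, labels, attr):
    """IG = parent entropy minus weighted child entropy."""
    return entropy(labels) - split_entropy(rows, labels, attr)

# Toy data: attribute "a" perfectly predicts the class, "b" is uninformative.
rows = [{"a": 0, "b": 0}, {"a": 0, "b": 1}, {"a": 1, "b": 0}, {"a": 1, "b": 1}]
labels = [0, 0, 1, 1]

# The parent entropy H(S) is the same constant for every candidate attribute,
# so argmax of IG and argmin of weighted child entropy must coincide.
best_by_ig = max(["a", "b"], key=lambda f: information_gain(rows, labels, f))
best_by_h = min(["a", "b"], key=lambda f: split_entropy(rows, labels, f))
assert best_by_ig == best_by_h == "a"
```

Splitting on "a" yields pure children (weighted child entropy 0, IG of 1 bit), while splitting on "b" leaves the entropy unchanged (IG of 0), so both criteria choose "a".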