In what scenario is maximizing information gain not equivalent to minimizing entropy? The broader question is: why do we need the concept of information gain at all? Is it not sufficient to work only with entropy when deciding the next optimal attribute of a decision tree?
Information gain vs minimizing entropy
2.5k Views. Asked by Pradeep Vairamani.
There is 1 best solution below.
Maximizing information gain (IG, also known as the mutual information between the split attribute and the class label) gives the same result as minimizing entropy, because at any given node the two are the same optimization. Information gain is defined as

IG(S, A) = H(S) − Σ_v (|S_v| / |S|) · H(S_v),

where H(S) is the entropy of the parent set S and the sum is the weighted average entropy of the child subsets produced by splitting on attribute A. Since H(S) is a constant at a given node (it does not depend on which attribute you split on), picking the attribute that maximizes IG(S, A) is exactly the same as picking the one that minimizes the weighted child entropy. So for choosing the next split you could indeed work with entropy alone; information gain is kept as a concept because it is an interpretable, non-negative score ("how much uncertainty this split removes") and it is the quantity that variants such as C4.5's gain ratio normalize. The two criteria could only diverge if H(S) were not held fixed, e.g., if you compared candidate splits evaluated on different subsets of the data.
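To make the equivalence concrete, here is a minimal sketch in plain Python (standard library only; the toy "play tennis" rows and the helper names `entropy` and `information_gain` are illustrative, not from any particular library):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(S) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, label="label"):
    """Return (IG(S, A), weighted child entropy) for splitting on `attr`."""
    n = len(rows)
    parent = entropy([r[label] for r in rows])
    # Partition the rows by the value of the attribute.
    groups = {}
    for r in rows:
        groups.setdefault(r[attr], []).append(r[label])
    # Weighted average entropy of the child subsets.
    weighted_child = sum(len(g) / n * entropy(g) for g in groups.values())
    return parent - weighted_child, weighted_child

# Toy data, illustrative only.
data = [
    {"outlook": "sunny",    "windy": "no",  "label": "no"},
    {"outlook": "sunny",    "windy": "yes", "label": "no"},
    {"outlook": "rainy",    "windy": "no",  "label": "yes"},
    {"outlook": "rainy",    "windy": "yes", "label": "no"},
    {"outlook": "overcast", "windy": "no",  "label": "yes"},
    {"outlook": "overcast", "windy": "yes", "label": "yes"},
]

for attr in ("outlook", "windy"):
    ig, child_h = information_gain(data, attr)
    print(f"{attr}: IG = {ig:.3f}, weighted child entropy = {child_h:.3f}")
# Because H(S) is the same for every candidate attribute at this node,
# the attribute with the highest IG is exactly the one with the lowest
# weighted child entropy.
```

Running this prints a higher IG and a lower weighted child entropy for `outlook` (IG ≈ 0.667) than for `windy` (IG ≈ 0.082); since H(S) = 1 bit for both candidates, the two rankings are necessarily identical.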