In what scenario is maximizing information gain not equivalent to minimizing entropy? The broader question is: why do we need the concept of information gain at all? Is it not sufficient to work only with entropy when deciding the next optimal attribute of a decision tree?
Information gain vs minimizing entropy
2.5k Views Asked by Pradeep Vairamani At
When you are choosing the split at a single node, maximizing information gain (also known as mutual information) is exactly equivalent to minimizing entropy. The information gain of attribute A on set S is

IG(S, A) = H(S) − Σ_v (|S_v| / |S|) · H(S_v)

where the sum runs over the values v of A and S_v is the subset of S with A = v. Since H(S) is fixed for the node being split, maximizing IG is the same as minimizing the weighted average entropy of the child nodes: forcing one to its optimum forces the other.

The concept is still useful on its own. IG expresses split quality as a *reduction* relative to the parent, which makes scores interpretable and comparable, and it is the quantity that algorithms like ID3 report. It is also the basis for variants such as C4.5's gain ratio, which divides IG by the split's intrinsic information to correct IG's bias toward many-valued attributes; once you use gain ratio, the ranking of attributes can differ from the ranking by raw child entropy, so the two criteria are no longer interchangeable.
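Here is a minimal sketch in Python that checks the equivalence numerically. The toy dataset and the attribute names (`outlook`, `windy`) are made up for illustration; the point is that the attribute with the highest information gain is also the one with the lowest weighted child entropy.

```python
# Verify: argmax of information gain == argmin of weighted child entropy
# at a single node. Toy data and attribute names are illustrative only.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(S) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def weighted_child_entropy(rows, labels, attr):
    """Sum over attribute values v of |S_v|/|S| * H(S_v)."""
    n = len(rows)
    total = 0.0
    for v in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == v]
        total += len(subset) / n * entropy(subset)
    return total

def information_gain(rows, labels, attr):
    """IG(S, A) = H(S) - weighted child entropy after splitting on A."""
    return entropy(labels) - weighted_child_entropy(rows, labels, attr)

# Toy data: 'outlook' separates the classes perfectly, 'windy' not at all.
rows = [
    {"outlook": "sunny", "windy": True},
    {"outlook": "sunny", "windy": False},
    {"outlook": "rain",  "windy": True},
    {"outlook": "rain",  "windy": False},
]
labels = ["no", "no", "yes", "yes"]

gains = {a: information_gain(rows, labels, a) for a in ("outlook", "windy")}
ents = {a: weighted_child_entropy(rows, labels, a) for a in ("outlook", "windy")}

best_by_gain = max(gains, key=gains.get)
best_by_entropy = min(ents, key=ents.get)
print(best_by_gain, best_by_entropy)  # prints: outlook outlook
```

Since H(S) = 1 bit here, `outlook` gets IG = 1.0 (weighted child entropy 0.0) and `windy` gets IG = 0.0 (weighted child entropy 1.0), so both criteria pick the same attribute.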