Ensemble Learning



==Bagging==
[https://link.springer.com/article/10.1023/A:1018054314350 Bagging Predictors]<br>
Bagging is short for bootstrap aggregation.<br>
Idea: Given a training sample S, draw m bootstrap samples S_1, ..., S_m by sampling from S with replacement.<br>
Then build m classifiers, one on each bootstrap sample.<br>
The new classifier combines the m individual classifiers, e.g. by averaging their outputs (a linear combination) or by majority vote; see the sketch below.<br>
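A minimal from-scratch sketch of the procedure, assuming scikit-learn decision trees as base learners and non-negative integer class labels (the function names <code>bagging_fit</code> and <code>bagging_predict</code> are illustrative, not taken from the paper):

<syntaxhighlight lang="python">
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, m=25, random_state=0):
    """Train m decision trees, each on a bootstrap sample of (X, y)."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    models = []
    for _ in range(m):
        idx = rng.integers(0, n, size=n)  # draw n row indices with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Combine the individual classifiers by majority vote."""
    votes = np.stack([model.predict(X) for model in models])  # shape (m, n_samples)
    # For each test point, return the label predicted most often
    # (assumes labels are non-negative integers).
    return np.array([np.bincount(col).argmax() for col in votes.T])
</syntaxhighlight>

Usage would look like <code>models = bagging_fit(X_train, y_train)</code> followed by <code>bagging_predict(models, X_test)</code>. scikit-learn's <code>BaggingClassifier</code> packages the same idea, including support for averaging predicted probabilities instead of hard voting.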


==References==
* [https://link.springer.com/article/10.1023/A:1007607513941 An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization]