Ensemble Learning
==Boosting==
Reference: ''Foundations of Machine Learning'', Chapter 6.
Idea: Build a strong learner from a set of weak learners.
===AdaBoost===
AdaBoost learns a linear combination of weak learners.
Given a sample of size <math>m</math> with labels <math>y_i \in \{-1, +1\}</math>:
* Initialize <math>D_1(i) = \frac{1}{m}</math> for <math>i = 1, \dots, m</math>.
* For <math>t = 1, \dots, T</math>:
** Train a weak classifier <math>h_t</math> and let <math>\epsilon_t = \Pr_{i \sim D_t}[h_t(x_i) \neq y_i]</math> be its weighted error.
** <math>\alpha_t \leftarrow \frac{1}{2} \log \frac{1 - \epsilon_t}{\epsilon_t}</math>
** <math>Z_t \leftarrow 2[\epsilon_t (1 - \epsilon_t)]^{1/2}</math> (the normalization factor)
** For <math>i = 1, \dots, m</math>: <math>D_{t+1}(i) \leftarrow \frac{D_t(i) \exp(-\alpha_t y_i h_t(x_i))}{Z_t}</math>
* Output <math>g = \sum_{t=1}^{T} \alpha_t h_t</math>; the final classifier is <math>\operatorname{sign}(g)</math>.
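Below is a minimal NumPy sketch of this procedure, using depth-one decision stumps as the weak learners. The stump search, the toy dataset, and the early-stopping check for <math>\epsilon_t = 0</math> are illustrative assumptions, not part of the text above.

<syntaxhighlight lang="python">
import numpy as np

def train_stump(X, y, D):
    """Find the stump h(x) = s * sign(x[:, j] - theta) minimizing the
    weighted error sum(D * (h(X) != y)). Exhaustive O(m^2 n) search;
    fine for a small demo."""
    m, n = X.shape
    best = (1.0, None)  # (weighted error eps_t, (j, theta, s))
    for j in range(n):
        for theta in np.unique(X[:, j]):
            for s in (-1, 1):
                pred = s * np.where(X[:, j] >= theta, 1, -1)
                err = D @ (pred != y)
                if err < best[0]:
                    best = (err, (j, theta, s))
    return best

def stump_predict(X, params):
    j, theta, s = params
    return s * np.where(X[:, j] >= theta, 1, -1)

def adaboost(X, y, T=10):
    m = len(y)
    D = np.full(m, 1.0 / m)       # D_1(i) = 1/m
    ensemble = []                 # list of (alpha_t, stump params)
    for t in range(T):
        eps, params = train_stump(X, y, D)
        if eps == 0:              # perfect weak learner: stop early
            ensemble.append((1.0, params))
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        Z = 2 * np.sqrt(eps * (1 - eps))          # normalizer Z_t
        pred = stump_predict(X, params)
        D = D * np.exp(-alpha * y * pred) / Z     # D_{t+1}(i)
        ensemble.append((alpha, params))
    return ensemble

def predict(X, ensemble):
    # g = sum_t alpha_t * h_t; classify with sign(g)
    g = sum(alpha * stump_predict(X, params) for alpha, params in ensemble)
    return np.sign(g)

# Toy check on hypothetical data: label is +1 iff the first feature > 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] > 0, 1, -1)
ens = adaboost(X, y, T=5)
print("training accuracy:", np.mean(predict(X, ens) == y))
</syntaxhighlight>

Note that dividing by <math>Z_t = 2[\epsilon_t(1 - \epsilon_t)]^{1/2}</math> keeps <math>D_{t+1}</math> a probability distribution, since the unnormalized weights sum to exactly <math>Z_t</math>.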