Ensemble Learning

==Boosting==
Reference [https://cs.nyu.edu/~mohri/mlbook/ Foundations of Machine Learning Chapter 6]<br>
Idea: Build a strong learner from a set of weak learners.
===AdaBoost===
Learn a linear combination of our weak learners.
<pre>
Given a sample ((x_1, y_1), ..., (x_m, y_m)) of size m with labels y_i in {-1, +1}
for i=1:m
  D_1(i) <- 1/m
for t=1:T
  h_t <- weak classifier trained on the sample weighted by D_t
  eps_t <- weighted error of h_t, i.e. sum of D_t(i) over i with h_t(x_i) != y_i
  alpha_t <- (1/2) log((1-eps_t)/eps_t)
  Z_t <- 2[eps_t(1-eps_t)]^(1/2)    (normalization factor)
  for i=1:m
    D_{t+1}(i) <- D_t(i) exp(-alpha_t*y_i*h_t(x_i)) / Z_t
g <- sum_t alpha_t h_t
return sign(g)
</pre>
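
A minimal Python sketch of the loop above, assuming numpy and scikit-learn with depth-1 decision trees (stumps) as the weak learners; the names <code>X</code>, <code>y</code>, <code>T</code>, <code>adaboost</code>, and <code>predict</code> are illustrative, not from the reference.
<pre>
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, T=50):
    """X: (m, d) features, y: (m,) labels in {-1, +1}. Returns (alphas, stumps)."""
    m = X.shape[0]
    D = np.full(m, 1.0 / m)                # D_1(i) = 1/m
    alphas, stumps = [], []
    for t in range(T):
        # Train the weak classifier h_t on the sample weighted by D_t
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=D)
        pred = h.predict(X)
        eps = D[pred != y].sum()           # weighted error eps_t
        if eps == 0 or eps >= 0.5:         # stop if perfect or no better than chance
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        # Reweight: increase weight on misclassified points, then normalize (divide by Z_t)
        D = D * np.exp(-alpha * y * pred)
        D /= D.sum()
        alphas.append(alpha)
        stumps.append(h)
    return alphas, stumps

def predict(alphas, stumps, X):
    """g(x) = sum_t alpha_t h_t(x); classify with its sign."""
    g = sum(a * h.predict(X) for a, h in zip(alphas, stumps))
    return np.sign(g)
</pre>
Usage would be along the lines of <code>alphas, stumps = adaboost(X_train, y_train, T=100)</code> followed by <code>predict(alphas, stumps, X_test)</code>, with labels encoded as ±1.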


==Bagging==
