# Ensemble Learning

## Boosting

Reference: *Foundations of Machine Learning*, Chapter 6.

Idea: Build a strong learner from a set of weak learners.

### Adaboost

Learn a linear combination of our weak learners.

Given a labeled sample ((x_1, y_1), ..., (x_m, y_m)) of size m:

```
for i = 1 to m:
    D_1(i) <- 1/m                      # start with uniform weights
for t = 1 to T:
    h_t <- weak classifier with small weighted error
           eps_t = sum_{i : h_t(x_i) != y_i} D_t(i)
    alpha_t <- (1/2) log((1 - eps_t) / eps_t)
    Z_t <- 2 [eps_t (1 - eps_t)]^(1/2)   # normalization factor
    for i = 1 to m:
        D_{t+1}(i) <- D_t(i) exp(-alpha_t * y_i * h_t(x_i)) / Z_t
return g = sum_{t=1}^{T} alpha_t h_t
```
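The loop above can be sketched in Python. This is a minimal illustration, not a reference implementation: the 1-D dataset, the pool of decision-stump weak learners, and T = 5 are all hypothetical choices made for the example.

```python
import math

# Hypothetical toy dataset: 1-D points with labels in {-1, +1}.
X = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
y = [+1, +1, -1, +1, -1, -1]

def stump(theta, sign):
    """Weak learner: predicts `sign` if x < theta, else -sign."""
    return lambda x: sign if x < theta else -sign

# Candidate weak learners: a few thresholds, both polarities (illustrative).
candidates = [stump(t, s)
              for t in (0.15, 0.25, 0.35, 0.45, 0.55)
              for s in (+1, -1)]

def adaboost(X, y, T=5):
    m = len(X)
    D = [1.0 / m] * m                    # D_1(i) = 1/m
    ensemble = []                        # list of (alpha_t, h_t)
    for _ in range(T):
        # Pick the weak learner with the smallest weighted error eps_t.
        errs = [sum(D[i] for i in range(m) if h(X[i]) != y[i])
                for h in candidates]
        eps, h = min(zip(errs, candidates), key=lambda p: p[0])
        if eps == 0 or eps >= 0.5:
            break                        # perfect, or no better than chance
        alpha = 0.5 * math.log((1 - eps) / eps)
        Z = 2 * math.sqrt(eps * (1 - eps))   # normalization factor Z_t
        D = [D[i] * math.exp(-alpha * y[i] * h(X[i])) / Z
             for i in range(m)]
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, x):
    # g(x) = sign of the learned linear combination of weak learners.
    return 1 if sum(alpha * h(x) for alpha, h in ensemble) >= 0 else -1

model = adaboost(X, y)
preds = [predict(model, x) for x in X]
```

No single stump classifies this sample correctly, but the boosted combination of five stumps does, which is the point of the algorithm.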

## Bagging

Reference: *Bagging Predictors* (Breiman, 1996).

Bagging is short for bootstrap aggregation.

Idea: Given a sample S, bootstrap from the sample (i.e., sample from S with replacement) to get m samples S_1, ..., S_m.

Then build m classifiers, one from each bootstrap sample.

Your new classifier is an equal-weight linear combination of those classifiers: a majority vote for classification, or an average for regression.
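The three steps above can be sketched as follows. This is a minimal illustration under assumed choices: a hypothetical 1-D threshold dataset, decision stumps as the base classifiers, and m = 25 bootstrap rounds.

```python
import random

random.seed(1)

# Hypothetical toy sample S: 1-D points labeled by their side of 0.5.
S = [(k / 20, 1 if k / 20 >= 0.5 else -1) for k in range(20)]

def train_stump(sample):
    """Base classifier: the threshold/polarity minimizing training error."""
    best = None
    for theta in [x for x, _ in sample]:
        for s in (1, -1):
            err = sum(1 for x, y in sample
                      if (s if x >= theta else -s) != y)
            if best is None or err < best[0]:
                best = (err, theta, s)
    _, theta, s = best
    return lambda x: s if x >= theta else -s

def bagging(S, m=25):
    """Step 1-2: draw m bootstrap samples, train one classifier on each."""
    models = []
    for _ in range(m):
        boot = [random.choice(S) for _ in range(len(S))]  # with replacement
        models.append(train_stump(boot))
    return models

def predict(models, x):
    # Step 3: equal-weight combination of the classifiers (majority vote).
    return 1 if sum(h(x) for h in models) >= 0 else -1

models = bagging(S)
```

Each bootstrap sample yields a slightly different threshold; voting averages out that variance, which is why bagging helps most with unstable, high-variance base learners.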