LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The original paper[1] casts the AdaBoost algorithm into a statistical framework. Specifically, if one considers AdaBoost as a generalized additive model and then applies the cost functional of logistic regression, one can derive the LogitBoost algorithm.
Minimizing the LogitBoost cost functional
LogitBoost can be seen as a convex optimization problem. Specifically, given that we seek an additive model of the form
$$ f = \sum_t \alpha_t h_t $$
the LogitBoost algorithm minimizes the logistic loss:
$$ \sum_i \log\left( 1 + e^{-y_i f(x_i)}\right) $$
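For concreteness, the sketch below shows one way this minimization can be carried out: each round takes an approximate Newton step on the logistic loss by fitting a weighted regression stump to "working responses", then adds the stump to the additive model. This is a minimal illustration under stated assumptions, not the paper's exact procedure; the use of scikit-learn's DecisionTreeRegressor as the weak learner, the function names, the number of rounds, and the clipping constant are all illustrative choices.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # regression stump as weak learner h_t (illustrative choice)

def logitboost_fit(X, y, n_rounds=50):
    """Fit an additive model f(x) by approximate Newton steps on the
    logistic loss  sum_i log(1 + exp(-y_i f(x_i))),  with labels y_i in {-1, +1}."""
    n = len(y)
    F = np.zeros(n)                  # current value of f at each training point
    ystar = (y + 1) / 2.0            # map {-1, +1} labels to {0, 1}
    learners = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-F))              # model probability P(y = +1 | x)
        w = np.clip(p * (1.0 - p), 1e-10, None)   # Newton weights (second derivative of the loss)
        z = np.clip((ystar - p) / w, -4.0, 4.0)   # working response, clipped for numerical stability
        h = DecisionTreeRegressor(max_depth=1)    # one stump per round
        h.fit(X, z, sample_weight=w)              # weighted least-squares fit approximates the Newton step
        F += h.predict(X)                         # update the additive model
        learners.append(h)
    return learners

def logitboost_predict(learners, X):
    """Classify by the sign of the fitted additive model."""
    F = sum(h.predict(X) for h in learners)
    return np.where(F >= 0, 1, -1)
```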
References

1. Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2000). "Additive logistic regression: a statistical view of boosting". Annals of Statistics. 28 (2): 337–407.