Cost-sensitive boosting algorithms as gradient descent

Bibliographic Details
Published in: 2008 IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 2009 - 2012
Main Authors: Qu-Tang Cai, Yang-Qiu Song, Chang-Shui Zhang
Format: Conference Proceeding
Language: English
Published: IEEE 01.03.2008
ISBN: 9781424414833, 1424414830
ISSN: 1520-6149
Description
Summary: AdaBoost is a well-known boosting method for generating a strong ensemble from weak base learners. The AdaBoost procedure can be cast in a gradient descent optimization framework, which is important for analyzing the method and devising new procedures. Cost-sensitive boosting (CSB) is an emerging subject that extends boosting methods to cost-sensitive classification applications. Most CSB methods are obtained by directly modifying the original AdaBoost procedure. Unfortunately, the effectiveness of most cost-sensitive boosting methods has been verified only experimentally, and it remains unclear whether these methods can be viewed as gradient descent procedures like AdaBoost. In this paper, we show that several typical CSB methods can also be viewed as gradient descent for minimizing a unified objective function. We then deduce a general greedy boosting procedure. Experimental results also validate the effectiveness of the proposed procedure.
(An illustrative sketch of a cost-weighted greedy boosting procedure follows the record below.)
DOI: 10.1109/ICASSP.2008.4518033
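As a rough illustration of the gradient descent view described in the summary, the Python sketch below performs greedy stagewise boosting on a cost-weighted exponential loss, L(F) = sum_i c_i * exp(-y_i F(x_i)). This is only a minimal sketch under common CSB-style assumptions: labels in {-1, +1}, nonnegative per-example costs, and a hypothetical fit_weak_learner(X, y, w) helper returning a classifier with a predict method. The objective form and all names here are illustrative; they are not taken from the paper's unified objective.

import numpy as np

def cost_sensitive_boost(X, y, costs, fit_weak_learner, n_rounds=50):
    """Greedy stagewise boosting on a cost-weighted exponential loss (illustrative sketch).

    Assumes y is a NumPy array with values in {-1, +1}, costs is a nonnegative
    array of per-example costs, and fit_weak_learner(X, y, w) is a hypothetical
    helper returning a classifier h with h.predict(X) in {-1, +1}.
    """
    n = len(y)
    F = np.zeros(n)                      # current ensemble scores on the training set
    learners, alphas = [], []
    for _ in range(n_rounds):
        w = costs * np.exp(-y * F)       # weights proportional to the loss gradient magnitude
        w /= w.sum()
        h = fit_weak_learner(X, y, w)    # weak learner trained on the reweighted sample
        pred = h.predict(X)
        err = np.clip(np.sum(w[pred != y]), 1e-12, 1 - 1e-12)  # weighted error
        alpha = 0.5 * np.log((1.0 - err) / err)  # exact line search for exponential loss
        F += alpha * pred                # greedy additive update of the ensemble
        learners.append(h)
        alphas.append(alpha)
    return learners, alphas

def ensemble_predict(X, learners, alphas):
    """Sign of the weighted vote of the boosted weak learners."""
    scores = sum(a * h.predict(X) for a, h in zip(alphas, learners))
    return np.sign(scores)

With all costs equal, the reweighting and line search reduce to the standard AdaBoost procedure; unequal per-example costs simply bias the reweighting toward expensive errors.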