Cost-sensitive boosting algorithms as gradient descent
| Published in: | 2008 IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 2009 - 2012 |
|---|---|
| Main authors: | , , |
| Format: | Conference paper |
| Language: | English |
| Published: | IEEE, 01.03.2008 |
| Subjects: | |
| ISBN: | 9781424414833, 1424414830 |
| ISSN: | 1520-6149 |
| Online access: | Get full text |
| Summary: | AdaBoost is a well-known boosting method for generating a strong ensemble from weak base learners. The AdaBoost procedure can be fitted into a gradient descent optimization framework, which is important for analyzing and devising its procedure. Cost-sensitive boosting (CSB) is an emerging subject that extends boosting methods to cost-sensitive classification applications. Most CSB methods are obtained by directly modifying the original AdaBoost procedure. Unfortunately, the effectiveness of most cost-sensitive boosting methods has been verified only by experiments, and it remains unclear whether these methods can be viewed as gradient descent procedures like AdaBoost. In this paper, we show that several typical CSB methods can also be viewed as gradient descent for minimizing a unified objective function. We then deduce a general greedy boosting procedure. Experimental results also validate the effectiveness of the proposed procedure. |
|---|---|
| DOI: | 10.1109/ICASSP.2008.4518033 |
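
The paper's unified objective and the greedy procedure deduced from it are not reproduced in this record. As a rough illustration of the idea the abstract describes, the following is a minimal sketch of a cost-sensitive, AdaBoost-style boosting loop in which per-example misclassification costs scale the exponential weight update (a common CSB reweighting scheme); the function name, the cost scheme, and the use of scikit-learn decision stumps are assumptions for illustration, not the paper's exact algorithm.

```python
# Hypothetical cost-sensitive AdaBoost-style loop: costly examples get
# extra weight when misclassified, so later rounds focus on them.
# This mirrors common CSB reweighting variants, not necessarily the
# procedure derived in the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_boost(X, y, costs, n_rounds=50):
    """y in {-1, +1}; costs[i] > 0 is the cost of misclassifying example i."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost-style step size
        # Cost-sensitive reweighting: misclassified examples are additionally
        # scaled by their cost before renormalization.
        w *= np.exp(-alpha * y * pred) * np.where(pred != y, costs, 1.0)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)

    def predict(Xq):
        scores = sum(a * l.predict(Xq) for a, l in zip(alphas, learners))
        return np.sign(scores)

    return predict

# Toy usage: errors on the positive class are treated as 5x more costly.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    costs = np.where(y == 1, 5.0, 1.0)
    clf = cost_sensitive_boost(X, y, costs)
    print("train accuracy:", np.mean(clf(X) == y))
```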

