- Authors
Tomoya Iwakura
Seishi Okamoto
Kazuo Asakawa
- Publisher
The Institute of Electrical Engineers of Japan
- Journal
IEEJ Transactions on Electronics, Information and Systems (ISSN: 0385-4221)
- Volume/Issue/Pages, Publication date
vol.130, no.1, pp.83-91, 2010-01-01 (Released: 2010-01-01)
- Number of references
26
- Number of citations
2
AdaBoost is a method that creates a final hypothesis by repeatedly generating a weak hypothesis at each training iteration with a given weak learner. AdaBoost-based algorithms have been successfully applied to several tasks, such as Natural Language Processing (NLP) and OCR. However, learning from training data consisting of a large number of samples and features requires a long training time. We propose a fast AdaBoost-based algorithm for learning rules represented by combinations of features. Our algorithm constructs a final hypothesis by learning several weak hypotheses at each iteration. We assign a confidence-rated value to each weak hypothesis while ensuring a reduction in the theoretical upper bound of the training error of AdaBoost. We evaluate our method on English POS tagging and text chunking. The experimental results show that the training speed of our algorithm is, on average, about 25 times faster than an AdaBoost-based learner and about 50 times faster than Support Vector Machines with a polynomial kernel, while maintaining state-of-the-art accuracy.
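For context, the confidence-rated formulation of boosting that the abstract builds on is due to Schapire and Singer (1999): each weak hypothesis outputs a real-valued confidence, and the product of the per-round normalization factors Z_t upper-bounds the training error. The sketch below is a minimal illustration of that baseline with single binary-feature stumps, one weak hypothesis per round; it is not the authors' algorithm, which learns several weak hypotheses per iteration over feature combinations. All names (`train_confidence_rated_adaboost`, the data layout as sets of active feature indices) are hypothetical.

```python
import math

def train_confidence_rated_adaboost(X, y, n_rounds, eps=1e-8):
    """Confidence-rated AdaBoost (Schapire & Singer, 1999) with
    single binary-feature stumps as weak hypotheses.

    X: list of sets of active (binary) feature indices, one set per sample
    y: list of labels in {-1, +1}
    Returns a list of (feature, confidence) pairs.
    """
    m = len(X)
    w = [1.0 / m] * m              # normalized sample weights
    hypotheses = []                # learned (feature, confidence) rules
    features = set().union(*X)

    for _ in range(n_rounds):
        # Choose the feature whose stump minimizes the normalization
        # factor Z = sum_i w_i * exp(-y_i * h(x_i)). For a stump that
        # outputs c when feature f fires and 0 otherwise, the minimum
        # over c is Z = W_0 + 2 * sqrt(W_+ * W_-).
        best = None
        for f in features:
            w_pos = sum(wi for wi, xi, yi in zip(w, X, y)
                        if f in xi and yi == +1)
            w_neg = sum(wi for wi, xi, yi in zip(w, X, y)
                        if f in xi and yi == -1)
            w_zero = 1.0 - w_pos - w_neg   # weight where f does not fire
            z = w_zero + 2.0 * math.sqrt(w_pos * w_neg)
            if best is None or z < best[0]:
                best = (z, f, w_pos, w_neg)
        _, f, w_pos, w_neg = best

        # Smoothed confidence-rated value; choosing c this way keeps the
        # theoretical upper bound (the product of the Z_t) decreasing.
        c = 0.5 * math.log((w_pos + eps) / (w_neg + eps))
        hypotheses.append((f, c))

        # Reweight samples and renormalize (the normalizer is Z_t).
        w = [wi * math.exp(-yi * (c if f in xi else 0.0))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return hypotheses

def predict(hypotheses, x):
    """Final hypothesis: sign of the summed confidences of firing rules."""
    score = sum(c for f, c in hypotheses if f in x)
    return +1 if score >= 0 else -1
```

The proposed algorithm's speedup comes from departing from this one-rule-per-round loop: by accepting several weak hypotheses per iteration (each with a confidence value chosen so the Z_t-based bound still shrinks), fewer passes over the large sample/feature sets are needed.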