Authors
近藤 久, 浅沼 由馬
Publisher
The Japanese Society for Artificial Intelligence
Journal
Transactions of the Japanese Society for Artificial Intelligence (ISSN:13460714)
Volume, issue, pages, and publication date
vol.34, no.2, pp.G-I36_1-11, 2019-03-01 (Released:2019-03-01)
Number of references
28
Number of citations
1

Hyperparameter optimization for learning algorithms and feature selection from given data are key issues in machine learning, as both greatly affect classification accuracy. Random forests and support vector machines are among the most popular learning algorithms. Random forests have relatively few hyperparameters and can achieve more accurate classification by optimizing these parameters without requiring feature selection. Like random forests, support vector machines also have few hyperparameters; however, whether feature selection is performed at the same time as the optimization of these parameters greatly affects classification accuracy. Grid search is usually used to optimize hyperparameters, but because the search is restricted to predetermined grid points, fine-grained optimization cannot be achieved. In this paper, we therefore introduce an artificial bee colony (ABC) algorithm to optimize hyperparameters and to perform more accurate feature selection. The ABC algorithm is a swarm intelligence algorithm for solving optimization problems, inspired by the foraging behaviour of honey bees. Experimental results on the KDD Cup 1999 Data, a benchmark for network intrusion detection classification, demonstrate the effectiveness of our method. The proposed method achieves higher classification accuracy than existing methods on the same data that use swarm intelligence for hyperparameter optimization and feature selection. Our method also outperforms random forests and SVMs trained with the default parameter values provided by scikit-learn, an open-source machine learning library for Python.
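For illustration, the following is a minimal, simplified sketch (not the authors' implementation) of ABC-style joint hyperparameter optimization and feature selection for an RBF SVM using scikit-learn. The dataset (load_breast_cancer as a stand-in for the KDD Cup 1999 data), the parameter ranges, the colony size, the iteration budget, and the collapsing of the employed and onlooker bee phases into a single greedy pass are all assumptions made for brevity.

    # Simplified ABC-style search over (C, gamma, feature mask) for an RBF SVM.
    # Illustrative sketch only; dataset, ranges, and colony settings are assumptions.
    import numpy as np
    from sklearn.datasets import load_breast_cancer   # stand-in for KDD Cup 1999 data
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    n_features = X.shape[1]

    def random_solution():
        # log-uniform C and gamma plus a random feature mask with at least one feature
        sol = {"logC": rng.uniform(-2, 3), "logG": rng.uniform(-4, 1),
               "mask": rng.random(n_features) < 0.5}
        if not sol["mask"].any():
            sol["mask"][rng.integers(n_features)] = True
        return sol

    def fitness(sol):
        # 3-fold cross-validated accuracy of an RBF SVM on the selected features
        clf = SVC(C=10.0 ** sol["logC"], gamma=10.0 ** sol["logG"])
        return cross_val_score(clf, X[:, sol["mask"]], y, cv=3).mean()

    def neighbour(sol, other):
        # perturb one dimension toward/away from another food source (ABC update rule)
        new = {"logC": sol["logC"], "logG": sol["logG"], "mask": sol["mask"].copy()}
        dim = rng.choice(["logC", "logG", "mask"])
        if dim == "mask":
            j = rng.integers(n_features)
            new["mask"][j] = ~new["mask"][j]
            if not new["mask"].any():
                new["mask"][j] = True
        else:
            phi = rng.uniform(-1, 1)
            new[dim] = sol[dim] + phi * (sol[dim] - other[dim])
        return new

    colony, limit, iters = 10, 5, 20
    sources = [random_solution() for _ in range(colony)]
    fits = [fitness(s) for s in sources]
    trials = [0] * colony

    for _ in range(iters):
        # employed and onlooker phases collapsed into one greedy pass for brevity
        for i in range(colony):
            k = rng.integers(colony)
            cand = neighbour(sources[i], sources[k])
            f = fitness(cand)
            if f > fits[i]:
                sources[i], fits[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
        # scout phase: abandon food sources that have stagnated too long
        for i in range(colony):
            if trials[i] > limit:
                sources[i] = random_solution()
                fits[i] = fitness(sources[i])
                trials[i] = 0

    best = int(np.argmax(fits))
    print("best CV accuracy:", round(fits[best], 4),
          "C =", 10.0 ** sources[best]["logC"],
          "gamma =", 10.0 ** sources[best]["logG"],
          "selected features =", int(sources[best]["mask"].sum()))

Unlike grid search, this kind of search is not confined to predetermined grid points: the candidate C and gamma values move continuously through the search space, and the feature mask is refined at the same time, which reflects the joint optimization described in the abstract.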