- Authors
- 福島 信純, 永田 裕一, 小林 重信, 小野 功
- Publisher
- 進化計算学会
- Journal
- 進化計算学会論文誌 (ISSN:21857385)
- Volume, issue, pages, and publication date
- vol.4, no.2, pp.57-73, 2013 (Released: 2013-09-11)
- Number of references
- 19
Natural evolution strategies (NESs) are a family of iterative methods for black-box function optimization. Instead of directly minimizing an objective function, NESs minimize the expected objective function value under a parametric probability distribution. In each iteration, NESs update the parameters of the distribution by using an estimated natural gradient of this expectation. The exponential NES (xNES) is an effective NES variant that uses a multivariate normal distribution as the probability distribution. Because the shape of a normal distribution can take the form of a rotated ellipse in the solution space, xNES shows relatively good performance on ill-conditioned and non-separable objective functions. However, we believe that xNES has two problems that cause performance degradation. The first problem is that the spread of the normal distribution tends to shrink excessively even when the distribution does not cover a (local) optimum, which causes premature convergence. The second problem is that the learning rates for the distribution parameters are not appropriate: they depend only on the dimension of the objective function, although they should be designed based on all the factors that influence the precision of the natural gradient estimate. Moreover, they are set to small values to prevent premature convergence, which results in excessively slow convergence even when the distribution covers the optimum. To remedy these problems of xNES, we propose a new NES method named the distance-weighted exponential natural evolution strategy (DX-NES). On several benchmark functions, we confirmed that DX-NES outperforms xNES and that DX-NES shows better performance than CMA-ES on almost all of the functions.
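As background for the update procedure summarized above, the following is a minimal sketch of one xNES-style iteration as it is commonly formulated: sample from a normal distribution N(mu, sigma^2 B B^T), weight samples by rank-based utilities, and apply a multiplicative natural-gradient update to the mean, step size, and shape matrix. This is an illustrative approximation, not the authors' DX-NES; the population size, utility weights, learning rates, and the sphere objective used in the usage example are standard placeholder assumptions rather than values taken from the paper.

```python
# Illustrative xNES-style update sketch (not the proposed DX-NES).
import numpy as np
from scipy.linalg import expm

def xnes_sketch(f, mu, sigma=1.0, iterations=200, seed=0):
    rng = np.random.default_rng(seed)
    d = mu.shape[0]
    lam = 4 + int(3 * np.log(d))               # population size (common default)
    B = np.eye(d)                               # normalized shape matrix

    # Rank-based utility weights: best samples get the largest weight.
    ranks = np.arange(1, lam + 1)
    u = np.maximum(0.0, np.log(lam / 2 + 1) - np.log(ranks))
    u = u / u.sum() - 1.0 / lam

    # Learning rates assumed here depend only on the dimension d,
    # which is exactly the design the abstract criticizes.
    eta_mu = 1.0
    eta_sigma = eta_B = 0.6 * (3 + np.log(d)) / (d * np.sqrt(d))

    for _ in range(iterations):
        Z = rng.standard_normal((lam, d))       # samples in natural coordinates
        X = mu + sigma * Z @ B.T                # candidate solutions
        order = np.argsort([f(x) for x in X])   # best (smallest value) first
        Zs = Z[order]

        # Estimated natural gradients of the expected objective value.
        G_delta = u @ Zs
        G_M = sum(ui * (np.outer(zi, zi) - np.eye(d)) for ui, zi in zip(u, Zs))
        G_sigma = np.trace(G_M) / d
        G_B = G_M - G_sigma * np.eye(d)

        # Multiplicative (exponential) parameter updates.
        mu = mu + eta_mu * sigma * B @ G_delta
        sigma = sigma * np.exp(eta_sigma / 2 * G_sigma)
        B = B @ expm(eta_B / 2 * G_B)
    return mu

if __name__ == "__main__":
    # Usage example on a simple sphere function (placeholder objective).
    sphere = lambda x: float(np.sum(x ** 2))
    print(xnes_sketch(sphere, mu=np.full(5, 3.0)))
```

The multiplicative updates of sigma and B via the matrix exponential are what give xNES its rotated-ellipse search distribution; the shrinking of sigma under this scheme is the behavior the first problem in the abstract refers to.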