- Author
- 上田 修功
- Publisher
- 一般社団法人 人工知能学会
- Journal
- 人工知能学会論文誌 (ISSN:13460714)
- Volume, issue, pages, and publication date
- vol.16, no.2, pp.299-308, 2001 (Released:2002-02-28)
- Number of references
- 18
- Cited by
- 1
When learning a nonlinear model, we face two practical difficulties: (1) local optima, and (2) determining the appropriate model complexity. Regarding (1), I recently proposed the split-and-merge expectation-maximization (SMEM) algorithm, which works within the maximum likelihood framework by simultaneously splitting and merging model components, but the model complexity was fixed there. To overcome both problems, I first formally derive an objective function that can optimize a model over parameter and structure distributions simultaneously, based on the variational Bayesian approach. I then devise a Bayesian SMEM algorithm to efficiently optimize this objective function. With the proposed algorithm, we can find the optimal model structure while avoiding entrapment in poor local maxima. I apply the proposed method to the learning of a mixture of experts model and demonstrate its usefulness.
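The split-and-merge idea behind SMEM can be illustrated on a far simpler model than the mixture of experts treated in the paper. The following is a minimal, hypothetical sketch (not the paper's algorithm, and not its variational Bayesian objective): plain maximum-likelihood EM for a 1-D Gaussian mixture, plus a single merge move that fuses the pair of components whose responsibility vectors are most correlated. All function and variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_gmm(x, mu, var, pi, iters=50):
    """Plain EM for a 1-D Gaussian mixture with a fixed number of components."""
    for _ in range(iters):
        # E-step: responsibilities r[n, k] via a row-wise softmax of log densities
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted mean, variance, and mixing proportions
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        var = np.maximum(var, 1e-6)  # guard against degenerate components
        pi = nk / len(x)
    return mu, var, pi, r

# Toy data: two well-separated clusters, deliberately over-fitted with 3 components
x = np.concatenate([rng.normal(-4, 1, 300), rng.normal(4, 1, 300)])
mu, var, pi, r = em_gmm(x, np.array([-4.0, 3.0, 5.0]), np.ones(3), np.ones(3) / 3)

# Merge criterion (in the spirit of split-and-merge EM): pick the pair of
# components whose responsibility vectors are most correlated -- they are
# largely explaining the same data and are natural merge candidates.
K = 3
i, j = max(((a, b) for a in range(K) for b in range(a + 1, K)),
           key=lambda ab: r[:, ab[0]] @ r[:, ab[1]])

# Merge by moment matching, then re-run EM on the reduced model
keep = [k for k in range(K) if k not in (i, j)]
w = pi[i] + pi[j]
m = (pi[i] * mu[i] + pi[j] * mu[j]) / w
v = (pi[i] * (var[i] + mu[i] ** 2) + pi[j] * (var[j] + mu[j] ** 2)) / w - m ** 2
mu2, var2, pi2, _ = em_gmm(x, np.append(mu[keep], m),
                           np.append(var[keep], v), np.append(pi[keep], w))
print("merged means:", np.sort(mu2))
```

A full SMEM-style procedure would also try the symmetric split move and accept or reject each candidate by comparing model scores; the paper does this with a variational Bayesian objective over both parameters and structure rather than raw likelihood, so the merge above is only a caricature of one ingredient.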