Authors
武田 利浩, 田中 昭吉, 丹野 州宣
Publisher
一般社団法人日本応用数理学会 (The Japan Society for Industrial and Applied Mathematics)
Journal
日本応用数理学会論文誌 (ISSN: 09172246)
Volume, issue, pages, and publication date
Vol. 5, No. 4, pp. 399-409, 1995-12-15

Various types of neural networks have been proposed, and applications of the technology have been vigorously promoted in a wide range of fields. However, simulating large-scale neural networks requires very high-speed computation, because learning takes an enormous amount of time. Consequently, many studies have been reported on the efficient parallel simulation of neural networks. This paper proposes a parallel computing algorithm that allows the back-propagation model to be simulated on an 8-neighbor processor array. By exploiting the parallelism intrinsically embedded in neural networks, the algorithm realizes high-speed neural network computation. The time complexities of the algorithm are only O(NLp/P) for communication and O(N^2L/P) for computation per learning step, where N is the number of neurons in a layer, P (= p×p) is the number of processors, and L is the number of layers.
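The abstract's O(N^2L/P) computation bound follows from distributing each layer's N×N weight matrix over the P = p×p processors. The sketch below is only an illustration of that partitioning argument in Python, not the paper's actual 8-neighbor algorithm or its communication scheme; the network sizes, the block layout, and the work-counting helper are assumptions made for this example.

```python
# Minimal sketch (assumed setup, not the authors' algorithm): block-partition each
# N x N weight matrix of an L-layer network over a p x p grid of virtual processors
# and count the multiply-accumulate work each processor does in one forward pass.
import numpy as np

N = 64   # neurons per layer (illustrative value)
L = 3    # number of layers (illustrative value)
p = 4    # processor grid is p x p, so P = p * p
P = p * p

rng = np.random.default_rng(0)
weights = [rng.standard_normal((N, N)) for _ in range(L)]
x = rng.standard_normal(N)

def blocked_forward(x, weights, p):
    """Each (i, j) block of every weight matrix is owned by one virtual processor;
    returns the final activation and the maximum per-processor work count."""
    n = x.shape[0]
    bs = n // p                                  # block size (assumes p divides N)
    work = np.zeros((p, p), dtype=int)           # multiply-accumulates per processor
    for W in weights:
        y = np.zeros(n)
        for i in range(p):
            for j in range(p):
                rows = slice(i * bs, (i + 1) * bs)
                cols = slice(j * bs, (j + 1) * bs)
                # processor (i, j) multiplies its weight block by its slice of x
                y[rows] += W[rows, cols] @ x[cols]
                work[i, j] += bs * bs
        x = np.tanh(y)                           # sigmoid-like activation
    return x, work.max()

_, per_proc = blocked_forward(x, weights, p)
print(f"per-processor work ~ {per_proc}, predicted N^2*L/P = {N * N * L // P}")
```

With the illustrative values above (N = 64, L = 3, p = 4), each virtual processor performs 768 multiply-accumulates, matching N^2L/P exactly, which is the scaling the abstract states for the computation part of one learning step.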
