Authors
Shun-ichi Amari, Ryo Karakida, Masafumi Oizumi
Publisher
The Institute of Electronics, Information and Communication Engineers
Journal
Nonlinear Theory and Its Applications, IEICE (ISSN:21854106)
Volume, issue, pages, publication date
vol.10, no.4, pp.322-336, 2019 (Released: 2019-10-01)
Number of references
9

Deep neural networks are highly nonlinear hierarchical systems. Statistical neurodynamics studies the macroscopic behavior of randomly connected neural networks. We consider a deep feedforward network in which input signals are processed layer by layer. The manifold of input signals is embedded in the higher-dimensional manifold of the next layer as a curved submanifold, provided the number of neurons exceeds the number of inputs. We show geometrical features of the embedded manifold, proving that the manifold enlarges or shrinks locally isotropically, so that it is always embedded conformally. We study the curvature of the embedded manifold: the scalar curvature converges to a constant or diverges slowly to infinity. The distance between two signals also changes, eventually converging to a stable fixed value, provided both the number of neurons in a layer and the number of layers tend to infinity. This causes a problem: when we consider a curve in the input space, it is mapped to a continuous curve of fractal nature, yet our theory paradoxically suggests that the curve eventually converges to a discrete set of equally spaced points. In reality, the numbers of neurons and layers are finite, and the finite-size effect is expected to cause the discrepancies between our theory and reality. Further studies are necessary to understand their implications for information processing.
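The layer-wise evolution of the distance between two signals described above can be illustrated numerically. The following is a minimal sketch, not the authors' code: it assumes a standard random deep network of the kind studied in statistical neurodynamics, with wide tanh layers, i.i.d. Gaussian weights of variance sigma_w**2 / n, and Gaussian biases of variance sigma_b**2. The width, depth, and gain values sigma_w and sigma_b are illustrative choices, not parameters taken from the paper.

import numpy as np

def propagate_pair(x0, y0, depth=50, sigma_w=1.5, sigma_b=0.1, seed=0):
    """Push two inputs through the same random deep tanh network and
    record their Euclidean distance at every layer."""
    rng = np.random.default_rng(seed)
    width = x0.size
    x, y = x0, y0
    dists = [np.linalg.norm(x - y)]
    for _ in range(depth):
        # i.i.d. Gaussian weights scaled by 1/sqrt(width); bias shared
        # by both signals, since they pass through the same network.
        W = rng.normal(0.0, sigma_w / np.sqrt(width), size=(width, width))
        b = rng.normal(0.0, sigma_b, size=width)
        x = np.tanh(W @ x + b)
        y = np.tanh(W @ y + b)
        dists.append(np.linalg.norm(x - y))
    return np.array(dists)

if __name__ == "__main__":
    width = 1000
    rng = np.random.default_rng(1)
    # Two nearby inputs, normalized to the sphere of radius sqrt(width).
    x0 = rng.normal(size=width)
    x0 *= np.sqrt(width) / np.linalg.norm(x0)
    y0 = x0 + 0.05 * rng.normal(size=width)
    y0 *= np.sqrt(width) / np.linalg.norm(y0)
    for layer, d in enumerate(propagate_pair(x0, y0)):
        if layer % 10 == 0:
            print(f"layer {layer:3d}: distance = {d:.4f}")

Under these assumptions, with sigma_w above the order-to-chaos transition the printed distances first grow and then settle near a fixed value, consistent with the convergence to a stable fixed distance stated in the abstract; with a smaller sigma_w (ordered regime) they instead shrink toward zero.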