Authors
Thanh Vo NHU, Hideyuki SAWADA
Publisher
The Society of Instrument and Control Engineers
Journal
SICE Journal of Control, Measurement, and System Integration (ISSN:18824889)
Volume, issue, pages, publication date
vol.9, no.6, pp.251-256, 2016 (Released:2016-12-28)
Number of references
14
Number of citations
1

The authors are developing a talking robot based on a physical model of the human vocal organs in order to reproduce human speech mechanically. This study focuses on developing a real-time interface to control and visualize its talking behavior. The talking robot has been trained using a self-organizing map (SOM) to reproduce human sounds; however, due to the nonlinear characteristics of the sound dynamics, automatic generation of human-like expressive speech is difficult. It is important to visualize the robot's performance and manually adjust the motions of the artificial vocal system to obtain better results, especially when it learns to vocalize a new language. Therefore, a real-time interactive control for the talking robot is designed and developed to fulfill this task. A novel formula describing the formant frequency changes caused by vocal tract motor movements is derived from acoustic resonance theory. In the first part of the paper, the construction of the talking robot is briefly described, followed by the real-time interaction system built with a Matlab Graphical User Interface (GUI) and the strategy for interactively modifying the speech articulation based on formant frequency comparison.
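The formant-comparison idea mentioned above can be illustrated with a toy feedback loop. The Python sketch below is a minimal illustration, not the paper's derived formula or the robot's actual control law: the quarter-wavelength resonance relation F_n = (2n - 1)c / (4L) for a uniform tube closed at the glottis and open at the lips is standard acoustic resonance theory, but the single-length tract model, the gain, and the target values are illustrative assumptions.

def tube_formants(length_m, n_formants=3, c=343.0):
    """Quarter-wavelength resonances (Hz) of a uniform tube open at one end."""
    return [(2 * n - 1) * c / (4.0 * length_m) for n in range(1, n_formants + 1)]

def adjust_length(length_m, target_f1, gain=2e-4):
    """Nudge the effective tract length to reduce the first-formant error."""
    produced_f1 = tube_formants(length_m, 1)[0]
    error = target_f1 - produced_f1
    # A longer tube lowers the resonances, so the length moves opposite to the error.
    return length_m - gain * error

if __name__ == "__main__":
    target_f1 = 500.0   # rough first formant of a neutral vowel (Hz)
    length = 0.15       # initial effective vocal tract length (m)
    for _ in range(30):
        length = adjust_length(length, target_f1)
    print("adjusted tract length (m):", round(length, 4))
    print("resulting formants (Hz):", [round(f, 1) for f in tube_formants(length)])

In the robot itself, the comparison would drive several motors shaping the flexible vocal tract rather than a single tube length, but the feedback principle, comparing target and produced formants and adjusting the articulation to reduce the difference, is the same.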