- Author
- Keisuke IMOTO
- Publisher
- The Institute of Electronics, Information and Communication Engineers
- Journal
- IEICE Transactions on Information and Systems (ISSN:09168532)
- Volume, issue, pages, and publication date
- vol.E103.D, no.3, pp.631-638, 2020-03-01 (Released:2020-03-01)
- Number of references
- 35
- Number of citations
- 3
In this paper, we propose an effective and robust method of spatial feature extraction for acoustic scene analysis utilizing partially synchronized and/or closely located distributed microphones. The proposed method introduces a new cepstrum feature that extracts spatial information from distributed microphones via a graph-based basis transformation, taking into account whether any pair of microphones is synchronized and/or closely located. Specifically, in the proposed graph-based cepstrum, the log-amplitude of a multichannel observation is converted to a feature vector using the inverse graph Fourier transform, a basis transformation for signals on a graph. Results of experiments using real environmental sounds show that the proposed graph-based cepstrum robustly extracts spatial information while accounting for the microphone connections. Moreover, the results indicate that the proposed method classifies acoustic scenes more robustly than conventional spatial features when the observed sounds have a large synchronization mismatch between partially synchronized microphone groups.
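To make the basis transformation concrete, the following is a minimal sketch in Python (NumPy) of how a graph-based cepstrum along these lines could be computed. The adjacency weights, the choice of the combinatorial Laplacian, and the function names here are illustrative assumptions based on the abstract's description, not the paper's exact formulation:

```python
import numpy as np

def graph_cepstrum(log_amplitude, adjacency):
    """Sketch of a graph-based cepstrum feature.

    log_amplitude : (M,) log-amplitudes of the M microphone channels.
    adjacency     : (M, M) symmetric weights encoding which microphone
                    pairs are synchronized and/or closely located
                    (assumed construction for illustration).
    """
    # Combinatorial graph Laplacian L = D - A.
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency

    # Eigenvectors of L form the graph Fourier basis
    # (columns of U, ordered by ascending eigenvalue).
    _, basis = np.linalg.eigh(laplacian)

    # Inverse graph Fourier transform of the log-amplitudes:
    # x = U @ x_hat. The log-amplitude vector plays the role of the
    # graph-spectral signal, mirroring how the classical cepstrum
    # applies an inverse DFT to a log spectrum.
    return basis @ log_amplitude

# Toy example: four microphones, with mics 0-1 and 2-3 forming
# two synchronized (or closely located) groups.
if __name__ == "__main__":
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    x = np.log(np.abs(np.random.randn(4)) + 1e-8)
    print(graph_cepstrum(x, A))
```

In this sketch the graph edges stand in for synchronized or closely located microphone pairs, so the Laplacian eigenbasis takes the role the DFT basis plays in the classical cepstrum, which is how the feature can account for microphone connections.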