Authors
四辻 嵩直・赤間 啓之
Publisher
Japanese Cognitive Science Society (日本認知科学会)
Journal
Cognitive Studies (認知科学, ISSN:1341-7924)
Volume, issue, pages / publication date
vol.30, no.4, pp.465-478, 2023-12-01 (Released: 2023-12-15)
Number of references
69

The neural basis of the human language comprehension system has been explored with neuroimaging techniques such as functional magnetic resonance imaging (fMRI). Although brain regions and systems related to various aspects of linguistic information have been identified, a complete neurocomputational model of language comprehension has yet to be established. In contrast, in machine learning, the rapid development of deep-learning-based natural language models has enabled sentence generation models to produce highly accurate sentences. This study aimed to build a method that reconstructs stimulus sentences directly from neural representations alone, using these text generation models, in order to evaluate a neurocomputational model of how linguistic information is understood. A variational autoencoder combined with pre-trained deep neural network models showed the highest decoding accuracy, and with this model we succeeded in reconstructing stimulus sentences directly from neural representations alone. Although the reconstruction was accurate only at the topic level, we treated this model as a neurocomputational model and exploratorily analyzed the characteristics of neural representations in language comprehension.
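The abstract outlines a decoding pipeline in which fMRI responses are mapped into the latent space of a variational autoencoder built on sentence representations from a pre-trained language model, and sentences are then reconstructed from the decoded representation. The sketch below illustrates that general idea only, under stated assumptions: the network sizes, the ridge-regression brain-to-latent mapping, and the final nearest-neighbour matching step (which stands in here for full sentence generation and for the topic-level evaluation) are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of the kind of decoding pipeline the abstract describes:
# fMRI voxel patterns are mapped into the latent space of a variational
# autoencoder (VAE) trained on sentence embeddings from a pre-trained
# language model, and the decoded embedding is matched to candidate
# sentences. Dimensions, the ridge-regression mapping and the retrieval
# step are illustrative assumptions, not the authors' implementation.
import numpy as np
import torch
import torch.nn as nn
from sklearn.linear_model import Ridge


class SentenceVAE(nn.Module):
    """VAE over pre-computed sentence embeddings (e.g. from a BERT-like model)."""

    def __init__(self, emb_dim=768, latent_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(emb_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, emb_dim))

    def encode(self, x):
        h = self.enc(x)
        return self.to_mu(h), self.to_logvar(h)

    def decode(self, z):
        return self.dec(z)


def reconstruct_sentences(vae, fmri_train, emb_train, fmri_test,
                          candidates, cand_emb):
    """Brain activity -> VAE latents -> sentence embedding -> closest sentence.

    fmri_*   : (trials, voxels) arrays of brain responses
    emb_train: (trials, emb_dim) embeddings of the training stimulus sentences
    cand_emb : (n_candidates, emb_dim) embeddings of candidate sentences
    """
    with torch.no_grad():
        mu, _ = vae.encode(torch.as_tensor(emb_train, dtype=torch.float32))
    # Linear (ridge) mapping from voxel patterns to VAE latent codes,
    # fitted on training trials and applied to held-out test trials.
    reg = Ridge(alpha=1.0).fit(fmri_train, mu.numpy())
    z_pred = torch.as_tensor(reg.predict(fmri_test), dtype=torch.float32)
    with torch.no_grad():
        emb_pred = vae.decode(z_pred).numpy()
    # Topic-level evaluation by cosine similarity against candidate
    # sentences (standing in for feeding the decoded representation
    # into a text generation model).
    a = emb_pred / np.linalg.norm(emb_pred, axis=1, keepdims=True)
    b = cand_emb / np.linalg.norm(cand_emb, axis=1, keepdims=True)
    return [candidates[i] for i in (a @ b.T).argmax(axis=1)]
```

The split into a brain-to-latent regression and a separately trained VAE decoder mirrors the two-stage structure implied by the abstract (a sentence model combined with a neural decoder); in practice the VAE would first be trained on a large sentence corpus before the regression is fitted on the fMRI trials.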

Mentions

External database (DOI)

Twitter (6 users, 6 posts, 3 favorites)

[Cognitive Studies, research paper] 四辻・赤間 (2023). Reconstruction of stimulus sentences by deep sentence generation models. This study uses deep sentence generation models to propose a method for reconstructing stimulus sentences from brain activity alone, together with a way to evaluate it. https://t.co/qydzGmupeM
