Authors
椿 真史, 新保 仁, 松本 裕治
Publisher
一般社団法人 人工知能学会
Journal
人工知能学会論文誌 (ISSN:13460714)
Volume, issue, pages, and publication date
vol.31, no.2, pp.O-FA2_1-10, 2016-03-01 (Released:2016-06-09)
Number of references
40

The notion of semantic similarity between text data (e.g., words, phrases, sentences, and documents) plays an important role in natural language processing (NLP) applications such as information retrieval, classification, and extraction. Recently, word vector spaces based on distributional and distributed models have become popular. Although word vectors provide good similarity measures between words, deriving phrasal and sentential similarities from the composition of individual words remains a difficult problem. To solve this problem, we focus on representing and learning the semantic similarity of sentences in a space that has higher representational power than the underlying word vector space. In this paper, we propose a new method of non-linear similarity learning for compositionality. With this method, word representations are learned through the similarity learning of sentences in a high-dimensional space with implicit kernel functions, and new word representations can be obtained inexpensively without explicit computation of sentence vectors in the high-dimensional space. Note that our approach differs from deep learning methods such as recursive neural networks (RNNs) and long short-term memory (LSTM). Our aim is to design a word representation learning method that combines the embedding of sentence structures in a low-dimensional space (i.e., neural networks) with non-linear similarity learning for sentence semantics in a high-dimensional space (i.e., kernel methods). On the task of predicting the semantic similarity of two sentences (SemEval 2014, Task 1), our method outperforms linear baselines, feature engineering approaches, and RNNs, and achieves results competitive with various LSTM models.
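The key idea in the abstract, measuring sentence similarity in a high-dimensional space without ever constructing sentence vectors in that space, is an instance of the kernel trick. The following Python sketch illustrates that idea only, under illustrative assumptions not taken from the paper: additive composition of word vectors and an RBF kernel (the paper's actual composition function, kernel choice, and learning procedure may differ).

import numpy as np

def sentence_vector(words, emb):
    # Compose a sentence vector by summing its word vectors
    # (additive composition; an illustrative choice, not the paper's method).
    return np.sum([emb[w] for w in words], axis=0)

def rbf_kernel(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2) equals an inner product in an
    # infinite-dimensional feature space, computed without ever mapping
    # x and y into that space explicitly (the kernel trick).
    diff = x - y
    return np.exp(-gamma * np.dot(diff, diff))

# Toy 50-dimensional embeddings (hypothetical, for illustration only).
rng = np.random.default_rng(0)
emb = {w: rng.normal(size=50) for w in ["a", "dog", "runs", "cat", "sleeps"]}

s1 = sentence_vector(["a", "dog", "runs"], emb)
s2 = sentence_vector(["a", "cat", "sleeps"], emb)
print("non-linear sentence similarity:", rbf_kernel(s1, s2, gamma=0.01))

Because the kernel value depends on the low-dimensional composed vectors alone, gradients of a similarity-based loss can be propagated back to the word embeddings without materializing any high-dimensional sentence representation, which is what makes this style of learning inexpensive.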
