Conference paper
Contextualized Word Representations for Multi-Sense Embedding

Abstract: Distributed word representations are used in many natural language processing tasks. When dealing with ambiguous words, it is desirable to generate multi-sense embeddings, i.e., multiple representations... per word. Several methods have therefore been proposed to generate different word representations based on part of speech or topic, but these methods tend to be too coarse-grained to resolve ambiguity. In this paper, we propose methods that generate multiple representations for each word based on dependency structure relations. To cope with the data sparseness caused by the resulting increase in vocabulary size, the initial value of each word representation is determined from pre-trained word representations. The representations of low-frequency words are expected to remain in the vicinity of their initial values, which in turn reduces the negative effects of data sparseness. Extensive evaluation confirms the effectiveness of our methods, which significantly outperform state-of-the-art methods for multi-sense embeddings. Detailed analysis shows that the data sparseness problem is resolved by the pre-training.
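
The abstract describes per-word sense vectors keyed by dependency relation, each initialized from a pre-trained word vector so that rare senses stay close to a reliable starting point. The following is a minimal sketch of that initialization idea, not the authors' code; names such as `pretrained`, `init_sense_vector`, and the chosen dependency relations are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy dimensionality; real embeddings are typically 100-300 dims

# Hypothetical pre-trained single-vector embeddings (one vector per word).
pretrained = {
    "bank": rng.normal(size=dim),
    "run": rng.normal(size=dim),
}

def init_sense_vector(word, dep_relation, noise_scale=0.01):
    """Return an initial vector for the (word, dep_relation) sense.

    The vector starts at the pre-trained embedding of `word`, plus a small
    perturbation so that different senses can drift apart during training,
    while low-frequency senses remain near the shared starting point.
    """
    base = pretrained[word]
    return base + noise_scale * rng.normal(size=dim)

# One embedding per (word, dependency relation) pair, i.e., per "sense".
sense_vectors = {
    ("bank", "nsubj"): init_sense_vector("bank", "nsubj"),
    ("bank", "dobj"): init_sense_vector("bank", "dobj"),
    ("run", "nsubj"): init_sense_vector("run", "nsubj"),
}

for key, vec in sense_vectors.items():
    print(key, np.round(vec, 3))
```

In this sketch, splitting each word into per-relation entries enlarges the vocabulary, which is exactly the sparseness problem the abstract mentions; anchoring every sense at the pre-trained vector is the mitigation it proposes.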



