1 "WordSim353"
2 "Twitter-korean-text"
3 "Tensorflow"
4 Sanghyuk Choi, "On Word Embedding Models and Parameters Optimized for Korean" (15) : 2016
5 "Naver sentiment movie corpus"
6 Cui, Qing, "Knet: A general framework for learning word embedding using morphological knowledge" 34 (34): 2015
7 Levy, Omer, "Improving distributional similarity with lessons learned from word embeddings" 3 : 211-225, 2015
8 Pennington, Jeffrey, "Glove: Global Vectors for Word Representation" 14 : 1532-1543, 2014
9 Schnabel, Tobias, "Evaluation methods for unsupervised word embeddings" 298-307, 2015
10 Bojanowski, Piotr, "Enriching word vectors with subword information"
11 Mikolov, Tomas, "Efficient estimation of word representations in vector space"
12 Baroni, Marco, "Don't count, predict! A systematic comparison of context-counting vs. context-predicting semantic vectors" 1 : 238-247, 2014
13 Mikolov, Tomas, "Distributed representations of words and phrases and their compositionality" 3111-3119, 2013
14 Kim, Yoon, "Convolutional neural networks for sentence classification"
15 Lazaridou, Angeliki, "Compositionally Derived Representations of Morphologically Complex Words in Distributional Semantics" (1) : 1517-1526, 2013
16 양희정, "A Study on Word Vector Models for Representing Korean Semantic Information" Korean Society of Speech Sciences 7 (7): 41-47, 2015