송종윤 (Jongyoon Song), 윤성로 (Sungroh Yoon) — The Institute of Electronics and Information Engineers (IEIE) 2018 Conference, Vol. 2018, No. 11
Recently, neural networks have achieved high performance in the neural machine translation domain. Self-attention-based models outperform convolutional neural networks (CNNs) and recurrent neural networks (RNNs). The result of self-attention accumulated over several layers resembles the tree structure of the sentence. In this paper, we analyze the attention weights of a self-attention-based neural machine translation model, compare them with the tree structure of the input sentence, and investigate redundancy and non-intuitiveness in the attention patterns.
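The kind of analysis the abstract describes starts from the per-layer attention matrices of the model. As a minimal sketch (not the authors' code), the following PyTorch snippet extracts the head-averaged self-attention weights from a toy two-layer encoder; the layer count, dimensions, and random stand-in embeddings are illustrative assumptions.

```python
# Minimal sketch: per-layer self-attention weight extraction (assumed setup,
# not the paper's model). Dimensions and the toy input are placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)
embed_dim, num_heads, num_layers, seq_len = 16, 4, 2, 5

# Stand-in embeddings for a 5-token input sentence (batch size 1).
x = torch.randn(1, seq_len, embed_dim)

layers = nn.ModuleList(
    nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
    for _ in range(num_layers)
)

h = x
for i, attn in enumerate(layers):
    # need_weights=True returns the (query x key) attention matrix,
    # averaged over heads: shape (batch, seq_len, seq_len).
    h, weights = attn(h, h, h, need_weights=True)
    print(f"layer {i} attention weights:\n{weights[0].detach()}")
```

Each row of a weight matrix sums to 1 and says which tokens a given position attends to; comparing these rows, layer by layer, against a syntactic parse of the input sentence is the comparison with the tree structure that the paper performs.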