Joonsang Yu, Sungbum Kang, Kiyoung Choi. The Institute of Electronics and Information Engineers (IEIE), 2018 IEIE Conference, Vol.2018 No.11.
This paper analyzes the knowledge distillation method from several perspectives. Knowledge distillation is commonly used to train a small network (the student network) while preserving the accuracy of a large network (the teacher network). In this paper, we investigate different types of knowledge distillation: distillation from a smaller or same-sized teacher network rather than a larger one. We also cover knowledge distillation across different network architectures. Our experimental results show that, even for a student network of the same size as or larger than the teacher, knowledge distillation achieves higher accuracy than training from scratch with back-propagation.
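For context, the sketch below shows the standard soft-target distillation loss in the style of Hinton et al. (2015), which is the common setup the abstract refers to; the paper's exact loss formulation, temperature, and weighting are not given here, so the hyperparameters T and alpha are illustrative assumptions rather than the authors' values.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Generic soft-target distillation loss (illustrative, not the paper's exact loss).

    Blends the KL divergence between temperature-softened teacher and student
    distributions with ordinary cross-entropy on the ground-truth labels.
    """
    # Soft targets: KL(teacher || student) at temperature T, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy with the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this setup the teacher and student need not differ in size: the same loss applies when the student is as large as or larger than the teacher, or has a different architecture, which is the regime the paper investigates.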