Jin-Cheol Yang (양진철), Suk-Ju Kang (강석주), Proceedings of the IEIE Conference 2023, Vol. 2023, No. 11
In recent years, the performance and range of applications of transformer neural networks in vision tasks have grown rapidly, with Vision Transformer (ViT)-based models outperforming convolutional neural network (CNN)-based models in many cases. However, ViT-based models often have larger model sizes and more parameters, making them challenging to deploy on resource-constrained edge devices. This motivates model optimization techniques such as model quantization. In this paper, we survey model quantization for ViT-based models, focusing on how it differs from CNN-based model quantization and how ViT-oriented quantization methods have evolved.
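To make the idea of model quantization concrete, the following is a minimal sketch of symmetric uniform int8 post-training quantization of a weight tensor. This is a generic illustration of the technique, not the specific method surveyed in this paper; the function names and per-tensor scaling scheme are assumptions for the example.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights to int8 using a single per-tensor scale.

    Symmetric scheme: the float range [-max|w|, max|w|] is mapped
    to the integer range [-127, 127]. (Per-tensor scaling is an
    illustrative choice; real quantizers often use per-channel scales.)
    """
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float tensor from int8 values."""
    return q.astype(np.float32) * scale

# Quantize a random weight matrix and measure the round-trip error.
w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
print("max abs reconstruction error:", np.abs(w - w_hat).max())
```

The reconstruction error shrinks as the bit width grows; the challenge for ViT-based models, as discussed in the paper, is that their activation distributions (e.g., after softmax and GELU) tolerate such uniform schemes less gracefully than typical CNN activations.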