Development of a Lightweight DNN Model Using Channel Pruning and Transfer Learning
Sara Sualiheen, Deok-Hwan Kim · 한국차세대컴퓨팅학회 2023 학술대회, Vol.2023 No.06
Deep neural networks (DNNs) are widely used across many applications; however, their computational complexity and memory requirements pose increasing challenges, especially on resource-constrained devices such as mobile phones and embedded systems. In this paper, we propose a lightweight DNN model that uses channel pruning to reduce the computational and memory cost of DNNs on such devices. Our approach combines channel pruning with transfer learning to preserve accuracy after pruning. Evaluation on the CIFAR-10 dataset shows that the pruned model achieves 78% test accuracy, 89% training accuracy, and 73% validation accuracy, an improvement over the unpruned baseline, making it suitable for applications with limited computational resources.
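The abstract does not specify the pruning criterion. A common choice for channel pruning is to rank output channels of a convolutional layer by the L1 norm of their filter weights and keep only the highest-ranked fraction; the sketch below illustrates that idea in pure Python. The function names, the L1 criterion, and the nested-list weight layout are illustrative assumptions, not the authors' implementation.

```python
def channel_l1_norms(weights):
    """Per-output-channel L1 norm of a conv weight tensor laid out as
    nested lists shaped [out_channels][in_channels][k][k]."""
    return [
        sum(abs(w) for in_ch in out_ch for row in in_ch for w in row)
        for out_ch in weights
    ]

def select_channels(weights, keep_ratio):
    """Return sorted indices of the output channels to keep, ranking
    channels by L1 norm and keeping the top `keep_ratio` fraction."""
    norms = channel_l1_norms(weights)
    n_keep = max(1, int(len(norms) * keep_ratio))
    ranked = sorted(range(len(norms)), key=lambda i: norms[i], reverse=True)
    return sorted(ranked[:n_keep])

# Toy example: 4 output channels, 1 input channel, 1x1 kernels.
toy = [[[[1.0]]], [[[5.0]]], [[[2.0]]], [[[0.5]]]]
kept = select_channels(toy, keep_ratio=0.5)  # keeps the two strongest channels
```

After channels are removed, a fine-tuning step (here, transfer learning from the unpruned model's remaining weights) is what recovers the accuracy lost to pruning.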