RISS Academic Research Information Service

• HSKD: Battery SoC Prediction Model Compression Using Hidden State Knowledge Distillation

  Soohyeok Kang, Guydo Park, Donghoon Sim, Sunyoung Cho. Korean Society of Automotive Engineers (KSAE), 2023 KSAE Annual Conference and Exhibition, Vol. 2023, No. 11

  Knowledge Distillation (KD) is one of the representative methods for AI model compression, in which a student model learns by imitating the output of a teacher model. The student model has a smaller network than the teacher model, which reduces inference time and saves memory. This makes KD well suited to efficient AI model inference in limited computing environments such as a vehicle controller. In this paper, we applied the Hidden State Knowledge Distillation (HSKD) method to a Bi-LSTM (Bidirectional Long Short-Term Memory) model for predicting the State of Charge (SoC) of an electric vehicle battery. The model predicts the SoC 5 minutes ahead from the SoC of the past 5 minutes. In the experiment, we selected a teacher model with a hidden size of 1,024, which showed the highest accuracy, and compared the performance of hidden state knowledge distillation and general knowledge distillation for student models with hidden sizes smaller than 1,024. We also measured the inference time of the compressed models on controllers equipped with an ARM Cortex-A53. As a result, the model with a hidden size of 32 lost 0.008 in R2 score compared to the teacher model, but its inference time was reduced by approximately 20.1x and its file size was compressed by 750.6x, from 33,028 KB to 44 KB.
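
  To make the distillation setup described in the abstract concrete, below is a minimal sketch of hidden state knowledge distillation for a Bi-LSTM SoC regressor. It assumes PyTorch; the module names, the projection layer that aligns the student's narrower hidden states with the teacher's, the sampling rate, and the loss weights alpha and beta are illustrative assumptions, not details published in the paper.

```python
# Illustrative sketch only: the paper does not publish code, so all names,
# hyperparameters, and loss weights below are assumptions for demonstration.
import torch
import torch.nn as nn

class BiLSTMSoC(nn.Module):
    """Bi-LSTM that maps past SoC samples to the SoC 5 minutes ahead."""
    def __init__(self, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden_size, 1)   # concat of both directions

    def forward(self, x):                           # x: (batch, seq_len, 1)
        states, _ = self.lstm(x)                    # (batch, seq_len, 2*hidden)
        return self.head(states[:, -1]), states

teacher = BiLSTMSoC(hidden_size=1024).eval()        # large, pre-trained teacher (frozen)
student = BiLSTMSoC(hidden_size=32)                 # compact student for the vehicle controller

# Teacher and student hidden states differ in width (2*1024 vs 2*32), so a
# linear projection lifts the student states before they are compared.
proj = nn.Linear(2 * 32, 2 * 1024)

mse = nn.MSELoss()
alpha, beta = 1.0, 0.5                              # assumed loss weights

def hskd_loss(x, y):
    """Task loss + output KD + hidden-state KD (the HSKD term)."""
    with torch.no_grad():
        t_pred, t_states = teacher(x)
    s_pred, s_states = student(x)
    task      = mse(s_pred, y)                      # ground-truth SoC regression
    output_kd = mse(s_pred, t_pred)                 # classic KD on the final prediction
    hidden_kd = mse(proj(s_states), t_states)       # match per-timestep hidden states
    return task + alpha * output_kd + beta * hidden_kd

# Smoke test with dummy data (assumed 1 Hz sampling: 300 samples = 5 minutes).
x = torch.randn(8, 300, 1)
y = torch.randn(8, 1)
hskd_loss(x, y).backward()
```

  In this sketch the student is trained against both the teacher's final prediction and its per-timestep hidden states; the latter term is what distinguishes HSKD from output-only knowledge distillation.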
