Soo-Yeon Woo, Se-Jong Jeong, Kyungwoon Lee. The Institute of Electronics and Information Engineers (IEIE), IEIE Conference 2023, Vol. 2023, No. 6.
This paper analyzes and evaluates recent machine learning (ML)-based techniques for estimating computing resource usage. Although these techniques allow cloud administrators to manage cloud computing resources more efficiently, training their ML models demands a significant amount of computing resources, such as GPUs. Because such powerful hardware is scarce in edge computing environments, which provide low-latency data processing for Internet of Things (IoT) devices, the ML models can suffer from long training times and reduced accuracy there. Our evaluation shows that training time in the edge computing environment is 28 times longer than on a server with a GPU. This result indicates that recent ML-based computing resource estimation techniques require further optimization to reduce training time before they are deployed in edge computing environments.
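To make the setting concrete, below is a minimal sketch of the kind of experiment the abstract describes: a small ML model is trained to predict future resource usage from past measurements, and the training loop is timed. The model (a single-layer LSTM), the synthetic utilization trace, and all hyperparameters are illustrative assumptions, not the paper's actual setup. Running the script on a GPU server (where it selects "cuda") and again on a CPU-only edge device yields the kind of training-time comparison reported above.

```python
# Sketch only: time the training of a small resource-usage predictor.
# Architecture, data, and hyperparameters are assumptions for illustration,
# not the setup evaluated in the paper.
import time
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Synthetic CPU-utilization trace: a daily cycle plus noise, roughly in [0, 1].
t = torch.arange(10_000, dtype=torch.float32)
trace = 0.5 + 0.3 * torch.sin(2 * torch.pi * t / 288) + 0.05 * torch.randn_like(t)

# Sliding windows: predict the next reading from the previous 32.
WINDOW = 32
X = trace.unfold(0, WINDOW, 1)[:-1].unsqueeze(-1)  # (N, 32, 1)
y = trace[WINDOW:].unsqueeze(-1)                   # (N, 1)

class UsagePredictor(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # predict from the last hidden state

model = UsagePredictor().to(device)
X, y = X.to(device), y.to(device)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

start = time.perf_counter()
for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
if device == "cuda":
    torch.cuda.synchronize()  # wait for queued GPU work before stopping the clock
print(f"device={device}  training time: {time.perf_counter() - start:.2f}s")
```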