Development of a Dynamic Expression Model for Simultaneous Control of a Robot's Facial Expressions and Movements
박하은, 이지연, Temirlan Dzhoroev, 김병헌, 이희승 · Journal of Institute of Control, Robotics and Systems (ICROS), Vol. 30, No. 1, 2024
Social robots commonly rely on facial expressions and gestures to convey emotions. However, many robots follow a predetermined sequence, executing a fixed set of facial animations and movements once an emotion is identified. This rigid approach can produce unnatural behavior when an additional stimulus arrives during an ongoing emotional expression: the robot may ignore the new stimulus until the current emotion is fully expressed, or abruptly jump to the next one. To address this limitation, we implemented an emotion engine with a linear dynamic affect-expression model (LDAEM) that computes the emotional state from stimuli and determines the corresponding facial expression and robot movements. Building on Ekman's six basic emotions, our emotion engine drives 12 control points (CPs) for facial expression and 3 CPs for movement. Experimental results demonstrate the dynamic adaptation of emotions to stimuli. Notably, our approach allows smooth transitions between emotions even when a different emotional stimulus is introduced during an ongoing expression. Moreover, it can be applied to other robotic systems with minimal changes, offering a versatile framework for emotional expression.
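The abstract does not give the LDAEM equations, but the behavior it describes (a stimulus-driven emotional state that converges smoothly toward new targets rather than playing fixed animations) can be illustrated with a generic first-order linear dynamic model. The class name, the decay constant, and the random emotion-to-CP mapping below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

# Ekman's six basic emotions, as used in the paper's emotion engine.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "disgust", "surprise"]

class LinearAffectModel:
    """Sketch of a linear dynamic affect-expression model (assumed form)."""

    def __init__(self, decay=1.5, dt=0.05):
        self.e = np.zeros(6)      # current emotion intensity vector
        self.decay = decay        # assumed convergence rate toward the stimulus
        self.dt = dt              # integration time step (s)
        # Assumed linear map from 6 emotions to 12 facial + 3 movement CPs;
        # a real system would calibrate these weights per robot.
        rng = np.random.default_rng(0)
        self.W = rng.uniform(-1.0, 1.0, size=(15, 6))

    def step(self, stimulus):
        """Advance the emotion state one step toward `stimulus` (length-6)."""
        s = np.asarray(stimulus, dtype=float)
        # First-order linear dynamics: de/dt = decay * (s - e).
        # A new stimulus mid-expression just moves the attractor, so the
        # state transitions smoothly instead of jumping or being ignored.
        self.e += self.dt * self.decay * (s - self.e)
        return self.W @ self.e    # 15 control-point commands (12 face + 3 body)
```

Because the state only moves a fraction of the way toward the target each step, switching the stimulus partway through an expression produces a gradual blend between the two emotions, mirroring the smooth transitions reported in the experiments.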