RISS Academic Research Information Service

        Hand Gesture Control for Human–Computer Interaction with Deep Learning

Chua S. N. David, Chin K. Y. Richard, Lim S. F., Jain Pushpdant. 대한전기학회 (The Korean Institute of Electrical Engineers), 2022. Journal of Electrical Engineering & Technology, Vol. 17, No. 3.

Gesture control offers numerous advantages over physical hardware. However, it has yet to gain popularity, as most gesture control systems require extra sensors or depth cameras to detect or capture gesture movements before a meaningful signal can be triggered for the corresponding course of action. This research proposes a hand gesture control system that uses the object detection algorithm YOLOv3 combined with handcrafted rules to achieve dynamic gesture control of a computer. The system uses a single RGB camera for hand gesture recognition and localization. The dataset of gestures used for training, together with their corresponding commands, was custom designed by the authors due to the lack of standard gestures specifically for human–computer interaction. Algorithms that integrate gesture commands with virtual mouse and keyboard input through the Pynput library in Python were developed to handle commands such as mouse control and media control. The YOLOv3 model achieved an mAP of 96.68% on the test set. Rule-based algorithms for gesture interpretation were successfully implemented to extend static gesture recognition into dynamic gesture control.
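
        The abstract names the Pynput library as the bridge between recognized gestures and virtual mouse/keyboard commands, but it does not list the authors' custom gesture set or its command mapping. The sketch below only illustrates that dispatch pattern: the gesture labels (`point`, `pinch`, `fist`, etc.) and the label-to-command mapping are illustrative assumptions, not the paper's actual design.

        ```python
        from pynput.mouse import Button, Controller as MouseController
        from pynput.keyboard import Key, Controller as KeyboardController

        mouse = MouseController()
        keyboard = KeyboardController()

        # Hypothetical gesture labels and command mapping (illustration only);
        # the paper's actual custom gesture set is not given in the abstract.
        def dispatch(gesture: str, dx: int = 0, dy: int = 0) -> None:
            """Translate a detected gesture label into a virtual mouse/keyboard command."""
            if gesture == "point":          # move the cursor by the hand's displacement
                mouse.move(dx, dy)
            elif gesture == "pinch":        # left click
                mouse.click(Button.left, 1)
            elif gesture == "fist":         # right click
                mouse.click(Button.right, 1)
            elif gesture == "swipe_up":     # scroll up two steps
                mouse.scroll(0, 2)
            elif gesture == "swipe_down":   # scroll down two steps
                mouse.scroll(0, -2)
            elif gesture == "open_palm":    # media control: toggle play/pause
                keyboard.press(Key.media_play_pause)
                keyboard.release(Key.media_play_pause)

        # Example: a detector (e.g. a YOLOv3 model) would supply the label each frame.
        dispatch("point", dx=15, dy=-5)
        ```

        In the paper's pipeline, the YOLOv3 detector supplies the gesture label and bounding-box location per frame, and the handcrafted rules decide when a sequence of static detections constitutes a dynamic command.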
