RISS (Research Information Sharing Service)

Multi-UAV Localization Based on Visual-Inertial-Range Sensor Fusion

최준호, Kevin Christiansen Marsim, 정명우, 류기환, 김지원, 명현 · Institute of Control, Robotics and Systems (ICROS), 2023 · Journal of Institute of Control, Robotics and Systems, Vol. 29, No. 11 (KCI-indexed)

Multi-robot state estimation is crucial for real-time and accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that might limit its usage. While LiDAR sensors demand a high payload capacity, camera sensors must have matching image features between robots, and UWB sensors require known fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates the need for the previously mentioned requirements. We used an anchor-free UWB setup to establish a global coordinate system, unifying all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its local coordinate system. By optimizing the local odometry from each robot using inter-robot range measurements, the positions of the robots can be robustly estimated without relying on an extensive sensor setup or infrastructure. Our method offers a simple yet effective solution for achieving accurate and real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements.
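The abstract's core idea, aligning each robot's local odometry frame using only inter-robot range measurements, can be illustrated with a minimal sketch. This is not the authors' implementation: the 2D simplification, the simulated trajectories, the noise level, and the initial guess are all illustrative assumptions. Two robots each report their trajectory in their own local frame (as visual-inertial odometry would), and the unknown SE(2) transform between the frames is recovered by nonlinear least squares over anchor-free UWB-style range measurements.

```python
# Minimal sketch (not the paper's code): fuse two robots' local odometry
# with inter-robot ranges by estimating the relative transform between
# their local frames via nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.arange(40)

# Each robot's trajectory as reported by its own odometry, in its own frame.
traj_a = np.stack([0.5 * t, 2.0 * np.sin(0.3 * t)], axis=1)
traj_b_local = np.stack([3.0 * np.cos(0.25 * t), 0.4 * t], axis=1)

def se2_apply(params, pts):
    """Apply an SE(2) transform [theta, tx, ty] to an Nx2 point array."""
    th, tx, ty = params
    rot = np.array([[np.cos(th), -np.sin(th)],
                    [np.sin(th),  np.cos(th)]])
    return pts @ rot.T + np.array([tx, ty])

# Simulate noisy inter-robot range measurements. The true transform from
# B's frame into A's frame is the quantity we want to recover.
true_params = [0.6, 4.0, -2.0]
ranges = np.linalg.norm(traj_a - se2_apply(true_params, traj_b_local), axis=1)
ranges += rng.normal(0.0, 0.05, size=len(t))

def residuals(params):
    """Measured ranges minus ranges predicted by a candidate transform."""
    pred = np.linalg.norm(traj_a - se2_apply(params, traj_b_local), axis=1)
    return pred - ranges

# A coarse initial guess is assumed here; in a real system it could come
# from priors or an initial coarse search, since range-only alignment is
# a nonconvex problem with local minima.
sol = least_squares(residuals, x0=[0.5, 3.0, -1.0])
print("estimated [theta, tx, ty]:", sol.x)
```

In the full multi-robot system described by the abstract, this pairwise alignment would instead appear as range residuals inside a joint optimization over all robots' odometry, but the structure of the residual term is the same.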
