Object Recognition Using the Hausdorff Distance and an Image Matching Algorithm
Kim, Dong-Gi; Lee, Wan-Jae; Gang, Lee-Seok. The Korean Society of Mechanical Engineers, 2001. Transactions of the KSME A, Vol.25 No.5
The pixel information of the object was obtained sequentially, and pixels were clustered into labels by a line labeling method. Feature points were determined by computing the slope at edge pixels after selecting a fixed number of edge pixels; the slope was estimated by the least-squares method to reduce detection error. Once a matching point was determined by comparing the feature information of the object and the pattern, the parameters for translation, scaling, and rotation were obtained by selecting the longer of the two lines passing through the matching point from the left and right sides. Finally, the modified Hausdorff Distance was used to measure the similarity between the object and the given pattern. A multi-label method, which applies the modified Hausdorff Distance twice, was developed to recognize patterns with more than one label. Experiments were performed to verify the performance of the proposed algorithm on simple and complex target images, simple and complex patterns, and partially hidden objects. The experiments showed that the proposed image matching algorithm recognizes objects with good matching performance.
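The similarity measure named in the abstract can be sketched as follows. This is a minimal pure-Python version of the standard modified Hausdorff Distance (the Dubuisson–Jain formulation: the maximum of the two directed mean nearest-neighbour distances); the paper's variant, applied to labeled edge-pixel sets, may differ in detail.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def directed_mhd(a, b):
    """Mean distance from each point in set a to its nearest point in set b."""
    return sum(min(dist(p, q) for q in b) for p in a) / len(a)

def modified_hausdorff(a, b):
    """Modified Hausdorff Distance: the larger of the two directed mean distances.
    Point sets are given as lists of (x, y) tuples (illustrative representation)."""
    return max(directed_mhd(a, b), directed_mhd(b, a))

# Example: a unit square of edge points vs. the same square shifted right by 3
square = [(0, 0), (0, 1), (1, 0), (1, 1)]
shifted = [(3, 0), (3, 1), (4, 0), (4, 1)]
print(modified_hausdorff(square, square))   # → 0.0
print(modified_hausdorff(square, shifted))  # → 2.5
```

Unlike the classical Hausdorff distance (a max over points), the mean in each direction makes the measure less sensitive to a few outlier pixels, which is why it suits noisy edge maps.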
Kim, Dong-Gi; Yun, Gwang-Ik; Yun, Ji-Seop; Gang, Lee-Seok. Institute of Control, Robotics and Systems, 2001. Journal of Institute of Control, Robotics and Systems, Vol.7 No.8
This paper presents a sensor fusion method to recognize a cylindrical object using a CCD camera, a laser slit beam, and ultrasonic sensors mounted on a pan/tilt device. For object recognition with the vision sensor, an active light source projects a stripe pattern of light onto the object surface, and the 2D image data are transformed into 3D data using the geometry between the camera and the laser slit beam. The ultrasonic sensor uses a transducer array mounted horizontally on the pan/tilt device. The time of flight is estimated by finding the maximum correlation between the received ultrasonic pulse and a set of stored templates, a technique also called a matched filter. The distance of flight is calculated by simply multiplying the time of flight by the speed of sound, and the maximum amplitude of the filtered signal is used to determine the face angle to the object. The position and radius of the cylindrical object are then determined by statistical sensor fusion. Experimental results show that the fused data increase the reliability of the object recognition.
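The matched-filter ranging step described above can be sketched as follows. This is an illustrative single-template version, assuming a sampled echo signal and one stored pulse template; all names, the sampling rate, and the speed-of-sound constant are assumptions, not taken from the paper.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed value)

def correlation_at_lag(signal, template, lag):
    """Dot product of the template with the signal segment starting at `lag`."""
    return sum(s * t for s, t in zip(signal[lag:], template))

def time_of_flight(signal, template, sample_rate):
    """Lag (in seconds) at which the received signal best matches the template."""
    lags = range(len(signal) - len(template) + 1)
    best_lag = max(lags, key=lambda lag: correlation_at_lag(signal, template, lag))
    return best_lag / sample_rate

def distance_of_flight(signal, template, sample_rate):
    """As in the abstract: distance of flight = speed of sound x time of flight."""
    return SPEED_OF_SOUND * time_of_flight(signal, template, sample_rate)

# Example: a short pulse template echoed 5 samples into the received signal
template = [0.0, 1.0, 0.0, -1.0]
received = [0.0] * 5 + template + [0.0]
print(time_of_flight(received, template, 1000))      # → 0.005 (seconds)
print(distance_of_flight(received, template, 1000))  # → 1.715 (metres)
```

The abstract uses a set of templates rather than one; extending the sketch would mean taking the maximum correlation over all templates and all lags, with the winning template's peak amplitude then informing the face-angle estimate.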