Change Detection of Hyperspectral Images Using Iterative Error Analysis (IEA) and Spectral Unmixing
Ahram Song, Jaewan Choi, Anjin Chang, Yongil Kim. Korean Society of Remote Sensing, 2015. Korean Journal of Remote Sensing Vol.31 No.5
Various algorithms such as Chronochrome (CC), Principal Component Analysis (PCA), and spectral unmixing have been studied for hyperspectral change detection. Change detection by spectral unmixing offers useful information on the nature of the change, whereas other change detection methods provide only the locations of changes in the scene. However, hyperspectral change detection by spectral unmixing is still in an early stage. This research proposed a new approach that uses Iterative Error Analysis (IEA) and the Spectral Angle Mapper (SAM) to extract endmembers that have identical properties in temporally different images. The change map obtained from the difference of abundances efficiently identified the changed pixels. Simulated images generated from Compact Airborne Spectrographic Imager (CASI) and Hyperion data were used for change detection, and the experimental results showed that the proposed method performed better than CC, PCA, and spectral unmixing using N-FINDR. The proposed method has the advantage of automatically extracting endmembers without prior information, so it is applicable to real images composed of many materials.
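The abundance-difference idea described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the shared endmembers are assumed to be given (rather than extracted by IEA), the unmixing is plain unconstrained least squares, and the change threshold of 0.3 is an arbitrary placeholder.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM): angle in radians between two spectra."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def abundance_maps(image, endmembers):
    """Unconstrained least-squares unmixing.
    image: (pixels, bands); endmembers: (n_end, bands).
    Returns per-pixel abundance fractions, shape (pixels, n_end)."""
    A, _, _, _ = np.linalg.lstsq(endmembers.T, image.T, rcond=None)
    return A.T

def change_map(img_t1, img_t2, endmembers, threshold=0.3):
    """Flag pixels whose largest abundance difference exceeds the threshold."""
    d = np.abs(abundance_maps(img_t1, endmembers)
               - abundance_maps(img_t2, endmembers))
    return d.max(axis=1) > threshold
```

Because the same endmember set is used to unmix both dates, the abundance difference at an unchanged pixel is near zero regardless of the material it covers.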
Jaewan Choi, Junho Yeom, Anjin Chang, Younggi Byun, Yongil Kim. IEEE, 2013. IEEE Geoscience and Remote Sensing Letters Vol.10 No.3
Most pansharpened images from existing algorithms present a trade-off between spectral preservation and spatial enhancement. In this letter, we developed a hybrid pansharpening algorithm based on primary and secondary high-frequency information injection to efficiently improve the spatial quality of the pansharpened image. The injected high-frequency information in our algorithm is composed of two types of data: the difference between the panchromatic and intensity images, and the Laplacian-filtered image of that high-frequency information. The extracted high frequencies are injected into the multispectral image using a local adaptive fusion parameter and postprocessing of the fusion parameter. In experiments using various satellite images, our results show better spatial quality than those of other fusion algorithms while maintaining as much spectral information as possible.
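The two-stage injection scheme in the abstract above can be sketched as follows. This is a simplified sketch, not the published algorithm: the paper's local adaptive fusion parameter and its postprocessing are replaced here by global scalars `alpha` and `beta`, and the intensity weights default to a uniform average.

```python
import numpy as np
from scipy.ndimage import laplace  # Laplacian filter

def hybrid_pansharpen(ms, pan, w=None, alpha=1.0, beta=0.2):
    """ms: (bands, H, W) multispectral image upsampled to pan resolution;
    pan: (H, W) panchromatic image.
    Primary detail: pan minus the weighted intensity image.
    Secondary detail: Laplacian-filtered primary detail (edge emphasis)."""
    if w is None:
        w = np.full(ms.shape[0], 1.0 / ms.shape[0])  # uniform band weights
    intensity = np.tensordot(w, ms, axes=1)   # (H, W) intensity image
    primary = pan - intensity                 # primary high-frequency detail
    secondary = -laplace(primary)             # secondary high-frequency detail
    return ms + alpha * primary + beta * secondary
```

With `beta=0` this reduces to a standard intensity-substitution injection; the second term adds back edge energy that the primary difference alone tends to blur.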
Yeji Kim, Jaewan Choi, Anjin Chang, Yongil Kim. Korean Society of Surveying, 2015. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography Vol.33 No.3
The analysis of remote sensing data depends on sensor specifications that provide accurate and consistent measurements. However, it is not easy to establish confidence and consistency in data that are analyzed by different sensors using various radiometric scales. For this reason, the cross-calibration method is used to calibrate remote sensing data with reference image data. In this study, we used an airborne hyperspectral image in order to calibrate a multispectral image. We presented an automatic cross-calibration method to calibrate a multispectral image using hyperspectral data and spectral mixture analysis. The spectral characteristics of the multispectral image were adjusted by linear regression analysis. Optimal endmember sets between two images were estimated by spectral mixture analysis for the linear regression analysis, and bands of hyperspectral image were aggregated based on the spectral response function of the two images. The results were evaluated by comparing the Root Mean Square Error (RMSE), the Spectral Angle Mapper (SAM), and average percentage differences. The results of this study showed that the proposed method corrected the spectral information in the multispectral data by using hyperspectral data, and its performance was similar to the manual cross-calibration. The proposed method demonstrated the possibility of automatic cross-calibration based on spectral mixture analysis.
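The band-aggregation and regression steps in this abstract can be sketched as below. This is an illustrative sketch only: the spectral response function `srf` is a made-up weight matrix (rows assumed to sum to 1), and the per-band linear fit stands in for the paper's regression over endmember sets found by spectral mixture analysis.

```python
import numpy as np

def aggregate_bands(hyper, srf):
    """Simulate multispectral bands from hyperspectral ones.
    hyper: (n_hyper_bands, pixels); srf: (n_ms_bands, n_hyper_bands)
    rows of spectral response weights."""
    return srf @ hyper

def cross_calibrate(ms_band, ref_band):
    """Fit ref = gain * ms + offset for one band by linear regression."""
    gain, offset = np.polyfit(ms_band, ref_band, 1)
    return gain, offset
```

Applying `gain * ms_band + offset` then adjusts the multispectral band onto the radiometric scale of the aggregated hyperspectral reference.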
Automated Coregistration of Multisensor Orthophotos Generated from Unmanned Aerial Vehicle Platforms
Youkyung Han, Jaewan Choi, Jinha Jung, Anjin Chang, Sungchan Oh, Junho Yeom. Hindawi Limited, 2019. Journal of Sensors Vol.2019
Image coregistration is a key preprocessing step to ensure the effective application of very-high-resolution (VHR) orthophotos generated from multisensor images acquired from unmanned aerial vehicle (UAV) platforms. The most accurate way to align an orthophoto is to install air-photo targets at the test site before flight image acquisition and use them as ground control points (GCPs) for georeferencing and georectification. However, installing the targets and conducting field surveys on them during every flight is costly and time consuming. To address this problem, this paper presents an automated coregistration approach for orthophotos generated from VHR images acquired by multiple sensors mounted on UAV platforms. Spatial information from the orthophotos, provided by the global navigation satellite system (GNSS) at each image's acquisition time, is used as ancillary information for phase correlation-based coregistration. A transformation function between the multisensor orthophotos is then estimated from conjugate points (CPs), which are extracted locally over the orthophotos using the phase correlation approach. Two multisensor datasets were constructed to evaluate the proposed approach, and visual and quantitative evaluations confirm the superiority of the proposed method.
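The phase correlation step used to extract conjugate points can be sketched for a single patch pair. This is a textbook integer-shift version, not the paper's full pipeline: it handles pure translation only, with no GNSS ancillary information and no subpixel refinement.

```python
import numpy as np

def phase_correlation_shift(ref, target):
    """Estimate the integer (dy, dx) translation between two patches via the
    normalized cross-power spectrum. Returns the shift that, applied to
    `target` with np.roll, aligns it with `ref`."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(target)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12           # keep phase only
    corr = np.fft.ifft2(cross).real          # impulse at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the patch size to negative values
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```

Running this on many local patches yields a set of conjugate-point offsets from which a global transformation function can be fitted, in the spirit of the approach described above.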