RISS Academic Research Information Service

      • LoRa-based video data transmission for real-time monitoring of pig farm

        ( Nasim Reza ),( Shahriar Ahmed ),( Sumaiya Islam ),( Shaha Nur Kabir ),( Minho Song ),( Gookhwan Kim ),( Sun-ok Chung ) 한국농업기계학회 2023 한국농업기계학회 학술발표논문집 Vol.28 No.1

        This paper proposed a LoRa-based video data transmission system for real-time monitoring in a pig farm. The approach eliminates the need for complex and costly infrastructure, making it a cost-effective solution for real-time monitoring in pig farms. The system architecture included a Raspberry Pi 4B microcontroller, RGB cameras, LoRa transceivers, a gateway, and a cloud-based platform for data analysis and visualization. The video data were captured using the RGB cameras and stored in external memory through the microcontroller. The video was then segmented into small chunks and compressed with the H.265 codec, which reduced the size of the video data and made it easier to transmit over LoRa. Each compressed video chunk was then sent by the LoRa transceiver at a low data rate and low transmit power, which allows the transmission to reach long distances while consuming very little power. At the receiving end, the video chunks were received by another LoRa transceiver and reassembled into the original video stream. The system performance was evaluated through a series of tests covering transmission range, video quality, and power consumption. The results showed that the LoRa-based system could transmit video data over a long range (2 km) with low power consumption (less than 1 W) while maintaining good video quality (720p resolution). The findings show great potential for real-time monitoring in pig farms, providing valuable insights into the pigs' behavior, health, and productivity.
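
        The chunk-and-compress step described above can be illustrated with a short Python sketch; the file names, the H.265 encoding via the ffmpeg command line, and the 200-byte payload budget are assumptions for illustration, not values reported in the paper.

        ```python
        import subprocess
        from pathlib import Path

        PAYLOAD_BYTES = 200           # assumed LoRa payload budget per packet
        SOURCE = Path("clip.mp4")     # hypothetical raw clip from the RGB camera
        COMPRESSED = Path("clip_h265.mp4")

        # Re-encode with the H.265 (HEVC) codec to shrink the clip before transmission.
        subprocess.run(
            ["ffmpeg", "-y", "-i", str(SOURCE), "-c:v", "libx265", "-crf", "32", str(COMPRESSED)],
            check=True,
        )

        def iter_chunks(path: Path, size: int):
            """Yield fixed-size byte chunks of the compressed video file."""
            data = path.read_bytes()
            for offset in range(0, len(data), size):
                yield data[offset:offset + size]

        # Each chunk would be handed to the LoRa transceiver driver here; counting
        # the packets stands in for the actual radio call.
        chunks = list(iter_chunks(COMPRESSED, PAYLOAD_BYTES))
        print(f"{len(chunks)} packets to transmit")
        ```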

      • Rice yield estimation based on K-means clustering with graph-cut segmentation using low-altitude UAV images

        Reza, Md Nasim,Na, In Seop,Baek, Sun Wook,Lee, Kyeong-Hwan Elsevier 2019 Biosystems engineering Vol.177 No.-

        Predicting the harvest yield enables farm practices to be modified throughout the growing season, with potential to increase the final yield. Unmanned aerial vehicle (UAV) based remote sensing is a promising way to estimate crop yields. In this study, rice yield was estimated by segmenting grain areas using low altitude RGB images collected using a rotary-wing type UAV. In particular, an image processing method that combines K-means clustering with a graph-cut (KCG) algorithm was proposed to segment the rice grain areas. The graph-cut algorithm was applied to extract the foreground and background of the images. The foreground RGB images were converted to the Lab colour space and then K-means clustering was used to label pixels based on colour information. The area of the rice grains in the images was calculated from the clustered images. Using this grain area information, the rice yield of the field could be estimated. Experiments show that the proposed method can segment the grain areas with a relative error of 6%–33%, and it improved the relative error of the previous method (by 1%–31%). The coefficient of determination between the results of the proposed method and the ground truth was found to be 0.98. Furthermore, the relative error of the yield estimation for four field sections was 21%–31%. The results indicate that the UAV image-based grain segmentation has the potential to estimate rice yield accurately and conveniently.

        Highlights:
        • Rice yield estimation based on segmented rice grain area using UAV images.
        • Improve the accuracy of segmented rice grain area by using estimated foreground objects.
        • Estimate the rice yield during the whole growing season.
        • Measure the optimal estimation stage for rice yield in the lifecycle of the rice plant.
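
        A minimal sketch of the KCG pipeline, assuming OpenCV's GrabCut as the graph-cut step and illustrative values for the image file, the initial rectangle, and the number of clusters (the paper does not specify its exact implementation):

        ```python
        import cv2
        import numpy as np

        img = cv2.imread("uav_plot.jpg")   # hypothetical UAV image of a plot

        # 1) Graph-cut (GrabCut) to separate foreground from background.
        mask = np.zeros(img.shape[:2], np.uint8)
        rect = (10, 10, img.shape[1] - 20, img.shape[0] - 20)   # assumed initial box
        bgd = np.zeros((1, 65), np.float64)
        fgd = np.zeros((1, 65), np.float64)
        cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
        fg = ((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)).astype(np.uint8)

        # 2) K-means on the a*/b* channels of the foreground pixels (K = 3 assumed).
        lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
        ab = lab[fg == 1][:, 1:3].astype(np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
        _, labels, centers = cv2.kmeans(ab, 3, None, criteria, 10, cv2.KMEANS_PP_CENTERS)

        # 3) Grain area = pixel count of the cluster taken to be grain; cluster 0 is
        # assumed here, whereas in practice the grain cluster is picked by its colour centre.
        grain_pixels = int(np.sum(labels.ravel() == 0))
        print("grain area (pixels):", grain_pixels)
        ```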

      • KCI-listed

        Automatic Counting of Rice Plant Numbers After Transplanting Using Low Altitude UAV Images

        Reza, Md Nasim,Na, In Seop,Lee, Kyeong-Hwan The Korea Contents Association 2017 International Journal of Contents Vol.13 No.3

        Rice plant numbers and density are key factors for the yield and quality of rice grains. Precisely estimated rice plant numbers and density can assure high yield from rice fields. The main objective of this study was to automatically detect and count rice plants using images of usual field conditions from an unmanned aerial vehicle (UAV). We proposed an automatic image processing method based on morphological operations and the boundaries of connected components to count rice plant numbers after transplanting. We converted RGB images to binary images and applied an adaptive median filter to remove distortion and noise. We then applied a morphological operation to the binary image and drew boundaries around the connected components to count rice plants in those images. The results reveal that the algorithm achieves an F-measure of 89%, corresponding to a precision of 87% and a recall of 91%. The best-fit image gives an F-measure of 93%, corresponding to a precision of 91% and a recall of 96%. Comparison between the numbers of rice plants detected and counted by the naked eye and the numbers found by the proposed method provided viable and acceptable results. The $R^2$ value was approximately 0.893.
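
        The core counting steps can be sketched roughly as below, assuming OpenCV and an illustrative binarisation rule, kernel size, and image path (the paper's exact thresholds are not given in the abstract):

        ```python
        import cv2
        import numpy as np

        img = cv2.imread("paddy_uav.jpg")   # hypothetical UAV frame

        # Simple stand-in for the RGB-to-binary step: keep green-dominant pixels.
        b = img[:, :, 0].astype(np.int16)
        g = img[:, :, 1].astype(np.int16)
        r = img[:, :, 2].astype(np.int16)
        binary = ((g > r) & (g > b)).astype(np.uint8) * 255

        # Median filter to suppress noise, then a morphological opening to remove specks.
        binary = cv2.medianBlur(binary, 5)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

        # Connected-component labelling: each remaining blob is counted as one plant.
        num_labels, _ = cv2.connectedComponents(opened)
        print("estimated plant count:", num_labels - 1)   # label 0 is the background
        ```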

      • Geo-located Positioning and Counting of Rice Plants in UAV images

        ( Nasim Reza ),( Xu-hua Dong ),( Sang-eon Oh ),( Kyeonghwan Lee ) 한국농업기계학회 2019 한국농업기계학회 학술발표논문집 Vol.24 No.2

        Rice plant density in the field has a great impact on the final yield and grain quality. Precise and timely estimates of plant position and count enable better farming practices to assure high yield. However, plant positioning and counting are very challenging, time-consuming, and often labor-intensive. Unmanned aerial vehicle (UAV) based imaging provides a fast and accurate way to determine plant position and count. The objective of this study was to evaluate geo-located positioning and counting of rice plants using low-altitude UAV images. We used three Sony alpha 5100L digital cameras attached to a DJI S1000 octocopter drone. RGB images were taken from an altitude of 8 m and used for automated feature extraction and matching to generate an orthomosaic image. We designed an algorithm to identify each rice plant's position with coordinates so that it could be monitored during the growing period. We calculated the ground sampling distance of the orthomosaic image and the pixel size in each image. Then, we calculated the distance of each plant from the reference position. Finally, we calculated coordinates for each plant using the distance information and verified the positions. The results showed high accuracy of plant positions and counts. The proposed method indicated that plant positioning using aerial images could be an ideal technique to monitor each crop throughout the growing season.
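
        The positioning arithmetic implied above can be sketched as follows; the sensor width, focal length, image width, pixel offsets, and reference GPS point are illustrative assumptions, and only the 8 m altitude comes from the abstract:

        ```python
        import math

        SENSOR_WIDTH_MM = 23.5      # Sony a5100 APS-C sensor width (approximate, assumed)
        FOCAL_MM = 20.0             # assumed lens focal length
        IMAGE_WIDTH_PX = 6000       # assumed image width in pixels
        ALTITUDE_M = 8.0            # flight altitude reported in the abstract

        # Ground sampling distance: metres of ground covered by one image pixel.
        gsd_m = (SENSOR_WIDTH_MM * ALTITUDE_M) / (FOCAL_MM * IMAGE_WIDTH_PX)

        def plant_coordinate(ref_lat, ref_lon, dx_px, dy_px):
            """Offset a reference WGS84 point by a pixel displacement (east, north)."""
            d_east_m = dx_px * gsd_m
            d_north_m = dy_px * gsd_m
            # Flat-earth metre-to-degree conversion, adequate over a single paddy field.
            lat = ref_lat + d_north_m / 111_320.0
            lon = ref_lon + d_east_m / (111_320.0 * math.cos(math.radians(ref_lat)))
            return lat, lon

        ref = (35.175, 126.905)     # hypothetical reference GPS point in the field
        print(f"GSD: {gsd_m * 100:.2f} cm/pixel")
        print("plant position:", plant_coordinate(*ref, dx_px=1200, dy_px=-850))
        ```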

      • Soil water content distribution mapping using an automatic monitoring system

        ( Nasim Reza ),( Eliezel Habineza ),( Rejaul Karim ),( Mohammod Ali ),( Shaha Nur Kabir ),( Young Yoon Jang ),( Sun-ok Chung ) 한국농업기계학회 2023 한국농업기계학회 학술발표논문집 Vol.28 No.1

        Soil water content plays a crucial role in plant growth, irrigation scheduling, and soil erosion prediction. Automatic sensor-based monitoring systems have emerged as efficient tools for continuous soil water content mapping, in contrast to the traditional method, which is time-consuming and limited to single-point measurements. This study aimed to develop a sensor-based monitoring system for real-time mapping of the soil water content distribution. To assess the variability of soil water content in sandy soil, a 3 m by 3 m soil test bin was constructed. The system consisted of a series of sensors (SEN0193) installed at depths ranging from 0 to 60 cm. The monitoring system was equipped with wireless transmission technology using an Arduino Mega 2560 and a Raspberry Pi 4B microcontroller. The water content sensors were placed at predetermined locations, and their geographic coordinates were obtained using GPS. The microcontroller collected data from the sensors, which were then evaluated using GIS to prepare a map of the soil moisture distribution. The results suggested that the monitoring system has the potential to revolutionize soil water content mapping and monitoring. The system can provide valuable insights into the spatial and temporal variations of soil water content, which can inform irrigation scheduling, crop management, and soil conservation practices.
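
        As a rough illustration of how point readings become a distribution map, the sketch below uses inverse-distance weighting on made-up sensor positions and values; the study itself performed the mapping in GIS, so this is a stand-in, not the authors' method:

        ```python
        import numpy as np

        # (x_m, y_m, volumetric water content %) for sensors in the 3 m x 3 m test bin;
        # positions and readings are made up for illustration.
        readings = np.array([
            [0.5, 0.5, 18.2],
            [2.5, 0.5, 21.7],
            [0.5, 2.5, 16.4],
            [2.5, 2.5, 24.1],
        ])

        def idw(x, y, pts, power=2.0):
            """Inverse-distance-weighted estimate of water content at (x, y)."""
            d = np.hypot(pts[:, 0] - x, pts[:, 1] - y)
            if np.any(d < 1e-6):
                return float(pts[np.argmin(d), 2])
            w = 1.0 / d ** power
            return float(np.sum(w * pts[:, 2]) / np.sum(w))

        # Evaluate on a coarse grid covering the bin to form a simple moisture map.
        grid = np.array([[idw(x, y, readings) for x in np.linspace(0, 3, 7)]
                         for y in np.linspace(0, 3, 7)])
        print(np.round(grid, 1))
        ```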

      • Analysis of Rice Grain and Morphological Characteristics in 3D Model Generated from Multi-Camera UAV Images

        ( Md Nasim Reza ),( Sunwook Baek ),( Sang-eon Oh ),( Kyeonghwan Lee ) 한국농업기계학회 2019 한국농업기계학회 학술발표논문집 Vol.24 No.1

        Advanced crop management is an important issue in the field of precision agriculture. Rice yield estimation during the growing season is a significant indicator, and plant parameters such as leaf area index, height, stem diameter, and biomass are highly related to yield when analyzing the impact of crop management practices. Traditional approaches to obtaining these parameters from a rice field are very challenging, laborious, and time-consuming. 3D geometric information based on UAV imagery of rice plants can be a relevant way to estimate these valuable parameters. The objective of this study was to evaluate a 3D model generated from low-altitude UAV multi-camera images to analyze rice grain and estimate plant height and biomass. We flew a DJI S1000 octocopter with a camera bracket beneath it. The bracket was designed to hold three Sony alpha a5100 digital cameras, one with a nadir view and two with oblique views of the ground. RGB images were taken from a height of 10 m using automated flight along a predesigned flight path. These images were processed into 3D point clouds, which were subsequently used to generate a 3D model of the rice plants. We designed an algorithm to recognize grain and measure height and biomass from the 3D model. Classification of the ground and vegetation parts was performed to compute vegetation height. To identify rice grains in panicles, we applied regionprops and object-feature segmentation. Then, we applied 3D surface plotting and voxelization to measure the biomass volume. The proposed method showed a strong correlation between the observed and actual measurements of rice grain count, height, and biomass. Multi-camera imaging was shown to be effective for 3D modelling and estimating morphological characteristics of the rice plant.
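
        The voxelization step used for biomass volume can be illustrated as below; the point cloud is randomly generated and the 2 cm voxel size is an assumption, since the abstract does not report these details:

        ```python
        import numpy as np

        # Stand-in vegetation point cloud (x, y, z in metres); in practice this would
        # be the vegetation part of the photogrammetric 3D model.
        points = np.random.rand(50_000, 3) * [1.0, 1.0, 0.8]
        VOXEL = 0.02                       # assumed 2 cm voxel edge length

        # Map each point to a voxel index and keep the unique occupied voxels.
        voxel_idx = np.floor(points / VOXEL).astype(np.int64)
        occupied = np.unique(voxel_idx, axis=0)

        # Volume estimate = number of occupied voxels times the volume of one voxel.
        biomass_volume_m3 = occupied.shape[0] * VOXEL ** 3
        print(f"occupied voxels: {occupied.shape[0]}, volume ~ {biomass_volume_m3:.4f} m^3")
        ```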

      • Lab Color Space based Rice Yield Prediction using Low Altitude UAV Field Image

        ( Md Nasim Reza ),( Inseop Na ),( Sunwook Baek ),( In Lee ),( Kyeonghwan Lee ) 한국농업기계학회 2017 한국농업기계학회 학술발표논문집 Vol.22 No.1

        Prediction of rice yield during the growing season would be very helpful, as it allows better farm practices that maximize yield with greater profit and lower costs. UAV-imagery-based automatic detection of rice can be a relevant approach for early yield prediction, so we propose an image processing technique to predict rice yield using low-altitude UAV images. We propose an L*a*b* color-space-based image segmentation algorithm. All images were captured using a UAV-mounted RGB camera. The proposed algorithm was developed to separate the rice grain area from the image background. We took an RGB image, applied a filter to remove noise, and converted the RGB image to the L*a*b* color space. All color information is contained in the a* and b* layers, and k-means clustering was used to classify these colors. The variation between two colors can be measured, and pixel labelling was completed using the cluster index. The image was finally segmented by color. The proposed method showed that rice grain could be segmented and that rice grains can be recognized in the UAV images. By analyzing the grain areas and estimating area and volume, rice yield can be predicted.
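
        The final area-to-yield step is only implied by the abstract; the sketch below shows one hedged way it could be done, with an assumed ground sampling distance and an assumed yield-per-grain-area coefficient, neither of which comes from the paper:

        ```python
        # All three constants below are illustrative assumptions.
        GSD_CM_PER_PX = 0.4        # assumed ground sampling distance (cm per pixel)
        grain_pixels = 2_450_000   # pixels labelled as grain by the Lab/k-means segmentation
        YIELD_G_PER_CM2 = 0.035    # assumed grams of rough rice per cm^2 of projected grain area

        # Convert the segmented pixel count to ground area, then to a yield estimate.
        grain_area_cm2 = grain_pixels * GSD_CM_PER_PX ** 2
        estimated_yield_kg = grain_area_cm2 * YIELD_G_PER_CM2 / 1000.0
        print(f"grain area: {grain_area_cm2 / 1e4:.1f} m^2, "
              f"estimated yield: {estimated_yield_kg:.1f} kg")
        ```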

      • Real-time sound monitoring using 2D convolutional neural network (CNN) for pig diseases symptoms detection in pig farm

        레자나심 ( Nasim Reza ),하케아스라쿨 ( Asrakul Haque ),카림레자울 ( Rejaul Karim ),송민호 ( Minho Song ),김국환 ( Gookhwan Kim ),정선옥 ( Sun-ok Chung ) 한국농업기계학회 2022 한국농업기계학회 학술발표논문집 Vol.27 No.2

        Monitoring and preventing diseases in livestock is essential for modern farming, and an early warning system may significantly reduce the economic impact of disease events. Manual monitoring of pigs in a pig farm is time-consuming and labor-intensive, while automatic monitoring for pig diseases may help to control the spread of infections. The purpose of this research was to identify signs of sickness in pigs using acoustic monitoring in real time. Two microphones were installed in the pig farm for automatic sound acquisition. The sound signals were converted into spectrograms by the fast Fourier transform (FFT), with mel-frequency cepstral coefficients (MFCC) used as the characteristic parameters. Using a 2D convolutional neural network (CNN) and features extracted from the spectrogram, we present a classification approach suitable for real-time application. Converting the sound inputs into spectrograms made it possible to recognize them with the CNN. For real-time detection, the proposed algorithm recognized the sounds of coughing, sneezing, screaming, and crushing with recognition accuracies of 75.8%, 69.5%, 72.4%, and 71.6%, respectively, and an average F1-score of 81.2%. Future work is needed to enhance the robustness of sound detection.
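
        A minimal sketch of the MFCC-plus-2D-CNN idea, with an untrained placeholder network and a hypothetical clip path (the paper does not publish its architecture):

        ```python
        import librosa
        import torch
        import torch.nn as nn

        CLASSES = ["cough", "sneeze", "scream", "crushing"]   # classes named in the abstract

        # Load a short clip (hypothetical path) and compute MFCC features.
        y, sr = librosa.load("pig_clip.wav", sr=16000)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)               # (40, frames)
        x = torch.tensor(mfcc, dtype=torch.float32)[None, None, :, :]    # (batch, ch, 40, frames)

        # Small, untrained 2D CNN; layer sizes are placeholders, not the paper's network.
        model = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, len(CLASSES)),
        )

        with torch.no_grad():
            probs = model(x).softmax(dim=1)[0]
        print({c: round(p, 3) for c, p in zip(CLASSES, probs.tolist())})
        ```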

      • Basic performance test for sound detection and remote monitoring in Pig Farm

        레자나심 ( Nasim Reza ),초두리밀론 ( Milon Chowdhury ),키라가샤피크 ( Shafik Kiraga ),정선옥 ( Sun-ok Chung ) 한국농업기계학회 2021 한국농업기계학회 학술발표논문집 Vol.26 No.2

        Precision livestock farming is an intelligent technology which allows closer monitoring of each animal on the farm. Sound-based precision farming provides considerable benefits compared with other technologies, such as imaging sensors and motion sensors. In addition, sound sensors are inexpensive, require no direct contact, and allow a large number of animals to be observed with a single sensor. The objective of this study was to investigate a remotely monitored sound detection and imaging system in a pig farm for early detection of respiratory diseases. Three microphones and three RGB cameras with three microcontrollers were used to receive the sound and image data in the pig farm. A total of 30 pigs were covered by the surveillance system. A sound analysis algorithm was developed to record the sounds received by the microphones and distinguish pig sounds from outside noise. The sounds were then processed by the algorithm to detect abnormal pig sounds. The images were synchronized and used to monitor unwanted movement and behaviour. High-, medium-, and low-frequency sounds were detected. The results showed that the detection efficiency was around 85% for high-frequency sounds and 73% for low-frequency sounds. Moreover, the movement of the pigs was also monitored through the images. This study suggests that it is feasible to recognize early respiratory illness in pigs through automated and sequential monitoring of sounds and images within the pig farm.
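
        The high/medium/low frequency screening can be illustrated with a short FFT-based sketch; the band edges, sampling rate, and the synthetic frame are assumptions, not the study's thresholds:

        ```python
        import numpy as np

        SR = 16000                        # assumed sampling rate (Hz)
        frame = np.random.randn(SR // 2)  # stand-in for a 0.5 s microphone frame

        # Magnitude spectrum of the frame and the frequency of each bin.
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(frame.size, d=1.0 / SR)

        # Assumed band edges in Hz; the study does not report its exact cut-offs.
        bands = {"low": (0, 500), "medium": (500, 2000), "high": (2000, 8000)}
        energy = {name: float(np.sum(spectrum[(freqs >= lo) & (freqs < hi)] ** 2))
                  for name, (lo, hi) in bands.items()}

        # Label the frame by the band that carries the most spectral energy.
        label = max(energy, key=energy.get)
        print("dominant band:", label, {k: round(v, 1) for k, v in energy.items()})
        ```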

      • Recognition and detection of pig posture based on instance segmentation and computer vision in pig farm

        레자나심 ( Nasim Reza ),카비르사자둘 ( Sazzadul Kabir ),하케아스라쿨 ( Asrakul Haque ),정선옥 ( Sun-ok Chung ) 한국농업기계학회 2022 한국농업기계학회 학술발표논문집 Vol.27 No.1

        Pig posture changes throughout the growing period are most often indicators of illness. Monitoring pigs' postural movements enables us to identify morphological changes early and to detect potential risk factors for pig health. Large-scale pig farming requires extensive manual monitoring by pig farmers, which is time-consuming and laborious. Computer-vision-based monitoring of posture activities over time may help to limit the spread of disease infections. The objective of this study was to recognize and detect pig posture using mask-based instance segmentation in the pig farm. Two automatic video acquisition systems were installed, providing top and side views, respectively. RGB images were extracted from the RGB video files and used for annotation. Manual annotations of 200 images were prepared as the training dataset, covering three postures: standing, lying, and eating from a bin. An instance segmentation framework was employed to recognize and detect pig posture. A region proposal network is used in the first stage of the Mask R-CNN based instance segmentation procedure; the second stage obtains features from the candidate boxes using RoIPool and conducts classification and bounding-box regression. The proposed method was evaluated on a test image dataset, and the experimental results showed that the framework obtained an F1 score of 0.911. Our work investigates a new way of recognizing and detecting pig posture in the pig farm, which enables useful research into vision-based, real-time, automated pig monitoring and disease assessment.
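
        A minimal inference sketch for the Mask R-CNN stage, using torchvision's pretrained COCO weights as a stand-in for the study's fine-tuned model, a hypothetical image path, and an assumed confidence threshold:

        ```python
        import torch
        import torchvision
        from torchvision.transforms.functional import to_tensor
        from PIL import Image

        # Pretrained COCO weights as a placeholder; the study fine-tuned on its own
        # standing/lying/eating annotations.
        model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
        model.eval()

        img = to_tensor(Image.open("pen_top_view.jpg").convert("RGB"))  # hypothetical frame

        with torch.no_grad():
            out = model([img])[0]          # dict with boxes, labels, scores, masks

        keep = out["scores"] > 0.7         # assumed confidence threshold
        print("instances kept:", int(keep.sum()))
        print("mask tensor shape:", tuple(out["masks"][keep].shape))   # (N, 1, H, W)
        ```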
