Pine wilt disease is a tree disease that is spreading rapidly across Korea, centered on the southeastern region. This study compared ground-based field surveys with drone-based aerial photographic surveys to assess the efficiency and accuracy of aerial surveillance for pine wilt disease. To distinguish pine trees infected with the pine wood nematode from healthy trees, low-altitude aerial photographs were taken at pine wilt disease outbreak sites in Pohang and Miryang, and their RGB values, spectral vegetation indices, and M-statistic values were compared. The red (R) value of the aerial photographs was highest in infected trees, whereas the green (G) value was highest in healthy trees; among the RGB values, the red (R) value showed the largest difference between infected and healthy trees, and this difference was statistically significant. The RGB-based spectral vegetation indices Excess Red (ExR), R−G, Color Index of Vegetation (CIVE), and Woebbecke Index (WI) also differed significantly between infected and healthy trees. The M-statistic values of these indices exceeded the threshold of 1 for ExR, R−G, CIVE, and WI in both Pohang (2.3–3.1) and Miryang (2.6–3.3). These results indicate that infected and healthy trees can be distinguished by analyzing the RGB values and spectral vegetation indices of low-altitude drone imagery, and that they can serve as baseline data for aerial surveillance of pine wilt disease.
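The index and separability computations described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the commonly cited definitions ExR = 1.4r − g, CIVE = 0.441r − 0.811g + 0.385b + 18.78745, WI = (g − b)/(r − g) on normalized chromatic coordinates, and the separability measure M = |μ₁ − μ₂| / (σ₁ + σ₂), where M > 1 is conventionally taken as good class separation; all function names are illustrative.

```python
import numpy as np

def chromatic_coords(R, G, B):
    """Normalized chromatic coordinates r, g, b from raw RGB values."""
    total = R + G + B
    return R / total, G / total, B / total

def excess_red(r, g):
    # ExR = 1.4r - g (commonly cited excess-red form; assumption)
    return 1.4 * r - g

def cive(r, g, b):
    # Color Index of Vegetation (commonly cited coefficients; assumption)
    return 0.441 * r - 0.811 * g + 0.385 * b + 18.78745

def woebbecke(r, g, b):
    # WI = (g - b) / (r - g); undefined where r == g
    return (g - b) / (r - g)

def m_statistic(x, y):
    """Separability of two index distributions:
    M = |mean(x) - mean(y)| / (std(x) + std(y)); M > 1 => well separated."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return abs(x.mean() - y.mean()) / (x.std() + y.std())
```

For example, index values computed per tree crown for the infected and healthy groups would be passed to `m_statistic` to reproduce the kind of 2.3–3.3 separability values reported above.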
Ecologically disturbing plant species distributed throughout Korea cause considerable ecological, economic, and health damage, both directly and indirectly. These plants are difficult to manage and remove because of their strong reproductive capacity, and their spread is hard to quantify. In this study, drone-borne hyperspectral sensor data and field spectroradiometer measurements were acquired over the experimental area. To ensure the quality of the drone hyperspectral imagery, a GPS survey was performed, achieving a positional accuracy of about 17 cm. A spectral library was constructed for seven plant species in the experimental area using a field spectroradiometer, and drone hyperspectral images were acquired in August and October. Spectral data for each plant were extracted from the acquired hyperspectral imagery, and spectral angles of 0.08 to 0.36 were derived. In most cases, values below 0.5, indicating good spectral matches, were obtained, and Ambrosia trifida and Lactuca scariola, which are common in the experimental area, were extracted. The results showed that the distributions of Ambrosia trifida and Lactuca scariola expanded by about 29.6% and 31.5%, respectively, from August to October. In the future, better detection of ecologically disturbing plants is expected if standardized indicators are derived by constructing a precise spectral-angle reference library based on more data.
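The spectral-angle matching described above can be sketched as follows. This is a minimal illustration under the standard spectral angle mapper (SAM) definition, θ = arccos(x·y / (‖x‖‖y‖)), which is consistent with the radian-scale values (0.08–0.36) reported in the abstract; the function names and the library-matching helper are illustrative, not the authors' implementation.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference
    library spectrum; smaller values mean a closer spectral match."""
    pixel = np.asarray(pixel, float)
    reference = np.asarray(reference, float)
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    # Clip to guard against tiny floating-point excursions outside [-1, 1].
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def best_match(pixel, library, threshold=0.5):
    """Return the library species whose spectrum is closest in angle to the
    pixel, or None if no angle falls below the threshold (0.5 here, per the
    abstract's criterion for a good match)."""
    angles = {name: spectral_angle(pixel, ref) for name, ref in library.items()}
    name = min(angles, key=angles.get)
    return (name, angles[name]) if angles[name] < threshold else (None, angles[name])
```

In practice, `library` would hold the field-spectroradiometer spectra for the seven species, and `best_match` would be applied per pixel of the August and October hyperspectral mosaics to map each species' extent.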
As drones gain popularity, drone detection is becoming an increasingly important part of drone systems for safety, privacy, and crime prevention. However, existing drone detection systems are expensive and heavy, making them suitable only for industrial or military use. This paper proposes a novel approach for training convolutional neural networks (CNNs) to detect drones in images that is suitable for deployment on embedded systems. Unlike previous works that consider the class probability only of the image regions where the target object exists, the proposed approach takes all regions of the image into account for robust classification and object detection. Moreover, a novel loss function is proposed so that the CNN can learn more effectively from a limited amount of training data. Experimental results with various drone images show that the proposed approach performs efficiently in real drone detection scenarios.