
Identifying epiphytes in drones photos with a conditional generative adversarial network (C-GAN)

Publication Type : Conference Paper

Publisher : ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences

Source : ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, p.99-104 (2020)

Url : https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLIV-M-2-2020/99/2020/

Keywords : CNN, Image translation, PIX2PIX, Segmentation, UAV, UNET

Campus : Coimbatore

School : School of Engineering

Center : Computational Engineering and Networking

Department : Electronics and Communication

Year : 2020

Abstract : Unmanned Aerial Vehicle (UAV) missions often collect large volumes of imagery data. However, not all images will contain useful information or be of sufficient quality. Manually sorting these images and selecting useful data are both time consuming and prone to interpreter bias. Deep neural network algorithms are capable of processing large image datasets and can be trained to identify specific targets. Generative Adversarial Networks (GANs) consist of two competing networks, a generator and a discriminator, that can analyze, capture, and copy the variations within a given dataset. In this study, we selected a variant of GAN called Conditional-GAN, which incorporates an additional label parameter, for identifying epiphytes in photos acquired by a UAV in forests within Costa Rica. We trained the network with 70%, 80%, and 90% of 119 photos containing the target epiphyte, Werauhia kupperiana (Bromeliaceae), and validated the algorithm's performance using validation data that were not used for training. The accuracy of the output was measured using the structural similarity index measure (SSIM) and the histogram correlation (HC) coefficient. Results obtained in this study indicated that the output images generated by C-GAN were similar (average SSIM = 0.89–0.91 and average HC = 0.97–0.99) to the analyst-annotated images. However, C-GAN had difficulty identifying the target plant when it was far from the camera, poorly lit, or covered by other plants. Results obtained in this study demonstrate the potential of C-GAN to reduce the time spent by botanists to identify epiphytes in images acquired by UAVs.
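
The abstract reports output quality using SSIM and the histogram correlation coefficient. The following is a minimal sketch of how such a comparison between a C-GAN output and an analyst-annotated image could be computed with scikit-image and OpenCV; it is not the authors' code, and the file names and grayscale preprocessing are assumptions for illustration only.

import cv2
from skimage.metrics import structural_similarity as ssim

def score_pair(generated_path, annotated_path):
    # Load both images in grayscale (assumption: single-channel comparison)
    gen = cv2.imread(generated_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(annotated_path, cv2.IMREAD_GRAYSCALE)
    # Match dimensions before comparing
    gen = cv2.resize(gen, (ref.shape[1], ref.shape[0]))

    # Structural similarity index measure (SSIM)
    ssim_score = ssim(ref, gen)

    # Histogram correlation (HC) coefficient
    hist_gen = cv2.calcHist([gen], [0], None, [256], [0, 256])
    hist_ref = cv2.calcHist([ref], [0], None, [256], [0, 256])
    hc_score = cv2.compareHist(hist_ref, hist_gen, cv2.HISTCMP_CORREL)

    return ssim_score, hc_score

if __name__ == "__main__":
    # Hypothetical file names for one generated/annotated pair
    s, h = score_pair("cgan_output_001.png", "annotated_001.png")
    print(f"SSIM: {s:.2f}, HC: {h:.2f}")

Both metrics range up to 1.0 for identical images, which is consistent with the averages (SSIM 0.89–0.91, HC 0.97–0.99) reported in the abstract.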

Cite this Research Publication : Shashank Anivilla, Sajith Variyar V. V., Sowmya V., Dr. Soman K. P., Ramesh Sivanpillai, and Greg Brown, “Identifying epiphytes in drones photos with a conditional generative adversarial network (C-GAN)”, in ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2020, pp. 99-104.
