Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/32250
Appears in Collections:Computing Science and Mathematics Journal Articles
Peer Review Status: Refereed
Title: Semantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery
Author(s): Osco, Lucas Prado
Nogueira, Keiller
Marques Ramos, Ana Paula
Faita Pinheiro, Mayara Maezano
Furuya, Danielle Elis Garcia
Gonçalves, Wesley Nunes
de Castro Jorge, Lucio Andre
Marcato Junior, Jose
dos Santos, Jefersson Alex
Contact Email: keiller.nogueira@stir.ac.uk
Keywords: Convolutional neural network
Remote sensing
Thematic map
Issue Date: 2-Jan-2021
Date Deposited: 5-Feb-2021
Citation: Osco LP, Nogueira K, Marques Ramos AP, Faita Pinheiro MM, Furuya DEG, Gonçalves WN, de Castro Jorge LA, Marcato Junior J & dos Santos JA (2021) Semantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery. Precision Agriculture. https://doi.org/10.1007/s11119-020-09777-5
Abstract: Accurately mapping farmlands is important for precision agriculture practices. Unmanned aerial vehicles (UAVs) embedded with multispectral cameras are commonly used to map plants in agricultural landscapes. However, separating plantation fields from the remaining objects in a multispectral scene is a difficult task for traditional algorithms. Here, deep learning methods that perform semantic segmentation could help improve the overall outcome. In this study, state-of-the-art deep learning methods for the semantic segmentation of citrus trees in multispectral images were evaluated. For this purpose, a multispectral camera operating in the green (530–570 nm), red (640–680 nm), red-edge (730–740 nm) and near-infrared (770–810 nm) spectral regions was used. The performance of the following five state-of-the-art pixelwise methods was evaluated: fully convolutional network (FCN), U-Net, SegNet, dynamic dilated convolution network (DDCN) and DeepLabV3+. The results indicated that the evaluated methods performed similarly on the proposed task, returning F1-Scores between 94.00% (FCN and U-Net) and 94.42% (DDCN). The inference time needed per area was also determined; although the DDCN method was slower, a qualitative analysis showed that it performed better in highly shadow-affected areas. This study demonstrated that the semantic segmentation of citrus orchards is highly achievable with deep neural networks. The state-of-the-art deep learning methods investigated here proved to be equally suitable for this task, providing fast solutions with inference times varying from 0.98 to 4.36 min per hectare. This approach could be incorporated into similar research and contribute to decision-making and the accurate mapping of plantation fields.
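The F1-Scores reported in the abstract are pixelwise: a predicted segmentation mask is compared against a reference mask pixel by pixel. As an illustrative sketch only (not the authors' code), the metric can be computed from two binary masks in plain Python; the toy masks and values below are hypothetical.

```python
# Illustrative sketch: pixelwise F1-Score between a predicted and a
# reference binary mask (1 = citrus-tree pixel, 0 = background).
# The masks below are hypothetical toy data, not the study's imagery.

def pixelwise_f1(pred, truth):
    """F1 = 2*TP / (2*TP + FP + FN), computed over flattened masks."""
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))  # true positives
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, truth))  # false positives
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, truth))  # false negatives
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Toy 4x4 masks flattened to lists
truth = [1, 1, 0, 0,  1, 1, 0, 0,  0, 0, 1, 1,  0, 0, 1, 1]
pred  = [1, 1, 0, 0,  1, 0, 0, 0,  0, 0, 1, 1,  0, 1, 1, 1]

print(round(pixelwise_f1(pred, truth), 4))  # -> 0.875
```

In practice the masks are whole-orthomosaic arrays and the computation is vectorized (e.g. with NumPy or scikit-learn), but the definition of the score is the same.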
DOI Link: 10.1007/s11119-020-09777-5
Rights: This item has been embargoed for a period. During the embargo please use the Request a Copy feature at the foot of the Repository record to request a copy directly from the author. You can only request a copy if you wish to use this work for your own research or private study. This is a post-peer-review, pre-copyedit version of an article published in Precision Agriculture. The final authenticated version is available online at: https://doi.org/10.1007/s11119-020-09777-5
Notes: Output Status: Forthcoming/Available Online

Files in This Item:
File: SemanticSeg_Laranja.pdf
Description: Fulltext - Accepted Version
Size: 13.14 MB
Format: Adobe PDF
Availability: Under Embargo until 2022-01-03 (Request a copy)

Note: If any of the files in this item are currently embargoed, you can request a copy directly from the author by clicking the padlock icon above. However, this facility is dependent on the depositor still being contactable at their original email address.



This item is protected by original copyright

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details and we will remove the Work from public display in STORRE and investigate your claim.