
A sweet success

By Mary Jo Wagner - 26th February 2019

The Lindcove Research and Extension Center (LREC), part of the University of California (UC) Division of Agriculture and Natural Resources, has deep roots in the US state’s citrus industry. Situated where the San Joaquin Valley – home to 75% of the state’s citrus – meets the Sierra Nevada foothills, the 70ha LREC has been a critical link between cutting-edge citrus research and growers since 1959.

In California, citrus is gold. Valued at more than US$2bn annually, the citrus industry is a bedrock of the state’s agriculture economy and the lifeblood for more than 3,000 growers who farm 130,000 acres of citrus.

It’s a good bet that the roots of any given tree on one of those farms can be traced back to the LREC, which today manages nearly 600 tree crop varieties – mostly citrus species – of mixed ages and sizes. Its managers have continually adopted methods and tools to ensure the crops remain available to hundreds of scientists and educators from all over the world.

One recent innovation was a reference tree database recording the location and attributes of 2,912 individual trees, built from 1m-resolution 2012 imagery from the US Department of Agriculture’s National Agriculture Imagery Program (NAIP). But in 2017, the centre’s managers began to wonder how UAV imagery could benefit them.

At the same time, Ovidiu Csillik, a visiting research scholar at UC Berkeley’s Department of Environmental Science, Policy, and Management, and his research supervisor Dr Maggi Kelly were interested in how UAV imagery, a convolutional neural network (CNN) deep-learning algorithm and Trimble’s eCognition object-based image analysis (OBIA) technology could be used to map multi-age citrus trees accurately. eCognition extracts information from imagery using user-defined processing workflows, called rulesets, to automatically detect, classify and map specified objects.

“UAV imagery, with its ability to provide sub-metre imagery on demand, has the potential to revolutionise precision agriculture workflows,” says Csillik. “eCognition can not only handle the complexity and large data volumes of UAV data, it offers unlimited opportunities to use its algorithms to analyse spatial data and identify any object.”

The timing was opportune for Csillik to acquire UAV data as a proof-of-concept alternative to NAIP imagery, and to test the feasibility of using a CNN-OBIA method to automatically analyse the UAV imagery and accurately identify and map individual citrus trees – the first test of its kind.

Citrus at the centre

UAV imagery was acquired over the entire LREC site using a Parrot Sequoia multispectral camera, with average along- and across-track overlaps of 75%. The UAV flew at an altitude of 104m and used SRTM terrain data to maintain a consistent flying height above the ground across the captured area. In two flights, the UAV captured 4,574 multispectral images, which were then photogrammetrically processed to produce a four-band orthoimagery mosaic with a ground sample distance of 12.8cm. The red and near-infrared bands were also used to create a normalised difference vegetation index (NDVI) image. Both datasets were used as source data for eCognition.
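The NDVI contrasts red and near-infrared reflectance to highlight healthy vegetation: NDVI = (NIR − Red) / (NIR + Red). As a minimal sketch of that calculation outside eCognition – the GeoTIFF file names are hypothetical, and the band order assumed is the Sequoia’s green, red, red-edge, near-infrared – the following Python uses rasterio and NumPy:

```python
import numpy as np
import rasterio

# Hypothetical file name; band order assumed to be green, red,
# red-edge, NIR (the Parrot Sequoia's four spectral bands).
with rasterio.open("lrec_orthomosaic.tif") as src:
    red = src.read(2).astype("float32")
    nir = src.read(4).astype("float32")
    profile = src.profile

# NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero.
ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), 0.0)

# Write the index out as a single-band float raster.
profile.update(count=1, dtype="float32")
with rasterio.open("lrec_ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```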

With the image layers prepared, Csillik could focus on developing the software’s ruleset to process and classify the data. Because the LREC’s citrus trees are intensively managed, they display a wide diversity of characteristics – pruned and unpruned, of different ages and geometries, some with touching crowns – so he needed a ruleset that would enable eCognition to delineate trees under these varied conditions. His approach would enhance the software’s inherent analysis and object-detection capabilities with eCognition’s new deep-learning CNN algorithms.

The first step of the ruleset involved training the CNN model on three classes – trees, bare soil and weeds – with 4,000 training samples per class. To create the samples, Csillik input the four-band orthomosaic into eCognition, along with the tree locations from the LREC’s tree database, and instructed the software to cut 40x40 pixel samples from the mosaic around those points. Sample collection was limited to the northern half of the LREC, leaving the southern half available for validation. Since the classes of the 12,000 samples were known, they could be used to teach the network to differentiate between the classes by learning the specific features of each. A quick study, the CNN completed its training in 13 minutes.
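The article does not disclose the architecture of eCognition’s built-in CNN, so the following is only an illustrative equivalent in Python with Keras: a small network trained on 40x40 pixel, four-band patches labelled with the three classes. The layer sizes, epoch count and placeholder training arrays are assumptions for demonstration.

```python
import numpy as np
from tensorflow.keras import layers, models

PATCH = 40      # 40x40 pixel samples, as in the ruleset
BANDS = 4       # four-band orthomosaic
CLASSES = 3     # trees, bare soil, weeds

def build_cnn():
    # A small architecture chosen for illustration; the article does
    # not report the layers eCognition uses internally.
    model = models.Sequential([
        layers.Input(shape=(PATCH, PATCH, BANDS)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# x_train would hold the 12,000 patches cut around known points in the
# northern half of the site; y_train their integer class labels 0..2.
# Random placeholders are used here so the sketch runs end to end.
x_train = np.random.rand(12000, PATCH, PATCH, BANDS).astype("float32")
y_train = np.random.randint(0, CLASSES, size=12000)

model = build_cnn()
model.fit(x_train, y_train, epochs=5, batch_size=64, validation_split=0.1)
```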

Now he was ready to run the second phase of the ruleset: testing the newly trained CNN on identifying and delineating individual citrus trees. Focusing on the southern half of the study site, he submitted each 40x40 pixel region to the network to obtain the likelihood that the image patch contained a citrus tree. Moving the analysis region across the orthomosaic like a sliding window, eCognition’s CNN took two minutes to produce a probability ‘heat map’: a greyscale image in which higher (brighter) values correspond to likely tree locations.
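Conceptually, the sliding-window step scores every candidate patch with the trained classifier and writes the tree probability into a grid. A minimal Python sketch of that idea follows; the stride, class index and batching strategy are assumptions, and eCognition performs this step internally.

```python
import numpy as np

def tree_heat_map(mosaic, model, patch=40, stride=8, tree_class=0):
    """Slide a patch-sized window across the mosaic and record the
    CNN's tree probability at each window position.

    mosaic: (H, W, 4) float array; model: the trained patch classifier.
    The stride and tree_class index are illustrative assumptions."""
    h, w, _ = mosaic.shape
    heat = np.zeros(((h - patch) // stride + 1,
                     (w - patch) // stride + 1), dtype="float32")
    windows, coords = [], []
    for i in range(0, h - patch + 1, stride):
        for j in range(0, w - patch + 1, stride):
            windows.append(mosaic[i:i + patch, j:j + patch, :])
            coords.append((i // stride, j // stride))
    # Batch the windows through the network in one call; for a full
    # orthomosaic you would predict in chunks rather than stacking
    # every window in memory at once.
    probs = model.predict(np.stack(windows), verbose=0)
    for (r, c), p in zip(coords, probs[:, tree_class]):
        heat[r, c] = p
    return heat  # brighter values = more likely tree locations
```

A smaller stride gives a smoother heat map at the cost of many more network evaluations; the two-minute runtime reported above reflects eCognition’s own optimised implementation rather than a Python loop like this one.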

The software then further processed the heat map to refine the tree objects and extract each tree’s centre. Bringing in the NDVI layer, Csillik used a simple linear iterative clustering (SLIC) algorithm – another newly integrated eCognition capability – to segment the heat map and the NDVI layer into superpixels. This step constrains the segmentation towards circular objects in the heat map and separates green vegetation from bare soil in the NDVI layer, enabling the software to remove multiple crown-detection errors and refine its tree delineation, improving the detection and location of single citrus trees. The task took less than 10 seconds.
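Outside eCognition, the same refinement idea can be sketched with scikit-image’s SLIC implementation: segment the stacked heat-map and NDVI layers into superpixels, then keep only segments that are both ‘hot’ and green. The thresholds and segment count below are illustrative assumptions, and the heat map is assumed to have been resampled to the NDVI grid first.

```python
import numpy as np
from skimage.segmentation import slic

def refine_tree_objects(heat, ndvi, n_segments=5000,
                        heat_min=0.5, ndvi_min=0.3):
    """Jointly segment the heat map and NDVI into superpixels and keep
    segments that score high on both layers. heat and ndvi must share
    the same shape; all thresholds are illustrative assumptions."""
    # Stack the two layers as channels so SLIC clusters on both.
    stacked = np.dstack([heat, ndvi])
    segments = slic(stacked, n_segments=n_segments, compactness=0.1,
                    channel_axis=-1)
    tree_mask = np.zeros_like(heat, dtype=bool)
    centres = []
    for label in np.unique(segments):
        region = segments == label
        if heat[region].mean() > heat_min and ndvi[region].mean() > ndvi_min:
            tree_mask |= region
            # Approximate the tree centre by the region's centroid.
            ys, xs = np.nonzero(region)
            centres.append((ys.mean(), xs.mean()))
    return tree_mask, centres
```

Clustering on both layers at once is what lets the heat map pull segments towards compact, circular crowns while the NDVI channel vetoes bare-soil pixels that happen to score well.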

Quick work

In total, eCognition analysed, identified and delineated 3,105 individual trees in 30 minutes.

To validate how well the approach performed, Csillik compared the CNN/OBIA-generated results with the LREC’s existing tree database. Using common evaluation statistics, he calculated that the overall accuracy of the final classification was 96.2%.
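The article reports only the overall accuracy figure, not the matching procedure. One common way to score tree detection – offered here as an assumed sketch, not the study’s actual protocol – is to match detected centres to reference points within a distance tolerance and compute precision, recall and F1:

```python
import numpy as np
from scipy.spatial import cKDTree

def match_trees(detected, reference, tol=2.0):
    """Greedily match detected tree centres to reference points within
    'tol' metres and report detection statistics. The 2m tolerance and
    the greedy matching are assumptions for illustration."""
    tree = cKDTree(reference)   # reference: (N, 2) array of x, y
    used = set()
    tp = 0
    for pt in detected:         # detected: (M, 2) array of x, y
        dist, idx = tree.query(pt)
        if dist <= tol and idx not in used:
            used.add(idx)
            tp += 1
    fp = len(detected) - tp     # detections with no reference tree
    fn = len(reference) - tp    # reference trees never detected
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```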

“Having worked with eCognition for over seven years, I was confident that the integrated CNN would perform well,” says Csillik. “But it was a nice surprise to see just how well it performed. With 96% accuracy, automated identification and delineation of individual trees becomes a real alternative to manual delineation. Applying this process to high-resolution UAV imagery would offer a much faster, more precise and repeatable method for long-term crop management for growers and the LREC.”

The LREC is indeed intrigued by the idea. After seeing Csillik’s classified map of citrus trees, the centre’s managers initiated plans to apply the approach to other study areas, explore how transferable it is and determine how easily they can update their tree maps over time.

Csillik will be the first to say that more research on pairing UAV imagery with CNN-OBIA technology is needed to fully realise how this approach can benefit the precision agriculture industry. But he is optimistic, both about this initial result and those to come. That could mean more golden opportunities for California citrus growers and researchers.

Mary Jo Wagner is a freelance writer who has been writing about the geospatial industry for more than 25 years ([email protected])
