
Predicting maize yield loss with crop–weed leaf cover ratios determined with UAS imagery

Published online by Cambridge University Press:  10 February 2025

Avi Goldsmith
Affiliation:
Graduate Research Assistant, Department of Crop and Soil Sciences, North Carolina State University, Raleigh, NC, USA
Robert Austin
Affiliation:
Research and Extension Specialist, Department of Crop and Soil Sciences, North Carolina State University, Raleigh, NC, USA
Charles W. Cahoon
Affiliation:
Associate Professor, Department of Crop and Soil Sciences, North Carolina State University, Raleigh, NC, USA
Ramon G. Leon*
Affiliation:
William Neal Reynolds Distinguished Professor and University Faculty Scholar, Department of Crop and Soil Sciences, North Carolina State University, Raleigh, NC, USA
Corresponding author: Ramon G. Leon; Email: [email protected]

Abstract

Typically, weed density is used to predict weed-induced yield loss, as it is easy and quick to quantify, even though it does not account for weed size and time of emergence relative to the crop. Weed–crop leaf area relations, while more difficult to measure, inherently account for differences in plant size, representing weed–crop interference more accurately than weed density alone. Unmanned aerial systems (UASs) may allow for efficient quantification of weed and crop leaf cover over a large scale. It was hypothesized that UAS imagery could be used to predict maize (Zea mays L.) yield loss based on weed–crop leaf cover ratios. A yield loss model for maize was evaluated for accuracy using 15- and 30-m-altitude aerial red–green–blue and four-band multispectral imagery collected at four North Carolina locations. The model consistently over- and underpredicted yield loss when observed yield loss was less than and greater than 3,000 kg ha−1, respectively. Altitude and sensor type did not influence the accuracy of the prediction. A correction for the differences between predicted and observed yield loss was incorporated into the linear model to improve overall precision. The correction resulted in r2 increasing from 0.17 to 0.97 and a reduction in root mean-square error from 705 kg ha−1 to 219 kg ha−1. The results indicated that UAS images can be used to develop predictive models for weed-induced yield loss before canopy closure, making it possible for growers to plan production and financial decisions before the end of the growing season.

Type
Research Article
Creative Commons
Creative Commons License (CC BY)
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of Weed Science Society of America

Introduction

Weed-induced yield loss predictions have been attempted using a variety of approaches. In density-based models, weed density and yield loss follow a hyperbolic relationship: as weed density increases, each additional weed contributes progressively less to yield loss, which approaches an asymptote (Cousens 1985; Spitters et al. 1989). Because density alone does not inherently provide information about relative weed size or time of emergence compared with the crop, its predictive power for yield loss may be low and variable (Ali et al. 2013; Kropff 1988). This problem is most evident when late-emerging cohorts of weeds are present, because the model intrinsically assumes that they have a competitive potential similar to early-emerging weeds (Ali et al. 2013; Jeschke et al. 2011).
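For context, the density-based model of Cousens (1985) cited above is commonly written as a rectangular hyperbola (notation here follows the general convention rather than this article):

$$\mathrm{YL} = \frac{id}{1 + id/a}$$

where d is weed density, i is the yield loss per weed as density approaches zero, and a is the asymptotic maximum yield loss at high density.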

In contrast, the relative weed leaf area compared with the crop (Lw) is a parameter that implicitly considers differences in plant size. The Lw is defined as:

(1) $$L_{\mathrm{w}} = \frac{\mathrm{LAI}_{\mathrm{w}}}{\mathrm{LAI}_{\mathrm{c}} + \mathrm{LAI}_{\mathrm{w}}}$$

where LAIw is the leaf area index (LAI) of the weeds and LAIc is the LAI of the crop for a given ground area and time. By combining the leaf area of the weeds and crop on a per ground area basis, Lw can provide a better representation of weed interference, especially interference due to shading (Kropff 1988; Spitters and Aerts 1983). A single, large early-emerging weed may have the same leaf area as several smaller late-emerging weeds, which allows their competitive ability to be weighted proportionally (Spitters and Aerts 1983). Because LAIw will likely include weeds from different cohorts, Lw accounts for the relative time of emergence of weeds based on the additive contribution of individuals to the total LAI (Kropff 1988; Spitters and Aerts 1983): early- and late-emerging weeds result in large and small plants, respectively, at the critical time of weed control.
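As a minimal sketch (the values below are hypothetical, not from this study), Equation 1 reduces to a one-line computation:

```python
# Equation 1: relative weed leaf area, Lw = LAIw / (LAIc + LAIw).
def relative_weed_leaf_area(lai_weeds: float, lai_crop: float) -> float:
    """Return Lw for a given ground area and time."""
    return lai_weeds / (lai_crop + lai_weeds)

# Example: a weed cohort with LAIw = 0.5 under a maize canopy with LAIc = 2.0
# gives Lw = 0.2, i.e., weeds contribute 20% of the total leaf area.
print(relative_weed_leaf_area(0.5, 2.0))  # 0.2
```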

Kropff and Spitters (1991) proposed using Lw to predict yield loss, because it directly influences growth rate through shading. If LAIw represents a small proportion of the total canopy, weed growth rates will be low and crop growth rates high. Conversely, as LAIw increases, crop growth will be slowed by weed shading, causing greater yield losses. These weed–crop interactions are affected by the competitive ability of the weed population against the crop, so the competitive ability of the weeds must be weighted proportionally to their contribution to Lw. To do this, Kropff and Spitters (1991) incorporated Lw and a weed competition coefficient (q) into a model to predict weed-induced yield loss:

(2) $$\mathrm{YL}_{k} = \frac{qL_{\mathrm{w}}}{1 + (q - 1)L_{\mathrm{w}}}$$

where YLk is yield loss relative to the yield under weed-free conditions (Ywf) (Kropff and Spitters 1991). The q value represents the intensity of the competitive interaction between the weed community and the crop: q < 1 indicates that the crop is more competitive than the weed community, and q > 1 indicates that the weed community is more competitive than the crop (Kropff and Spitters 1991).
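A minimal sketch of Equation 2, with illustrative values rather than study estimates:

```python
# Equation 2: relative yield loss from Lw and the weed competition
# coefficient q (Kropff and Spitters 1991).
def relative_yield_loss(q: float, l_w: float) -> float:
    """Return YLk = q*Lw / (1 + (q - 1)*Lw), a fraction of weed-free yield."""
    return (q * l_w) / (1.0 + (q - 1.0) * l_w)

# With q = 0.5 (crop more competitive than weeds) and Lw = 0.2:
# YLk = 0.1 / 0.9, roughly 11% loss relative to weed-free yield.
print(relative_yield_loss(0.5, 0.2))
```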

While Lw is more biologically relevant for yield loss prediction than density, historically it has not been used in research or production, because leaf area could not be measured quickly. Recently, advances in digital image analysis opened the possibility of quantifying weed and crop leaf cover efficiently during the growing season (Campillo et al. 2008; Lee and Lee 2011). Thus, Equation 1 can be modified for the relative weed leaf cover (LCw) by replacing LAI with the leaf cover index (LCI) (total leaf cover per ground area basis) of the crop (LCIc) and weeds (LCIw). LCw has been shown to be analogous to Lw in the early growing season before canopy closure, when little leaf overlap occurs (Lotz et al. 1994; Ngouajio et al. 1999; Nielsen et al. 2012). Equation 2 is then modified for leaf cover by replacing Lw with LCw.

To further improve the simplicity and efficiency of LCw data collection, remote sensing tools such as unmanned aerial systems (UASs) can be used to measure LCIw and LCIc (Duan et al. 2017; Rasmussen et al. 2013). UASs are highly flexible tools: they can quickly capture imagery at the field scale when flying at high altitudes and capture high-resolution images at the plant scale at lower altitudes (i.e., <1 cm pixel−1 ground sample distance [GSD]). Determining optimal flight and equipment parameters is critical to accurately distinguishing crops and weeds. These parameters include, but are not limited to, sensor type, sensor pixel density, atmospheric/weather conditions, and the crops in question (Mesas-Carrascosa et al. 2015). It is therefore necessary to compare LCw measured at multiple flight altitudes and with different sensor types to balance data-collection efficiency with image resolution.

UASs can be equipped with several sensor types, including red–green–blue (RGB) cameras, which capture light in the visual range of the electromagnetic spectrum (380 to 700 nm), and multispectral (MS) cameras, which sense light within and beyond the visual range in narrow discrete ranges of wavelengths (bands) (NASA 2010). Through image analysis, crops can be distinguished from weeds based on differences in spectral reflectance using machine learning algorithms (Venkataraju et al. 2023). MS imagery has been shown to improve plant species differentiation compared with RGB imagery, especially in the near-infrared range (700 to 1,100 nm) (Mohidem et al. 2021). Additionally, MS imagery can provide more stable canopy cover estimation over time than RGB, as it is less sensitive to changes in environmental conditions such as cloud cover and sunlight angle; in other words, MS imagery generates less spectral noise (Ashapure et al. 2019; Zhou et al. 2020). While potentially more precise for vegetation differentiation and canopy cover estimation, MS imagery might be inaccessible to some users because MS sensors cost more than RGB sensors (Ashapure et al. 2019).

With UASs, it may be feasible to implement the Kropff-Spitters model in real time during the growing season before canopy closure. We considered this possibility worth exploring to provide growers with more information about crop performance in relation to the efficacy of their weed management programs. It was hypothesized that weed-induced yield loss can be predicted with the Kropff-Spitters model adjusted for leaf cover (Equation 2), where maize (Zea mays L.) and weed leaf cover are measured with high-resolution UAS imagery. Additionally, it was hypothesized that prediction accuracy is higher with 15-m MS imagery, as it has a finer GSD and is less sensitive to spectral noise than RGB or 30-m MS imagery. Therefore, the goals of this study were (1) to evaluate the accuracy of the Kropff-Spitters model for predicting maize yield loss based on UAS-measured leaf cover and (2) to assess model performance at different UAS flight altitudes and with different sensor types.

Materials and Methods

Experimental Sites

Images were collected from four existing maize experiments during the summer 2023 growing season in Goldsboro, Clayton, and Lewiston, North Carolina, at the Cherry, Central Crops, and Peanut Belt Research Stations, respectively. In Goldsboro (35.3942°N, 78.04709°W), the soil is a Pantego loam (fine-loamy, siliceous, semiactive, thermic Umbric Paleaquult) with 1.9% organic matter. In Clayton (35.6760°N, 78.5141°W), the soil is a Dothan loamy sand (fine-loamy, kaolinitic, thermic Plinthic Kandiudult) with 1.25% organic matter. The soil of location 1 in Lewiston (Lewiston 1) (36.1332°N, 77.1772°W) is a mosaic of Goldsboro sandy loam (fine-loamy, siliceous, subactive, thermic Aquic Paleudult) and Norfolk sandy loam (fine-loamy, kaolinitic, thermic Typic Kandiudult). The soil of location 2 in Lewiston (Lewiston 2) (36.1343°N, 77.1772°W) is a Norfolk sandy loam (fine-loamy, kaolinitic, thermic Typic Kandiudult). Both Lewiston locations had 1.5% organic matter. The most common weed species were sicklepod [Senna obtusifolia (L.) H.S. Irwin and Barneby] and fall panicum (Panicum dichotomiflorum Michx.) in Goldsboro; Palmer amaranth (Amaranthus palmeri S. Watson), Texas millet [Urochloa texana (Buckley) R. Webster], and ivyleaf morningglory (Ipomoea hederacea Jacq.) in Clayton; and common ragweed (Ambrosia artemisiifolia L.), common lambsquarters (Chenopodium album L.), and U. texana in Lewiston. In Goldsboro, the experiment was planted with maize variety ‘DKC63-57(VT2P)’ in 18 plots measuring 12 by 30 m with 76-cm row spacing, while Clayton and Lewiston were planted in 54 and 72 plots, respectively, measuring 3.66 by 9.14 m with maize varieties ‘DeKalb 62-08’ and ‘Pioneer 1870’ (Table 1). Experiments were arranged in a randomized complete block design with three replications in Goldsboro and four in Clayton and Lewiston, with standard fertilization according to North Carolina State University Extension recommendations (Crozier 2019). Clayton and Lewiston were prepared with conventional tillage according to North Carolina State University Extension recommendations (Crozier 2019), while Goldsboro was managed under no-till practices. Maize yield was determined by harvesting the inner 12 rows in Goldsboro and all 4 rows per plot in Lewiston 2 using a 6-row commercial harvester, and the inner 2 rows in Clayton and Lewiston 1 with a small-plot harvester. Yields at all locations were adjusted to 15.5% moisture content.

Table 1. Maize planting and management programs for four locations in North Carolina.

a D1 and D2 represent the two maize planting densities in Goldsboro, which were included to produce additional variation in weed/maize leaf cover ratios.

b Two postemergence herbicide applications were included in Goldsboro to keep control plots weed-free.

Aerial Image Collection

At Goldsboro, aerial images were collected when maize was at the V3 and V5 stages (Ritchie et al. 1986). In Clayton, images were collected when maize was at V5 and V7, and at both Lewiston locations, images were collected at V8, just before canopy closure. Variation in sampling time reflected the need to work around each location's scheduled weed control operations and properly quantify LCw before weed removal. At all locations, RGB and four-band MS (red, green, red-edge, and near-infrared) images were collected at flight altitudes of 15 and 30 m using a DJI Mavic 3M (DJI, Shenzhen, China) (Table 2). GSD was 0.47 cm pixel−1 for RGB-15-m imagery, 0.87 cm pixel−1 for RGB-30-m imagery, 0.66 cm pixel−1 for MS-15-m imagery, and 1.45 cm pixel−1 for MS-30-m imagery. Images were collected around solar noon (1200 to 1400 hours). For each location and date, the images for each sensor–altitude treatment were individually stitched into a map (i.e., an orthomosaic) using Agisoft Metashape v. 2.0.1 (Agisoft, St Petersburg, Russia), producing a total of 32 unique orthomosaics. Each MS orthomosaic was a composite of all four bands collected during that flight.
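The GSD values above follow from the standard photogrammetric relation between flight altitude and sensor geometry. The sketch below shows that relation; the sensor parameters are hypothetical placeholders, not the Mavic 3M specifications from Table 2:

```python
# Standard GSD formula:
# GSD = (sensor width x altitude) / (focal length x image width).
# All parameter values here are hypothetical, for illustration only.
def ground_sample_distance_cm(altitude_m: float, sensor_width_mm: float,
                              focal_length_mm: float, image_width_px: int) -> float:
    """Return GSD in cm per pixel."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Doubling the flight altitude doubles the GSD (coarser resolution), consistent
# with the roughly 2x coarser GSD reported from 15-m to 30-m flights.
print(ground_sample_distance_cm(15, 17.3, 12.3, 5280))
print(ground_sample_distance_cm(30, 17.3, 12.3, 5280))
```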

Table 2. DJI Mavic 3M RGB and MS sensor and RTK specifications.a

a Abbreviations: MS, multispectral; RGB, red–green–blue; RTK, real-time kinematic; CMOS, complementary metal-oxide-semiconductor.

Ground Sampling/Processing

To validate the substitution of LCw for Lw (Equations 1 and 2), LAIc and LCIc were compared for each aerial image collection. LAIc was measured twice per plot in Goldsboro and in a randomly selected third of the plots in Clayton and Lewiston with an LAI-2200C plant canopy analyzer (LI-COR, Lincoln, NE, USA) immediately following aerial image collection. In Goldsboro, LAIc was measured within 0.4-m2 quadrats from which weeds had been removed. In Clayton and Lewiston, LAIc was measured below the maize canopy but above the weeds (50 cm) between the two middle rows of the plot. At all locations, a 90° view cap was used to exclude the operator from the sensor's field of view. LCIc was measured on a per plot basis for each orthomosaic following aerial data processing.

Aerial Data Processing

To distinguish between maize and weeds in the aerial imagery, a separate supervised object-based classification algorithm using a support vector machine (SVM) (Cortes and Vapnik 1995) was trained for each orthomosaic in ArcGIS Pro v. 3.1.1 (Esri, Redlands, CA, USA) (Figure 1). Training data for each SVM were developed by manually classifying the area within eight random 4-m2 quadrats on the corresponding orthomosaic as soil, maize, or weeds. Additional training samples were generated if irregular atmospheric conditions (e.g., intermittent cloud cover) caused classification errors. The area per plot classified as maize and weeds defined LCIc and LCIw, respectively, which were used to calculate a per plot LCw (Equation 1) for each orthomosaic. To determine the accuracy of the SVM-based LCw estimation, confusion matrices were developed by manually labeling 500 random points on each RGB-15-m classified map as maize, weeds, or soil via the accuracy assessment tool in ArcGIS Pro v. 3.1.1.
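As a simplified, pixel-based analogue of this step (the study used an object-based SVM workflow in ArcGIS Pro; the scikit-learn code and all reflectance values below are illustrative assumptions, not the study's pipeline):

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training pixels (RGB digital numbers) hand-labeled within
# quadrats: 0 = soil, 1 = maize, 2 = weeds.
X_train = np.array([[120, 95, 70], [125, 100, 75],   # soil
                    [45, 130, 40], [50, 140, 45],    # maize
                    [60, 110, 55], [65, 115, 60]],   # weeds
                   dtype=float)
y_train = np.array([0, 0, 1, 1, 2, 2])

clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

# Classify a (hypothetical) plot's pixels, then derive leaf cover indices and
# LCw from the classified fractions (Equation 1 with LCI in place of LAI).
plot_pixels = np.array([[118, 97, 72], [48, 135, 42],
                        [62, 112, 57], [47, 132, 44]], dtype=float)
labels = clf.predict(plot_pixels)
lci_crop = np.mean(labels == 1)   # fraction of plot area classified as maize
lci_weed = np.mean(labels == 2)   # fraction classified as weeds
veg = lci_crop + lci_weed
lc_w = lci_weed / veg if veg > 0 else 0.0
print(lc_w)
```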

Figure 1. Example of output of maize and weed classification from red–green–blue (RGB)-15-m imagery using a supervised object-based classification algorithm via a support vector machine.

Yield Loss Prediction

To predict yield loss, Ywf was defined as the highest-yielding plot per block within each location where weed cover was minimal to nondetectable. Observed yield loss (YLobs) was defined as the difference between Ywf and the observed yield (Yobs) within each block. The predicted yield loss (YLp) relative to Ywf was then calculated (in kg ha−1) as:

(3) $$\mathrm{YL}_{\mathrm{p}} = \left(\frac{q\,\mathrm{LC}_{\mathrm{w}}}{1 + (q - 1)\mathrm{LC}_{\mathrm{w}}}\right) Y_{\mathrm{wf}}$$

To determine q for each location and stage, YLp was calculated iteratively with q values ranging from 0.1 to 1.5 using RGB-15-m-derived LCw. RGB-15-m training data were used rather than MS or 30-m data because they provided the widest range and best balance of spectral signal and optical resolution (pixels cm−1). For each location and stage, the q that produced the smallest absolute difference between YLp and YLobs was taken as the optimum, referred to hereafter as qo.
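A sketch of this iterative search under stated assumptions (the 0.01 step size, the summed absolute error criterion, and all data values are illustrative, not taken from the study):

```python
import numpy as np

def predicted_yield_loss(q: float, lc_w: np.ndarray, y_wf: float) -> np.ndarray:
    """Equation 3: YLp = (q*LCw / (1 + (q - 1)*LCw)) * Ywf, in kg/ha."""
    return (q * lc_w) / (1.0 + (q - 1.0) * lc_w) * y_wf

lc_w = np.array([0.10, 0.25, 0.40])        # per-plot relative weed leaf cover
yl_obs = np.array([400.0, 900.0, 1600.0])  # observed yield loss, kg/ha
y_wf = 9000.0                              # weed-free reference yield, kg/ha

# Grid search q over [0.1, 1.5], keeping the q that minimizes the absolute
# difference between predicted and observed yield loss.
q_grid = np.arange(0.1, 1.5 + 1e-9, 0.01)
total_error = [np.abs(predicted_yield_loss(q, lc_w, y_wf) - yl_obs).sum()
               for q in q_grid]
q_o = q_grid[int(np.argmin(total_error))]
print(round(float(q_o), 2))
```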

Model Correction

A correction factor (c) was generated to compensate for changes in model performance due to the bias introduced by substituting LCw for Lw (Equation 2). Because the q and LCw values for each image collection account for variation among locations and phenological stages (Kropff and Spitters 1991; Ngouajio et al. 1999), a single c was developed with RGB-15-m training data from the location with the greatest standard error in LCw and validated with the remaining locations. This was done to account for variation in crop–weed ratios without overfitting the model to all collected data. With ΔYL defined as the difference between YLp and YLobs, c was calculated as the slope, when different from zero, of the model that best described the relationship between ΔYL and YLobs. For each plot, the corrected yield loss prediction (YLcorr) was calculated as:

(4) $$\mathrm{YL}_{\mathrm{corr}} = (\Delta\mathrm{YL} \times c) + \mathrm{YL}_{\mathrm{p}}$$
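A sketch of the correction step with illustrative data (the arrays below are hypothetical; the study's fitted value was c = −0.69):

```python
import numpy as np

yl_obs = np.array([500.0, 1500.0, 3000.0, 4500.0])  # observed yield loss, kg/ha
yl_p = np.array([900.0, 1700.0, 3000.0, 4100.0])    # uncorrected predictions

# c is the slope of the best-fit line relating delta-YL (= YLp - YLobs) to YLobs.
delta_yl = yl_p - yl_obs
c, intercept = np.polyfit(yl_obs, delta_yl, 1)

# Equation 4: apply the correction per plot.
yl_corr = (delta_yl * c) + yl_p
print(round(float(c), 2), yl_corr)
```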

To assess the necessity of running the model with an individual qo for each image collection, an additional analysis was done with the average q value over all locations and crop phenological stages (q̄). If q̄ provided accuracy similar to qo for each location, the former was considered robust and was chosen for the subsequent analyses.

Statistical Analysis

Linear regression analyses were done with JMP Pro 17 (Cary, NC, USA) to compare model performance across locations, phenological stages, UAS imagery types, and weed competition measurements. Linear regressions were additionally run to validate the use of LCI in place of LAI. Root mean-square error (RMSE), r2, and P-value were used to characterize model regression fitness and accuracy.
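For reference, the fit statistics named above can be reproduced outside JMP; a minimal sketch assuming scipy, with illustrative data values:

```python
import numpy as np
from scipy.stats import linregress

def fit_stats(observed, predicted):
    """Return r-squared, RMSE (same units as inputs), and regression p-value."""
    result = linregress(observed, predicted)
    rmse = float(np.sqrt(np.mean((np.asarray(predicted) - np.asarray(observed)) ** 2)))
    return result.rvalue ** 2, rmse, result.pvalue

obs = [500.0, 1500.0, 3000.0, 4500.0]   # observed yield losses, kg/ha (illustrative)
pred = [600.0, 1400.0, 3100.0, 4300.0]  # model predictions, kg/ha
print(fit_stats(obs, pred))
```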

Results and Discussion

UAS-derived Leaf Cover Estimation Accuracy

LAIc was positively related to LCIc measurements across all locations and phenological stages with RGB-15-m imagery (r2 = 0.72; Figure 2). Classification accuracy of the SVM was 83%, 76%, and 96% for maize, weeds, and soil, respectively, and the kappa coefficient, which takes into account both the producer's and the user's accuracy of the classifier, was 0.78 (Cohen 1960). Given the high accuracy for the three classes and the strong linear relationship with LAIc, LCI was used as a surrogate for LAI.
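For reference, Cohen's kappa can be computed directly from a confusion matrix; the counts below are hypothetical, not the study's assessment data:

```python
import numpy as np

# Rows: reference (ground-truth) class; columns: predicted class.
# Classes: maize, weeds, soil. Counts are illustrative.
cm = np.array([[150, 20, 5],
               [25, 120, 10],
               [5, 10, 155]], dtype=float)

n = cm.sum()
p_observed = np.trace(cm) / n  # overall agreement (accuracy)
p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
kappa = (p_observed - p_expected) / (1.0 - p_expected)
print(round(float(kappa), 2))
```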

Figure 2. Relation between unmanned aerial system (UAS) red–green–blue (RGB)-15-m-derived maize leaf cover index (LCI) and ground-measured maize leaf area index (LAI) in four locations in North Carolina.

Determining qo and q̄ for the Kropff-Spitters Model

Weed competition varied across locations and phenological stages, with qo ranging from 0.2 at V5 in Clayton to 0.8 at V8 in Lewiston 2 (Table 3). The q̄ across all 268 plots was 0.54, indicating that, on average, maize was more competitive than weeds regardless of early-season phenological stage, as q < 1 (Kropff and Spitters 1991). Compared with Goldsboro, the competition coefficients at Clayton V7 and Lewiston were 2.3 and 2.7 times greater, respectively (Kropff and Spitters 1991; Table 3). The stability of qo in Goldsboro across maize growth stages suggests that the weed community's competitive ability grew proportionally to that of maize, while in Clayton, the weed community progressively increased its competitive ability over maize. It is expected that as the growing season progresses and the weeds in each community get larger, their competitive abilities will increase, as observed in Clayton. Additionally, Goldsboro had considerably less weed pressure and species diversity than the other locations, likely resulting in its lower qo (Table 3).

Table 3. Red–green–blue (RGB)-15-m-derived Kropff and Spitters model attributes and observed yield measurements for all locations and stages in North Carolina.

a Optimized competition coefficient.

b Relative weed leaf cover (Equation 1). Values in parentheses represent the standard error of the mean.

Yield Loss Prediction and Correction

There were no major differences between the qo and q̄ models, with both having c = −0.69. Additionally, the slopes of the qo- and q̄-corrected models were 0.78 ± 0.01 and 0.75 ± 0.01, respectively, with intercepts of 214 ± 13 and 227 ± 13. Therefore, q̄ was chosen for further analyses so that a single value applied to all locations and sampling times.

Running the model without correcting for deviation from YLobs resulted in low accuracy for YLp (Figure 3A). This was most evident when YLobs was less than and greater than 3,000 kg ha−1, where YLp over- and underpredicted yield loss, respectively (Figure 4). This trend may be due to the use of LCI in place of LAI, as leaf cover cannot capture the full magnitude of competition once leaf overlap occurs, nor at early stages when there is no overlap at all. To account for the differences between the two, c was developed using RGB-15-m training data from Clayton (Figure 4). Clayton was selected because it had the highest overall variability in LCw of the four locations (Table 3). If the model predicted YLp accurately, the slope in Figure 4 would be zero, indicating no difference between YLp and YLobs. However, the slope of the model (i.e., c) was −0.69 (Figure 4). Because the relationship was highly linear, incorporating c (Equation 4) dramatically reduced model error and brought the corrected yield loss predictions (YLcorr) much closer to YLobs than YLp (Figure 3A and 3B). With the correction for LCI (Equation 4), model performance increased in the validation data set (Goldsboro and Lewiston locations), with r2 increasing from 0.17 to 0.97 and RMSE decreasing from 705 kg ha−1 to 219 kg ha−1 (Figure 3).

Figure 3. Relationship between predicted (q̄) and observed yield loss at validation locations (Lewiston 1 and 2 and Goldsboro) in North Carolina with all red–green–blue (RGB) and multispectral (MS) imagery taken at 15 m and 30 m pooled together. (A) Relationship between predicted and observed yield loss before incorporation of the correction factor c; (B) relationship after incorporation of c.

Figure 4. Relationship between Δ yield loss (difference between predicted and observed yield loss) and observed yield loss at Clayton V5 and V7 (red–green–blue [RGB]-15-m) in North Carolina using the q̄ analysis. The red dashed line represents Δ yield loss = 0.

Flight altitude and image sensor had minimal effect on prediction accuracy (Table 4). Before correction, the relation between YLobs and YLp was weak for all sensor–altitude combinations (r2 < 0.14) except RGB-15-m (r2 = 0.53). However, when YLcorr was estimated, all sensors and altitudes performed equally well, with r2 > 0.96. The correction also reduced the overall RMSE roughly 3-fold, from 650 kg ha−1 to 202 kg ha−1 (Table 4).

Table 4. Best-fit linear model parameters for the relation between YLobs and YLp for different sensor–altitude combinations based on the q̄ analysis for data collected in Goldsboro and Lewiston 1 and 2 in North Carolina.a

a Values in parentheses represent the standard error of the mean.

b For all models, P < 0.0001.

c RMSE, root mean-square error.

Caveats

Considerations for q Estimation

Weed communities will vary in their competitive ability relative to the crop due to environmental conditions, weed species composition, and crop species and planting density. Therefore, while a single value of q, like the q̄ used in our study, would be ideal, it is unlikely that the value presented here could accurately predict yield loss across all environmental conditions, management strategies, and growth stages. Instead, through iterative testing and validation, general values of q can be determined for weed communities with a few predominant species and at specific growth stages. This type of general descriptor is not unusual in agriculture; for example, individual evapotranspiration coefficients have been developed for multiple crops and are used across large areas and in different environments and soil types (Hargreaves and Samani 1985; Hatfield and Dold 2018).

Substitution of LCI for LAI

Estimating total leaf area and LAI from aerial imagery is difficult. The equipment currently available for this purpose, such as light detection and ranging (LIDAR) or multisensor cameras, is expensive and technically complex, making it impractical for agricultural production at present (Liu et al. 2021). For this reason, we explored substituting leaf cover for total leaf area; the former can be estimated accurately with UASs using low-cost sensors. It must be acknowledged that when using leaf cover in place of total leaf area, competition for light within the canopy and its impact on yield loss might not be fully captured due to occlusion among weeds and the crop. Leaf area is associated with physiological processes that can determine how competitive a plant will be in a stressful environment where resources such as water and nutrients are limited (Kropff 1988). For all row crops, a critical period of weed control (CPWC) exists early in the season, before crop canopy closure, when any weeds present have the greatest impact on yield (Gantoli et al. 2013; Hall et al. 1992). After the CPWC, the crop typically has grown enough to shade smaller late-emerging weeds, reducing their competitive ability (Gantoli et al. 2013; Hall et al. 1992). Furthermore, the CPWC is when growers can implement effective postemergence control (e.g., cultivation, herbicides) while minimizing crop injury. Thus, the CPWC is the ideal window for image collection for the Kropff-Spitters model. Previous studies have shown that while some occlusion may still occur among leaves (Figure 1), leaf cover and leaf area are similar early in the season (Nielsen et al. 2012; Ramirez-Garcia et al. 2012). We contend that our results illustrate that UAS-derived leaf cover, collected at the right time, captures enough detail in LCw relative to LAIw to make valid yield loss predictions, especially with the incorporation of a correction factor. This is further supported by the correction factor developed with imagery from Clayton (Figure 4) successfully improving model performance in Lewiston and Goldsboro (Figure 3).

Machine learning and artificial intelligence approaches are starting to contribute to more accurate LAI estimations from aerial imagery, which might improve the robustness of models and algorithms predicting yield loss. For example, using data from multiple sensors (RGB and multiple MS sensors), Liu et al. (2021) predicted LAI from aerial images with shallow and deep machine learning in experimental maize plots. The accuracy of their predictions was estimated at r2 = 0.70 to 0.89 and relative RMSE = 13%. In our case, we were able to relate LCI from aerial images to ground-truth LAI at r2 = 0.70 and relative RMSE = 7% (Figure 2). Therefore, our simpler algorithm represented LAI in a manner similar to more complex approaches based on machine learning. Interestingly, Liu et al. (2021) also observed over- and underpredictions at low and high LAI, respectively, similar to our yield predictions as a function of leaf area (Figure 4). Therefore, even with more detailed image processing systems, leaf occlusion appears to be an intrinsic limitation of UAS image collection after the CPWC (i.e., after canopy closure).

Correcting for Differences in Predicted versus Observed Yield Loss

Substituting LCI for LAI introduced a bias into the Kropff-Spitters model (Equation 2), creating the need for a correction factor. When YLobs was <3,000 kg ha−1, the uncorrected model overpredicted the impact of weeds on the crop. Very low maize yield indicates suboptimal growing conditions (e.g., soil type, moisture, pest damage). Therefore, it is possible that attributing yield loss to weed interference in situations where other, more important limiting factors were present caused YLp overestimation. Conversely, when maize growth and yield potential are high, fast canopy closure can result in high levels of occlusion between weed and crop leaves, because UAS imagery cannot penetrate the canopy. In this case, LCw is underestimated, because weeds shading maize within the canopy are out of view of the UAS imagery. Also, leaf cover has a minimal relation to belowground or noncompetitive interactions involved in yield loss, as this model cannot quantify those at low LCw levels (McKenzie-Gopsill et al. 2019; Stone et al. 1998).

The present study confirmed that leaf area-based prediction models, such as the one proposed by Kropff and Spitters (1991), can use UAS imagery as a viable and practical tool for estimating leaf cover. While LCw initially resulted in poor model performance, predictions were greatly improved through the incorporation of a correction factor. Furthermore, combined with UAS imagery, the predictive model can be used practically and efficiently at large spatial scales within commercial growing operations. Through automation of image analysis and optimization of the model, growers could generate yield loss predictions shortly after UAS image collection. This would allow growers to make management and financial decisions (e.g., applying additional postemergence herbicide or fertilizer before label cutoff times at canopy closure) as early as possible. Regarding model efficiency, because accuracy was similar between RGB and MS sensors regardless of altitude within the range evaluated, future optimization of the system could focus on RGB imagery and higher altitudes (≥30 m). This would allow UAS pilots to collect imagery of larger fields in less time and would increase computational efficiency. Ultimately, the end goal of systems like the one studied here is to give growers the power to make financial and management decisions as early as possible and to gain time to find ways to ensure the sustainability of their operations.

Acknowledgments

The authors thank the staff at the various research farms and Zachary Taylor for technical support.

Funding statement

This research was supported by the U.S. Department of Agriculture–Natural Resources Conservation Service Conservation Innovation grant no. NR213A750013G031, the U.S. Department of Agriculture Area-Wide Funds, Hatch Project NC02906, and the North Carolina Agricultural Foundation, Inc.

Competing interests

The authors declare no conflicts of interest.

Footnotes

Associate Editor: Nathan S. Boyd, Gulf Coast Research and Education Center

References

Ali, A, Streibig, JC, Andreasen, C (2013) Yield loss prediction models based on early estimation of weed pressure. Crop Prot 53:125–131
Ashapure, A, Jung, J, Chang, A, Oh, S, Maeda, M, Landivar, J (2019) A comparative study of RGB and multispectral sensor-based cotton canopy cover modelling using multi-temporal UAS data. Remote Sens 11:2757
Campillo, C, Prieto, MH, Daza, C, Moñino, MJ, García, MI (2008) Using digital images to characterize canopy coverage and light interception in a processing tomato crop. HortScience 43:1780–1786
Cohen, J (1960) A coefficient of agreement for nominal scales. Educ Psychol Meas 20:37–46
Cortes, C, Vapnik, V (1995) Support-vector networks. Mach Learn 20:273–297
Cousens, R (1985) A simple model relating yield loss to weed density. Ann Appl Biol 107:239–252
Crozier, C (2019) Soil management. Chapter 9 in North Carolina Organic Commodities Production Guide. AG-660. NC State Extension Publications. https://content.ces.ncsu.edu/north-carolina-organic-commodities-production-guide/chapter-9-soil-management. Accessed: December 12, 2023
Duan, T, Zheng, B, Guo, W, Ninomiya, S, Guo, Y, Chapman, SC (2017) Comparison of ground cover estimates from experiment plots in cotton, sorghum and sugarcane based on images and ortho-mosaics captured by UAV. Funct Plant Biol 44:169
Gantoli, G, Ayala, VR, Gerhards, R (2013) Determination of the critical period for weed control in corn. Weed Technol 27:63–71
Hall, MR, Swanton, CJ, Anderson, GW (1992) The critical period of weed control in grain corn (Zea mays). Weed Sci 40:441–447
Hargreaves, GH, Samani, ZA (1985) Reference crop evapotranspiration from temperature. Appl Eng Agric 1:96–99
Hatfield, JL, Dold, C (2018) Climate change impacts on corn phenology and productivity. In Amanullah K, Fahad S, eds. Corn–Production and Human Health in Changing Climate. InTech. doi: 10.5772/intechopen.76933
Jeschke, MR, Stoltenberg, DE, Kegode, GO, Sprague, CL, Knezevic, SZ, Hock, SM, Johnson, GA (2011) Predicted soybean yield loss as affected by emergence time of mixed-species weed communities. Weed Sci 59:416–423
Kropff, MJ (1988) Modelling the effects of weeds on crop production. Weed Res 28:465–471
Kropff, MJ, Spitters, CJT (1991) A simple model of crop loss by weed competition from early observations on relative leaf area of the weeds. Weed Res 31:97–105
Lee, K-J, Lee, B-W (2011) Estimating canopy cover from color digital camera image of rice field. J Crop Sci Biotechnol 14:151–155
Liu, S, Jin, X, Nie, C, Wang, S, Yu, X, Cheng, M, Shao, M, Wang, Z, Tuohuti, N, Bai, Y, Liu, Y (2021) Estimating leaf area index using unmanned aerial vehicle data: shallow vs. deep machine learning algorithms. Plant Physiol 187:1551–1576
Lotz, LAP, Kropff, MJ, Wallinga, J, Bos, HJ, Groeneveld, RMW (1994) Techniques to estimate relative leaf area and cover of weeds in crops for yield loss prediction. Weed Res 34:167–175
McKenzie-Gopsill, A, Amirsadeghui, S, Earl, H, Jones, A, Lukens, L, Lee, E, Swanton, C (2019) Early physiological and biochemical responses of soybean to neighbouring weeds under resource-independent competition. Weed Res 59:288–299
Mesas-Carrascosa, F-J, Torres-Sánchez, J, Clavero-Rumbao, I, García-Ferrer, A, Peña, J-M, Borra-Serrano, I, López-Granados, F (2015) Assessing optimal flight parameters for generating accurate multispectral orthomosaics by UAV to support site-specific crop management. Remote Sens 7:12793–12814
Mohidem, NA, Che’Ya, NN, Juraimi, AS, Fazlil Ilahi, WF, Mohd Roslim, MH, Sulaiman, N, Saberioon, M, Mohd Noor, N (2021) How can unmanned aerial vehicles be used for detecting weeds in agricultural fields? Agriculture 11:1004
[NASA] National Aeronautics and Space Administration (2010) Visible Light. https://science.nasa.gov/ems/09_visiblelight. Accessed: March 19, 2024
Ngouajio, M, Lemieux, C, Leroux, GD (1999) Prediction of corn (Zea mays) yield loss from early observations of the relative leaf area and the relative leaf cover of weeds. Weed Sci 47:297–304
Nielsen, DC, Miceli-Garcia, JJ, Lyon, DJ (2012) Canopy cover and leaf area index relationships for wheat, triticale, and corn. Agron J 104:1569–1573
Ramirez-Garcia, J, Almendros, P, Quemada, M (2012) Ground cover and leaf area index relationship in a grass, legume and crucifer crop. Plant Soil Environ 58:385–390
Rasmussen, J, Nielsen, J, Garcia-Ruiz, F, Christensen, S, Streibig, JC (2013) Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res 53:242–248
Ritchie, S, Hanway, JJ, Benson, GO (1986) How a Corn Plant Develops. Ames: Iowa Cooperative Extension Special Report No. 48. 24 p
Spitters, CJT, Aerts, R (1983) Simulation of competition for light and water in crop-weed associations. Asp Appl Biol 4:467–483
Spitters, CJT, Kropff, MJ, Groot, W (1989) Competition between maize and Echinochloa crus-galli analysed by a hyperbolic regression model. Ann Appl Biol 115:541–551
Stone, M, Cralle, H, Chandler, J, Miller, T, Bovey, R, Carson, K (1998) Above- and belowground interference of wheat (Triticum aestivum) by Italian ryegrass (Lolium multiflorum). Weed Sci 46:438–441
Venkataraju, A, Arumugam, D, Stepan, C, Kiran, R, Peters, T (2023) A review of machine learning techniques for identifying weeds in corn. Smart Agric Technol 3:100102
Zhou, K, Jung, C, Yu, S (2020) Scale-aware multispectral fusion of RGB and NIR images based on alternating guidance. IEEE Access 8:173197–173207