
Spectral discrimination of crops and weeds using deep learning assisted by wavelet transform and statistical preprocessing

Published online by Cambridge University Press:  12 November 2024

Vahid Mohammadi
Affiliation:
Doctoral Student, Biosystems Engineering Department, Faculty of Agriculture, Tarbiat Modares University, Tehran, Iran, and ImViA, UFR Sciences et Techniques, Université de Bourgogne, Franche-Comté, Dijon, France
Saeid Minaei*
Affiliation:
Professor, Biosystems Engineering Department, Faculty of Agriculture, Tarbiat Modares University, Tehran, Iran
Pierre Gouton
Affiliation:
Professor, ImViA, UFR Sciences et Techniques, Université de Bourgogne, Franche-Comté, Dijon, France
Ali Reza Mahdavian
Affiliation:
Assistant Professor, Biosystems Engineering Department, Faculty of Agriculture, Tarbiat Modares University, Tehran, Iran
Mohammad Hadi Khoshtaghaza
Affiliation:
Professor, Biosystems Engineering Department, Faculty of Agriculture, Tarbiat Modares University, Tehran, Iran
*
Corresponding author: Saeid Minaei; Email: [email protected]

Abstract

Automatic detection and removal of weeds is a challenging task that requires precise sensors. Although crops and weeds can appear quite similar, they can be discriminated based on spectral information, because every object has its own specific spectral signature determined by its physical structure and chemical contents. This study examined the use of wavelet transform and deep learning for discrimination of weeds from crops. A total of 626 spectral reflectances in the range of 380 to 1,000 nm were obtained for three crops (cucumber [Cucumis sativus L.], tomato [Solanum lycopersicum L.], and bell pepper [Capsicum annuum L.]) and five different weeds (bindweed [Convolvulus spp.], purple nutsedge [Cyperus rotundus L.], narrowleaf plantain [Plantago lanceolata L.], common cinquefoil [Potentilla simplex Michx.], and garden sorrel [Rumex acetosa L.]). A Morse wavelet was employed to decompose the spectra and extract the scalograms, which are the RGB representations of the spectral data. Two deep convolutional neural networks (i.e., GoogLeNet and SqueezeNet) were employed for the recognition of crops and weeds. In addition, six common classifiers, including linear discriminant analysis, quadratic discriminant analysis, linear support vector machine, quadratic support vector machine, artificial neural networks, and the k-nearest neighbors classifier (KNN), were used for the task of crop/weed discrimination to provide a comparison with the proposed method. The error of prediction gradually decreased, and 100% correct classification was achieved after 258 iterations. Analysis showed that SqueezeNet provided 100% classification accuracy, while GoogLeNet’s accuracy was 97.8% on the test set. Among the common classifiers, KNN provided the highest accuracy (i.e., 100%). This study showed that the proposed method can be successfully utilized for classification of crops and weeds.

Type
Research Article
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of Weed Science Society of America

Introduction

Weeds, considered to be any unwanted plants in the field, not only affect the crops around them but can also jeopardize entire agricultural areas. Weeds compete for nutrients, soil, water, and space and should be detected and eliminated at an early stage. As the most important crop protection strategy, weed control can lead to a 20% increase in yield (Buddenhagen et al. Reference Buddenhagen, Gunnarsson, Rolston, Chynoweth, Bourdot and James2020). Traditional chemical weed control is expensive, and its cost can be reduced by more than 50% if novel technologies are employed (Gerhards et al. Reference Gerhards, Andujar Sanchez, Hamouz, Peteinatos, Christensen and Fernandez-Quintanilla2022). The use of herbicides also has environmental impacts, including potential pollution of soil, surface water, and groundwater (Agüera-Vega et al. Reference Agüera-Vega, Agüera-Puntas, Agüera-Vega, Martínez-Carricondo and Carvajal-Ramírez2021; Akbarzadeh et al. Reference Akbarzadeh, Paap, Ahderom, Apopei and Alameh2018; Le et al. Reference Le, Apopei and Alameh2019; Sabzi and Abbaspour-Gilandeh Reference Sabzi and Abbaspour-Gilandeh2018; Slaven et al. Reference Slaven, Koch and Borger2023; Sunil et al. Reference Sunil, Koparan, Ahmed, Zhang, Howatt and Sun2022). However, weed detection and control remain complicated tasks, as crops and weeds are quite similar in many respects, including color features, leaf shapes and forms, leaf patterns, and leaf/plant dimensions (Iqbal et al. Reference Iqbal, Khaliq and Cheema2020; Liu et al. Reference Liu, Li, Li, You, Yan and Tong2019; Sodjinou et al. Reference Sodjinou, Mohammadi, Mahama and Gouton2021). Recently, weed detection and separation from crops have advanced rapidly and benefited from modern solutions, including satellite-based detection (Rasmussen et al. Reference Rasmussen, Azim and Nielsen2021; Shanmugam et al. Reference Shanmugam, Assunção, Mesquita, Veiros and Gaspar2020; Shendryk et al. Reference Shendryk, Rossiter-Rachor, Setterfield and Levick2020), drone-based detection (Esposito et al. Reference Esposito, Crimaldi, Cirillo, Sarghini and Maggio2021; Liang et al. Reference Liang, Yang and Chao2019; Revanasiddappa et al. Reference Revanasiddappa, Arvind and Swamy2020), hyperspectral imaging (Che’Ya et al. Reference Che’Ya, Dunwoody and Gupta2021; Li et al. Reference Li, Al-Sarayreh, Irie, Hackell, Bourdot, Reis and Ghamkhar2021; Pignatti et al. Reference Pignatti, Casa, Harfouche, Huang, Palombo and Pascucci2019; Sulaiman et al. Reference Sulaiman, Che’Ya, Mohd-Roslim, Juraimi, Mohd-Noor and Fazlil-Ilahi2022), and multispectral imaging (Barrero and Perdomo Reference Barrero and Perdomo2018; Osorio et al. Reference Osorio, Puerto, Pedraza, Jamaica and Rodríguez2020).

Spectral detection can be a promising solution for crop/weed separation, based on the concept that every object in nature has its own spectral signature (Falcioni et al. Reference Falcioni, Moriwaki, Pattaro, Furlanetto, Nanni and Antunes2020; Putra Reference Putra2020). This spectral signature arises from the physical properties and the nutrient, chemical, and water contents of the object. These properties influence the amount of absorption and reflection of electromagnetic radiation, which can be used to distinguish crops from weeds. The use of spectral data in agricultural applications has been extensively researched. Applications include crop/weed discrimination (Fletcher et al. Reference Fletcher, Reddy and Turley2016; Gómez-Casero et al. Reference Gómez-Casero, Castillejo-González, García-Ferrer, Peña-Barragán, Jurado-Expósito, García-Torres and López-Granados2010; Kamath et al. Reference Kamath, Balachandra and Prabhu2020; Subeesh et al. Reference Subeesh, Bhole, Singh, Chandel, Rajwade, Rao, Kumar and Jat2022), disease detection (Cordon et al. Reference Cordon, Andrade, Barbara and Romero2021; Mahlein et al. Reference Mahlein, Steiner, Dehne and Oerke2010, Reference Mahlein, Rumpf, Welke, Dehne, Plümer, Steiner and Oerke2013; Shafri et al. Reference Shafri, Anuar, Seman and Noor2011), ripeness estimation (Silalahi et al. Reference Silalahi, Reaño, Lansigan, Panopio and Bantayan2016), estimation of plant nutrient deficiencies (Abdulridha et al. Reference Abdulridha, Ampatzidis, Ehsani and de Castro2018; Ayala-Silva and Beyl Reference Ayala-Silva and Beyl2005), classification of grass-dominated habitats (Bradter et al. Reference Bradter, O’Connell, Kunin, Boffey, Ellis and Benton2020), plant species/varieties discrimination (Manevski et al. Reference Manevski, Manakos, Petropoulos and Kalaitzidis2011; Prospere et al. Reference Prospere, McLaren and Wilson2014; Ullah et al. Reference Ullah, Schlerf, Skidmore and Hecker2012; Vaiphasa et al. Reference Vaiphasa, Skidmore, de Boer and Vaiphasa2007; Yu et al. Reference Yu, Schumann, Sharpe, Li and Boyd2020), distinguishing herbicide-resistant plants (Jones et al. Reference Jones, Austin, Dunne, Leon and Everman2023), and classifying forest logging residue (Acquah et al. Reference Acquah, Via, Billor and Fasina2016). In all these applications, the discrimination or detection technique was built on the specific spectral reflection of plants or plant organs: there were one or several wavelengths at which the reflectance of electromagnetic energy differed between the healthy crop and the weed, the diseased crop, or the malnourished crop. In this regard, hyperspectral data analysis can provide promising tools that are fast and generalizable and can be integrated into semi-automated analysis procedures (Hennessy et al. Reference Hennessy, Clarke and Lewis2020). Another advantage of spectral datasets is the potential for detailed analysis of spectral reflectance, which derives from the biochemical and biophysical attributes of plants. However, a disadvantage of hyperspectral analysis is that processing the data can be difficult owing to its high dimensionality. In addition, the effort required to obtain sufficient samples and the high cost of spectral measurements are among the limitations of hyperspectral technologies (Adelabu et al. Reference Adelabu, Mutanga, Adam and Sebego2013).
While spectral data have been quite critical for species discrimination, a disadvantage is the redundant information within high-resolution spectral data (Nagasubramanian et al. Reference Nagasubramanian, Jones, Singh, Sarkar, Singh and Ganapathysubramanian2019).

Among the techniques most commonly used for the analysis and classification of spectral data are the k-nearest neighbors classifier (KNN), linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), principal component analysis, the Normalized Difference Vegetation Index (NDVI), the Fourier transform, the Jeffries–Matusita distance measure, support vector machines (SVMs), and artificial neural networks (ANNs) (Bell and Baranoski Reference Bell and Baranoski2004; Durgante et al. Reference Durgante, Higuchi, Almeida and Vicentini2013; Longchamps et al. Reference Longchamps, Panneton, Samson, Leroux and Thériault2010; Louargant et al. Reference Louargant, Jones, Faroux, Paoli, Maillot, Gée and Villette2018; Noble and Brown Reference Noble and Brown2009; Strothmann et al. Reference Strothmann, Ruckelshausen, Hertzberg, Scholz and Langsenkamp2017; Talaviya et al. Reference Talaviya, Shah, Patel, Yagnik and Shah2020; Zarco-Tejada et al. Reference Zarco-Tejada, Camino, Beck, Calderon, Hornero, Hernández-Clemente, Kattenborn, Montes-Borrego, Susca, Morelli and Gonzalez-Dugo2018). Symonds et al. (Reference Symonds, Paap, Alameh, Rowe and Miller2015) developed a real-time plant discrimination system based on discrete reflectance spectroscopy. In their study, three laser diodes (635, 685, and 785 nm) were used, and it was reported that the system could perform practical discrimination at a vehicle speed of 3 km h−1. In a recent work, Nidamanuri (Reference Nidamanuri2020) used machine learning to discriminate tea (Camellia sinensis (L.) Kuntze) plant varieties. Canopy-level hyperspectral reflectance measurements were acquired for tea and natural plant species in the range of 350 to 2,500 nm. The classifier could successfully discriminate six out of nine tea plant varieties, with accuracies between 75% and 80%.

Recently, attention has been paid to the implementation and improvement of convolutional neural networks (CNNs) for classification purposes. An advantage of CNNs is that they learn features on their own during training, which allows them to discriminate unseen samples with a high performance rate (Garibaldi-Márquez et al. Reference Garibaldi-Márquez, Flores, Mercado-Ravell, Ramírez-Pedraza and Valentín-Coronado2022). Andrea et al. (Reference Andrea, Daniel and Misael2017) discriminated between maize (Zea mays L.) and weeds using CNNs. They evaluated the LeNET, AlexNet, cNET, and sNET architectures, and cNET gave the best performance in terms of accuracy (95.05%) and processing time (2.34 ms). Xi et al. (Reference Xi, Li, Su, Tian, Zhang, Sun, Long, Wan and Qian2020) proposed a network called MmNet, which combines the local response normalization of AlexNet with GoogLeNet and VGG inception modules. The proposed MmNet led to an accuracy of 94.50% and a time cost of 10.369 s. Nguyen et al. (Reference Nguyen, Sagan, Maimaitiyiming, Maimaitijiang, Bhadra and Kwasniewski2021) used SVM and random forest (RF) techniques for disease detection in grapevine (Vitis vinifera L.) plants based on hyperspectral data in the range of 400 to 1,000 nm. The SVM classifier performed better for vegetation index-wise classification, while the RF classifier showed better results for pixel-wise and image-wise classification. Garibaldi-Márquez et al. (Reference Garibaldi-Márquez, Flores, Mercado-Ravell, Ramírez-Pedraza and Valentín-Coronado2022) studied the use of shallow and deep learning techniques for the discrimination of crops and weeds. RGB images were captured under field conditions and at different locations in cornfields with three different weeds present. VGG16, VGG19, and Xception models were trained and tested, leading to accuracies of 97.93%, 97.44%, and 97.24%, respectively. In a recent work, Wang et al. (Reference Wang, Chen, Ju, Lin, Wang and Wang2023) took advantage of CNNs for the classification of weed species based on hyperspectral (HS) images. The study was based on a database of HS images of 40 weed species. Preprocessing was applied to the data, and a best accuracy of 98.15% was achieved.

The use of deep learning techniques and spectral data can facilitate the detection of weeds in agricultural fields. This will lead to the precise detection of weeds using a noncontact and noninvasive method. This study evaluates a method based on wavelet transform and deep networks for the separation of crops and weeds and compares it with the traditional classifiers.

Materials and Methods

Instrumentation and Measurements

Three crops, namely, cucumber (Cucumis sativus L.), tomato (Solanum lycopersicum L.), and bell pepper (Capsicum annuum L.), and five weed species, including bindweed (Convolvulus spp.), purple nutsedge (Cyperus rotundus L.), narrowleaf plantain (Plantago lanceolata L.), common cinquefoil (Potentilla simplex Michx.), and garden sorrel (Rumex acetosa L.), were used for this study. Leaves were taken from different parts of young plants of different sizes. Samples were taken from plants in the vegetative and flowering stages of growth, with almost the same number of samples for each growth stage. For each plant, more than 70 samples were obtained, for a total of 626 samples. The plants (with the soil and roots) were removed from the farm and quickly transferred to the laboratory. All measurements were made under the same conditions. For illumination, one type A lamp and one halogen lamp were used (Figure 1). Spectral reflectances in the range of 380 to 1,000 nm were obtained using a Specbos 1211 spectroradiometer (JETI Technische Instrumente GmbH, Jena, Germany). This instrument is a noncontact spectroradiometer connected to a PC via a USB port. Its optical bandwidth is 4.5 nm, and its illuminance measuring range is 1 to 1,500,000 lx. As shown in Figure 1, the spectroradiometer was set at an angle of 90° relative to the leaves, and the 2° standard observer was used for the measurements.

Figure 1. The measurement system and illumination setup: 1, light source; 2, spectroradiometer; 3, sample; 4, laptop computer.

Preprocessing

Statistical Pretreatment

Preprocessing of high-dimensional data normally leads to better discovery of relationships and trends in the data. First, the noisy beginning of each spectrum was removed. Then, the data were denoised using a smoothing filter (i.e., a Savitzky-Golay filter). Next, standard normal variate (SNV) normalization was applied. Finally, the first derivative and mean centering were applied to the data.
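
The chain of pretreatments can be expressed compactly. The following MATLAB sketch shows one possible implementation, assuming the spectra are stored row-wise in a matrix X; the filter order, window length, and trimming index are illustrative assumptions, not the values used in the study.

```matlab
% Illustrative pretreatment chain for a matrix X (samples x wavelengths).
% The trimming index and Savitzky-Golay settings below are assumptions.
X    = X(:, 21:end);                          % drop the noisy beginning of each spectrum (index assumed)
Xs   = sgolayfilt(X, 2, 11, [], 2);           % Savitzky-Golay smoothing along the wavelength axis
Xsnv = (Xs - mean(Xs, 2)) ./ std(Xs, 0, 2);   % standard normal variate, applied per spectrum
Xd1  = diff(Xsnv, 1, 2);                      % first derivative (finite difference across wavelengths)
Xmc  = Xd1 - mean(Xd1, 1);                    % mean centering per wavelength, across samples
```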

Continuous Wavelet Transform

The statistical pretreatment described above was not suitable as input for the convolutional neural networks described in the next section. For those networks, the continuous wavelet transform (CWT) was used for preprocessing. The CWT decomposes a signal into wavelets and is well suited for mapping the changing properties of nonstationary signals. The basis functions of the CWT are scaled and shifted versions of the mother wavelet. The transformation is defined as follows:

([1]) $$C(a,\tau) = \int \frac{1}{a}\,\psi\!\left(\frac{t-\tau}{a}\right)x(t)\,dt$$

Based on Equation 1, the wavelet $\psi(t)$ is shifted by $\tau$ and scaled by the factor $a$. In this study, a Morse wavelet of the following form was used:

([2]) $$\psi(\omega) = U(\omega)\,a_{p,\gamma}\,\omega^{p^{2}/\gamma}\,e^{-\omega^{\gamma}}$$

where $U(\omega)$ is the unit step function and $a_{p,\gamma}$ is a normalizing constant. The parameter $\gamma$, which controls the symmetry of the wavelet, was set to 3, and $p$, the square root of the time–bandwidth product (which is proportional to the wavelet duration), was set to $\sqrt{60}$. The CWT was applied to all spectral reflectances, and a database of scalograms was constructed. These scalograms, in the form of 2D images, were used for training the networks and for classification. Figure 2 provides an example of a scalogram randomly chosen from the bell pepper samples.

Figure 2. A spectral reflectance of a bell pepper plant leaf (A) and its scalogram of continuous wavelet transform (CWT) colored as an RGB image (B).
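
As a rough illustration of how such a scalogram can be produced, the MATLAB sketch below converts one reflectance vector into an RGB image using the Wavelet Toolbox Morse filter bank with the symmetry (gamma = 3) and time-bandwidth (60) settings given above; the colormap, output size, and file name are assumptions made to match the CNN input, not the authors' exact code.

```matlab
% Sketch: one spectrum -> Morse CWT scalogram -> RGB image (jet colormap assumed).
x  = reflectance(:);                                   % one spectral reflectance vector
fb = cwtfilterbank('SignalLength', numel(x), ...
                   'Wavelet', 'Morse', ...
                   'WaveletParameters', [3 60]);       % gamma = 3, time-bandwidth = 60 (p = sqrt(60))
cfs = abs(wt(fb, x));                                  % CWT coefficient magnitudes (the scalogram)
img = ind2rgb(im2uint8(rescale(cfs)), jet(128));       % map magnitudes to an RGB image
img = imresize(img, [224 224]);                        % resize to the CNN input size
imwrite(img, 'pepper_sample.jpg');                     % store for the image datastore
```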

Classification Techniques

Common Classifiers

For comparison purposes, six common classifiers were employed for the task of discrimination of crops/weeds. These techniques include LDA, QDA, linear support vector machine (LSVM), quadratic support vector machine (QSVM), ANNs, and fine k-nearest neighbors (FKNN). Table 1 presents the technical details of these methods.

Table 1. Technical details of the common classifiers used for discrimination of plants/weeds
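
For reference, a minimal MATLAB sketch of the six conventional classifiers with 5-fold cross-validation on the pretreated spectra is given below; X is the feature matrix, Y the label vector, and the hyperparameters shown (e.g., one neighbor for fine KNN, one hidden layer of 10 neurons for the ANN) are typical presets assumed for illustration rather than the study's exact settings.

```matlab
% Sketch: conventional classifiers with 5-fold cross-validation.
% X: pretreated spectra (samples x wavelengths), Y: categorical labels (crop/weed).
% Note: fitcnet requires R2021a or later and stands in here for the study's ANN.
models = {
    fitcdiscr(X, Y, 'DiscrimType', 'linear')                              % LDA
    fitcdiscr(X, Y, 'DiscrimType', 'quadratic')                           % QDA
    fitcsvm(X, Y, 'KernelFunction', 'linear')                             % LSVM
    fitcsvm(X, Y, 'KernelFunction', 'polynomial', 'PolynomialOrder', 2)   % QSVM
    fitcnet(X, Y, 'LayerSizes', 10)                                       % shallow ANN (size assumed)
    fitcknn(X, Y, 'NumNeighbors', 1)                                      % fine KNN
};
for k = 1:numel(models)
    cv = crossval(models{k}, 'KFold', 5);                                 % 5-fold partitioned model
    fprintf('%s: 5-fold accuracy = %.1f%%\n', class(models{k}), 100*(1 - kfoldLoss(cv)));
end
```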

Convolutional Neural Network Classifiers

GoogLeNet was utilized in this study to verify its ability to classify crops and weeds based on the spectral data. This pretrained network was used for two reasons. First, it is a powerful network trained on a large database covering more than 1,000 categories. Second, using this network saved time, as it eliminated the trial and error of building new networks. In addition, a pretrained network can be reused by other researchers working in the same field.

GoogLeNet is a convolutional network that is 22 layers deep and includes 7 pooling layers, with nine inception modules stacked linearly. The training uses asynchronous stochastic gradient descent with a momentum of 0.9. An initial learning rate of 1 × 10−4, the L2-norm gradient threshold method, and a maximum of 20 epochs were used for training the network. The inputs to GoogLeNet, which are the outputs of the CWT, must be 224 × 224 × 3 RGB image arrays. To avoid overfitting, a dropout layer was employed that randomly sets input elements to zero with a given probability. The flowchart of the proposed method is shown in Figure 3: the Morse wavelet is applied to the signals, scalograms (the RGB representations of the spectral reflectances) are extracted, these RGB images are used to retrain the CNN, and the resulting classifier is then applied to new samples.

Figure 3. Block diagram of the proposed algorithm for the separation of crops and weeds.
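
A minimal MATLAB sketch of this GoogLeNet transfer-learning setup follows; the layer names ('loss3-classifier', 'output') are those of the pretrained GoogLeNet shipped with the Deep Learning Toolbox, and the datastore variable names (imdsTrain, imdsVal) are assumptions rather than the authors' code.

```matlab
% Sketch: adapting pretrained GoogLeNet to the 2-class crop/weed scalograms.
% imdsTrain / imdsVal are imageDatastore objects of 224 x 224 x 3 scalogram images (names assumed).
net    = googlenet;
lgraph = layerGraph(net);
lgraph = replaceLayer(lgraph, 'loss3-classifier', ...
           fullyConnectedLayer(2, 'Name', 'fc_cropweed'));   % two output classes: crop, weed
lgraph = replaceLayer(lgraph, 'output', ...
           classificationLayer('Name', 'out_cropweed'));
opts = trainingOptions('sgdm', ...                            % SGD with momentum (default 0.9)
    'InitialLearnRate', 1e-4, ...
    'MaxEpochs', 20, ...
    'GradientThresholdMethod', 'l2norm', ...
    'ValidationData', imdsVal, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'training-progress');
trainedGoogLe = trainNetwork(imdsTrain, lgraph, opts);
```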

SqueezeNet is a CNN that is 18 layers deep. Like GoogLeNet, it is pretrained on more than 1,000 categories. The size of its input images is 227 × 227 × 3. The weight learning rate factor was set to 10, and the last learnable layer was replaced with a convolutional layer with two filters. A bias learning rate factor of 10 was chosen. For training, the mini-batch size, maximum number of epochs, initial learning rate, and optimizer were chosen as 10, 15, 3 × 10−4, and stochastic gradient descent with momentum, respectively.
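
A corresponding sketch for the SqueezeNet adaptation, using the hyperparameters listed above, is shown below; the layer names ('conv10', 'ClassificationLayer_predictions') follow the pretrained model in the Deep Learning Toolbox, and the 227 × 227 datastore names are assumptions.

```matlab
% Sketch: adapting pretrained SqueezeNet (input 227 x 227 x 3) to two classes.
net     = squeezenet;
lgraph  = layerGraph(net);
newConv = convolution2dLayer(1, 2, 'Name', 'conv_cropweed', ...
    'WeightLearnRateFactor', 10, 'BiasLearnRateFactor', 10);      % 1x1 conv with two filters
lgraph  = replaceLayer(lgraph, 'conv10', newConv);
lgraph  = replaceLayer(lgraph, 'ClassificationLayer_predictions', ...
    classificationLayer('Name', 'out_cropweed'));
opts = trainingOptions('sgdm', ...
    'MiniBatchSize', 10, ...
    'MaxEpochs', 15, ...
    'InitialLearnRate', 3e-4, ...
    'ValidationData', imdsVal227, ...                              % 227 x 227 x 3 scalograms (name assumed)
    'Shuffle', 'every-epoch');
trainedSqueeze = trainNetwork(imdsTrain227, lgraph, opts);
```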

Programming and Analysis

In this study, the data were randomly divided into three groups: training, validation, and testing. Of the data, 70% was used for training, 15% for validation, and 15% for testing; the test set was not presented to the algorithms during training (i.e., unseen data). All the programming was done using MATLAB (R2019b, MathWorks, Natick, MA, USA) and Microsoft Excel 2016 (Microsoft, Redmond, WA, USA). The processing and analysis were performed on a PC with an Intel® Core™ i7 processor and 16 GB of RAM.
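
The random 70/15/15 split can be performed directly on the scalogram image datastore; a minimal sketch is given below, where the folder layout and random seed are assumptions.

```matlab
% Sketch: random 70/15/15 split of the scalogram images (folder names = class labels).
rng(1);                                               % seed assumed, for reproducibility
imds = imageDatastore('scalograms', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsVal, imdsTest] = splitEachLabel(imds, 0.70, 0.15, 'randomized');  % remainder = 15% test
```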

Results and Discussion

The spectral reflectances of crop and weed leaves are remarkably similar, which makes the discrimination of crops and weeds difficult. Figure 4 presents the spectral reflectance of the bell pepper plant and the five weeds. It is evident that techniques for reducing the data volume, or efficient classification techniques, are necessary. As seen in Figure 4, most of the relevant information lies between 500 and 750 nm. In the blue region of the spectrum, there is little change in the spectral reflectances, and absorbance is close to 1. In a study on the discrimination of weeds (i.e., spurge [Euphorbia spp.] and purple loosestrife [Lythrum salicaria L.]) from the surrounding vegetation, Hom et al. (Reference Hom, Bajwa, Lym and Nowatzki2020) found the significant spectral bands in the same regions. Sayed Yones et al. (Reference Sayed Yones, Amin Aboelghar, Ali Khdery, Massoud Ali, Hussien Salem, Farag and Ahmed Mahmoud Mamon2019) also observed that good discrimination of healthy/infested plants could be obtained in the green and red parts of the spectrum when monitoring sugar beet (Beta vulgaris L.) infestation. In addition, over a large part of the near-infrared (NIR) region there is little fluctuation, and most of the energy is reflected. This is expected, as plants use the visible part of the spectrum for photosynthesis and other metabolic processes (Hua et al. Reference Hua, Lin, Guo, Fan, Zhang, Yang, Hu and Zhu2019; Mahlein et al. Reference Mahlein, Rumpf, Welke, Dehne, Plümer, Steiner and Oerke2013; Su Reference Su2020).

Figure 4. The spectral reflectance of plant leaves: (A) bell pepper, (B) Convolvulus spp., (C) Cyperus rotundus, (D) Plantago lanceolata, (E) Potentilla simplex, and (F) Rumex acetosa.

Pretreatment

To remove random noise in the data, the spectra were smoothed; this pretreatment has been reported to be efficient in other works (Huang et al. Reference Huang, Li, Yang, Wang, Li, Zhang, Wan, Qiao and Qian2021; Jiang et al. Reference Jiang, Steven, He, Chen, Du and Guo2015; Yang et al. Reference Yang, Yang, Hao, Xie and Li2019). Afterward, the spectra were normalized, followed by the first derivative and mean centering. These techniques help to remove irrelevant information and to better represent data trends (Türker-Kaya and Huck Reference Türker-Kaya and Huck2017). Recently, Amirvaresi et al. (Reference Amirvaresi, Nikounezhad, Amirahmadi, Daraei and Parastar2021) reported that mean centering and the second derivative gave the best performance for saffron (Crocus sativus L.) authentication and adulteration detection based on NIR and mid-infrared (MIR) spectroscopy. The choice of preprocessing and the combination of techniques is therefore a critical step. This preprocessing produced a diagram that clearly represents the differences between the spectra of crops and weeds. As Figure 5 shows, the average spectrum of the crops has zones that differ markedly from those of the weeds: the peak of the crop spectrum is at 735 nm, where the weed spectrum has a trough, while the peak for the weeds is at 695 nm. The spectra preprocessed by smoothing were then used as the input for the six traditional classifiers.

Figure 5. The average of preprocessed spectra of crops and weeds. The spectra were smoothed, normalized, and mean-centered.

Traditional and Deep Classifiers

The CWT was used as the preprocessing step for the deep networks. Table 2 presents the validation and test accuracies achieved by each classifier. As can be observed, the proposed method using SqueezeNet led to complete separation of crops and weeds for both validation and test samples, whereas GoogLeNet achieved an accuracy of 97.8%. Among the traditional classifiers, FKNN also led to complete separation. Next, LDA and QSVM showed the best performance; both had a 5-fold validation accuracy of 99.6% and a test accuracy of 100%. The QDA technique ranked last, with validation and test accuracies of 82.5% and 86.6%, respectively. Comparison of the training times shows that GoogLeNet (106.63 min) and SqueezeNet (26.85 min) required the most time for training (Table 2). The main difference between the deep networks and the traditional classifiers is that the networks require converting the spectra to images and then using the images for training, which takes significant time.

Table 2. Comparison of validation and test accuracies for different classifiers

a LDA, linear discriminant analysis; QDA, quadratic discriminant analysis; LSVM, linear support vector machine; QSVM, quadratic support vector machine; ANN, artificial neural network; FKNN, fine k-nearest neighbors.

Compared with previous research, the performance of SqueezeNet and FKNN is remarkable. Nidamanuri (Reference Nidamanuri2020) utilized ANNs for the spectral discrimination of tea plant varieties. In that work, ANN was compared with other methods, including KNN, LDA, SVMs, and a normalized spectral similarity score. SVM, as a machine learning technique, led to higher classification accuracies, followed by LDA. It was reported that six out of nine varieties could be discriminated with accuracies ranging between 75% and 80%, and that including natural tea plants increased the variability of the spectral data and reduced the classification accuracy. Shirzadifar et al. (Reference Shirzadifar, Bajwa, Mireei, Howatt and Nowatzki2018) used the soft independent modeling of class analogy method to discriminate three weeds based on spectral data. They observed that preprocessing was necessary for achieving proper results; five preprocessing methods were evaluated, and the second derivative was found to be effective. The authors reported the NIR region as the best region for the discrimination, and the proposed method could discriminate the three weed species with 100% accuracy for 63 samples. Jiang et al. (Reference Jiang, Zhang, Qiao, Zhang, Zhang and Song2020) proposed a graph convolutional network for crop and weed recognition. Their network achieved accuracies of 97.80%, 99.37%, 98.93%, and 96.51% for four different datasets and outperformed AlexNet, VGG16, and ResNet-101. De Souza et al. (Reference De Souza, do Amaral, de Medeiros Oliveira, Coutinho and Netto2020) studied the differentiation of sugarcane (Saccharum officinarum L.) from weeds based on spectral data and soft independent modeling. They observed that selecting only four significant bands in the VIS-NIR range could give the same results as the whole spectrum, and their method obtained an accuracy of 97.4%. In a recent work, Su et al. (Reference Su, Yi, Coombes, Liu, Zhai, McDonald-Maier and Chen2022) mapped blackgrass (Alopecurus myosuroides Huds.) in wheat (Triticum aestivum L.) fields using multispectral images and deep learning. For the classification task, RF with Bayesian hyperparameter optimization was used. This work led to an accuracy of 93%, and the most discriminant spectral index combined the green and NIR bands.

The training process of SqueezeNet in the present study shows that training proceeded well (Figure 6). In this figure, the most important element is the validation curve, which improves while following the training data. From iteration 258 onward, the network discriminated the crops and weeds with 100% accuracy. Table 3 provides the details of the training of the network. As the table indicates, in the 6th epoch, when the validation accuracy reaches 100%, the validation loss is already quite small, and by the 13th epoch it reaches 0.0003. The mini-batch accuracy, which represents the training accuracy for the mini-batches or subbatches (if the whole dataset is considered one batch), is also provided; it shows that training becomes stable after the fourth epoch. Figure 7 presents the value of the loss function for each iteration. Minimization of the loss function is based on the gradient descent algorithm: in every iteration, the gradient of the loss function is evaluated, and the weights are then updated by the descent algorithm. The figure shows that training improved steadily and that the loss value for the validation data gradually decreased while following the training data, indicating that the learning process was performed correctly.

Figure 6. Diagram showing the SqueezeNet training process and accuracy per iteration.

Table 3. The details of the training process of the network for each learning epoch

Figure 7. Diagram of loss function values per iteration during the SqueezeNet training process.

The confusion matrices describing the performance of the networks are provided in Figure 8. In these matrices, the output class is the predicted classification, and the target class refers to the actual class. The algorithm randomly chose 34 crop samples and 61 weed samples as test spectra, all of which were classified correctly by SqueezeNet. Akbarzadeh et al. (Reference Akbarzadeh, Paap, Ahderom, Apopei and Alameh2018) utilized SVM for the discrimination of crops and weeds based on spectral data. They reported that their Gaussian SVM algorithm could classify the plants with a success rate of 97%; spectral data were obtained at three wavelengths, and the SVM was combined with the Normalized Difference Vegetation Index (NDVI). Rock et al. (Reference Rock, Gerhards, Schlerf, Hecker and Udelhoven2016) performed the discrimination of eight plant species using emissive thermal infrared spectroscopy. The hyperspectral images were acquired in the range of 7.8 to 11.6 µm at 40-nm resolution, and the overall discrimination accuracy was 92.26%. In a recent work, Jin et al. (Reference Jin, Bagavathiannan, Maity, Chen and Yu2022) compared GoogLeNet, MobileNet-v3, ShuffleNet-v2, and VGGNet for the discrimination of weeds. ShuffleNet-v2 and VGGNet showed higher overall accuracies (≥0.999); however, ShuffleNet-v2 and MobileNet-v3 were remarkably faster than GoogLeNet and VGGNet.

Figure 8. Confusion matrix of the networks representing true and false classifications: (A) GoogLeNet and (B) SqueezeNet.
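
For completeness, a small MATLAB sketch of how such a confusion matrix can be produced for the trained network on the held-out scalograms is given below; the variable names follow the earlier sketches and are assumptions.

```matlab
% Sketch: evaluate the trained network on the unseen test images and plot the confusion matrix.
predLabels = classify(trainedSqueeze, imdsTest227);    % predicted classes for the test scalograms
trueLabels = imdsTest227.Labels;
testAcc = mean(predLabels == trueLabels);
fprintf('Test accuracy: %.1f%%\n', 100*testAcc);
confusionchart(trueLabels, predLabels);                % rows: true class, columns: predicted class
```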

An advantage of spectral data for the discrimination of plants is that they are independent of illumination, because the spectral reflectance of each object is specific and acts as a fingerprint. Therefore, the spectral responses of plants can be measured on-farm and used for discrimination purposes in agricultural applications. Other techniques that have recently been employed for plant discrimination and weed detection are multispectral/hyperspectral imaging, 3D modeling of plants, and light detection and ranging (LiDAR) (Sandoval et al. Reference Sandoval, Gor, Ramallo, Sfer, Colombo, Vilaseca, Pujol, Caivano and Buera2012; Andújar et al. Reference Andújar, Calle, Fernández-Quintanilla, Ribeiro and Dorado2018; Jarocińska et al. Reference Jarocińska, Kopeć, Tokarska-Guzik and Raczko2021; Jin et al. Reference Jin, Bagavathiannan, Maity, Chen and Yu2022; Reiser et al. Reference Reiser, Vázquez-Arellano, Paraforos, Garrido-Izard and Griepentrog2018; Su et al. Reference Su, Fennimore and Slaughter2019). Barrero and Perdomo (Reference Barrero and Perdomo2018) fused multispectral and RGB images for weed detection. Their analysis showed that the Normalized Green–Red Difference Index provided better features than NDVI. Their preprocessing included transformation of the RGB images to hue, intensity, and saturation and use of the Haar transform, and the best weed detection performance, obtained using a neural network, corresponded to a detected weed area of between 80% and 108%. In the work by Özlüoymak (Reference Özlüoymak2020) on the use of stereo imaging for the detection of crops and weeds, artificial plants, including one crop and six weeds, were utilized; the proposed technique led to R2 values of 0.962 and 0.978 for the detection of crops and weeds, respectively. In a recent study, Shahbazi et al. (Reference Shahbazi, Ashworth, Callow, Mian, Beckie, Speidel, Nicholls and Flower2021) examined the ability of LiDAR sensors to detect weeds. They reported that the ability to detect weeds at different scanning distances from the sensor depended significantly on the size of the target and its orientation toward the LiDAR, and that LiDAR could detect 100% of the weeds based on their height differences relative to the plant canopy. Tao and Wei (Reference Tao and Wei2022) used a hybrid CNN-SVM classifier for weed recognition. For the deep CNN, the VGG network, trained on true-color images, was employed; the VGG-SVM classifier achieved an accuracy of 92.1% for the separation of winter rape (Brassica napus L.) seedlings and four weeds.

This study showed that spectral data are a proper tool for the discrimination of crops and weeds. The spectral reflectances of leaves of three crops (cucumber, tomato, and bell pepper) and five weeds (Convolvulus spp., C. rotundus, P. lanceolata, P. simplex, and R. acetosa) were obtained in the wavelength range of 380 to 1,000 nm. The classification performance of two deep CNNs and six common classifiers was investigated and compared, and two types of preprocessing (i.e., statistical pretreatment and wavelet transform) were used to achieve the best performance of each technique. The use of the continuous wavelet transform for dimensionality reduction of the spectral data was quite successful, and the proposed method using SqueezeNet discriminated crops and weeds with 100% accuracy. This study demonstrates the successful use of spectral data for accurate discrimination of various crops and weeds based on their spectral signatures. Future studies may consider the generalization of the technique: the use of a larger dataset with many different types of crops and weeds would allow the development of a robust classifier for crop/weed separation. It is also suggested that the classifier be integrated into real-time weed detection systems so that the technique can be evaluated in the field.

Acknowledgments

The authors cordially appreciate the support of the ImViA laboratory, University of Burgundy, France, for the instrumentation and laboratory facilities provided for this research. This work is part of a joint PhD study between Tarbiat Modares University, Tehran, Iran, and the University of Burgundy, Dijon, France.

Funding

This research received no specific grant from any funding agency or the commercial or not-for-profit sectors.

Competing interests

The authors declare no competing interests.

Footnotes

Associate Editor: William Vencill, University of Georgia

References

Abdulridha, J, Ampatzidis, Y, Ehsani, R, de Castro, AI (2018) Evaluating the performance of spectral features and multivariate analysis tools to detect laurel wilt disease and nutritional deficiency in avocado. Comput Electron Agric 155:203–211
Acquah, GE, Via, BK, Billor, N, Fasina, OO, Eckhardt, LG (2016) Identifying plant part composition of forest logging residue using infrared spectral data and linear discriminant analysis. Sensors 16:1375–1390
Adelabu, S, Mutanga, O, Adam, E, Sebego, R (2013) Spectral discrimination of insect defoliation levels in mopane woodland using hyperspectral data. IEEE J Sel Top Appl Earth Obs Remote Sens 7:177–186
Agüera-Vega, F, Agüera-Puntas, M, Agüera-Vega, J, Martínez-Carricondo, P, Carvajal-Ramírez, F (2021) Multi-sensor imagery rectification and registration for herbicide testing. Measurement 175:109049
Akbarzadeh, S, Paap, A, Ahderom, S, Apopei, B, Alameh, K (2018) Plant discrimination by Support Vector Machine classifier based on spectral reflectance. Comput Electron Agric 148:250–258
Amirvaresi, A, Nikounezhad, N, Amirahmadi, M, Daraei, B, Parastar, H (2021) Comparison of near-infrared (NIR) and mid-infrared (MIR) spectroscopy based on chemometrics for saffron authentication and adulteration detection. Food Chem 344:128647
Andrea, CC, Daniel, BB, Misael, JB (2017) Precise weed and maize classification through convolutional neuronal networks. Pages 1–6 in 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), 16–20 October 2017. Salinas, Ecuador: IEEE
Andújar, D, Calle, M, Fernández-Quintanilla, C, Ribeiro, Á, Dorado, J (2018) Three-dimensional modeling of weed plants using low-cost photogrammetry. Sensors 18:1077
Ayala-Silva, T, Beyl, CA (2005) Changes in spectral reflectance of wheat leaves in response to specific macronutrient deficiency. Adv Space Res 35:305–317
Barrero, O, Perdomo, SA (2018) RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields. Precis Agric 19:809–822
Bell, IE, Baranoski, GV (2004) Reducing the dimensionality of plant spectral databases. IEEE Trans Geosci Remote Sens 42:570–576
Bradter, U, O’Connell, J, Kunin, WE, Boffey, CW, Ellis, RJ, Benton, TG (2020) Classifying grass-dominated habitats from remotely sensed data: the influence of spectral resolution, acquisition time and the vegetation classification system on accuracy and thematic resolution. Sci Total Environ 711:134584
Buddenhagen, CE, Gunnarsson, M, Rolston, P, Chynoweth, RJ, Bourdot, G, James, TK (2020) Costs and risks associated with surveying the extent of herbicide resistance in New Zealand. NZ J Agric Res 63:430–448
Che’Ya, NN, Dunwoody, E, Gupta, M (2021) Assessment of weed classification using hyperspectral reflectance and optimal multispectral UAV imagery. Agronomy 11:1435
Cordon, G, Andrade, C, Barbara, L, Romero, AM (2021) Early detection of tomato bacterial canker by reflectance indices. Inf Process Agric 9:184–194
De Souza, MF, do Amaral, LR, de Medeiros Oliveira, SR, Coutinho, MAN, Netto, CF (2020) Spectral differentiation of sugarcane from weeds. Biosyst Eng 190:41–46
Durgante, FM, Higuchi, N, Almeida, A, Vicentini, A (2013) Species spectral signature: discriminating closely related plant species in the Amazon with near-infrared leaf-spectroscopy. For Ecol Manag 291:240–248
Esposito, M, Crimaldi, M, Cirillo, V, Sarghini, F, Maggio, A (2021) Drone and sensor technology for sustainable weed management: a review. Chem Biol Technol Agric 8:111
Falcioni, R, Moriwaki, T, Pattaro, M, Furlanetto, RH, Nanni, MR, Antunes, WC (2020) High resolution leaf spectral signature as a tool for foliar pigment estimation displaying potential for species differentiation. J Plant Physiol 249:153161
Fletcher, RS, Reddy, KN, Turley, RB (2016) Spectral discrimination of two pigweeds from cotton with different leaf colors. Am J Plant Sci 7:2138–2151
Garibaldi-Márquez, F, Flores, G, Mercado-Ravell, DA, Ramírez-Pedraza, A, Valentín-Coronado, LM (2022) Weed classification from natural corn field-multi-plant images based on shallow and deep learning. Sensors 22:3021
Gerhards, R, Andujar Sanchez, D, Hamouz, P, Peteinatos, GG, Christensen, S, Fernandez-Quintanilla, C (2022) Advances in site-specific weed management in agriculture—a review. Weed Res 62:123–133
Gómez-Casero, MT, Castillejo-González, IL, García-Ferrer, A, Peña-Barragán, JM, Jurado-Expósito, M, García-Torres, L, López-Granados, F (2010) Spectral discrimination of wild oat and canary grass in wheat fields for less herbicide application. Agron Sustain Dev 30:689–699
Hennessy, A, Clarke, K, Lewis, M (2020) Hyperspectral classification of plants: a review of waveband selection generalisability. Remote Sens 12:113
Hom, KMH, Bajwa, SG, Lym, RG, Nowatzki, JF (2020) Discrimination of leafy spurge (Euphorbia esula) and purple loosestrife (Lythrum salicaria) based on field spectral data. Weed Technol 34:250–259
Hua, W, Lin, Z, Guo, D, Fan, G, Zhang, Y, Yang, K, Hu, Q, Zhu, L (2019) Simulated long-term vegetation–climate feedbacks in the Tibetan Plateau. Asia-Pac J Atmos Sci 55:41–52
Huang, Y, Li, J, Yang, R, Wang, F, Li, Y, Zhang, S, Wan, F, Qiao, X, Qian, W (2021) Hyperspectral imaging for identification of an invasive plant Mikania micrantha Kunth. Front Plant Sci 12:626516
Iqbal, N, Khaliq, A, Cheema, ZA (2020) Weed control through allelopathic crop water extracts and S-metolachlor in cotton. Inf Process Agric 7:165–172
Jarocińska, A, Kopeć, D, Tokarska-Guzik, B, Raczko, E (2021) Intra-annual variabilities of Rubus caesius L. discrimination on hyperspectral and LiDAR data. Remote Sens 13:107–129
Jiang, J, Steven, MD, He, R, Chen, Y, Du, P, Guo, H (2015) Identifying the spectral responses of several plant species under CO2 leakage and waterlogging stresses. Int J Greenhouse Gas Control 37:111
Jiang, H, Zhang, C, Qiao, Y, Zhang, Z, Zhang, W, Song, C (2020) CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput Electron Agric 174:105450
Jin, X, Bagavathiannan, M, Maity, A, Chen, Y, Yu, J (2022) Deep learning for detecting herbicide weed control spectrum in turfgrass. Plant Methods 18:94
Jones, EA, Austin, R, Dunne, JC, Leon, RG, Everman, WJ (2023) Discrimination between protoporphyrinogen oxidase–inhibiting herbicide-resistant and herbicide-susceptible redroot pigweed (Amaranthus retroflexus) with spectral reflectance. Weed Sci 71:198–205
Kamath, R, Balachandra, M, Prabhu, S (2020) Crop and weed discrimination using Laws’ texture masks. Int J Agric Biol Eng 13:191–197
Le, VNT, Apopei, B, Alameh, K (2019) Effective plant discrimination based on the combination of local binary pattern operators and multiclass support vector machine methods. Inf Process Agric 6:116–131
Li, Y, Al-Sarayreh, M, Irie, K, Hackell, D, Bourdot, G, Reis, MM, Ghamkhar, K (2021) Identification of weeds based on hyperspectral imaging and machine learning. Front Plant Sci 11:611622
Liang, WC, Yang, YJ, Chao, CM (2019) Low-cost weed identification system using drones. Pages 260–263 in Seventh International Symposium on Computing and Networking Workshops, 26–29 November 2019. Nagasaki, Japan: IEEE
Liu, B, Li, R, Li, H, You, G, Yan, S, Tong, Q (2019) Crop/weed discrimination using a field imaging spectrometer system. Sensors 19:5154
Longchamps, L, Panneton, B, Samson, G, Leroux, GD, Thériault, R (2010) Discrimination of corn, grasses and dicot weeds by their UV-induced fluorescence spectral signature. Precis Agric 11:181–197
Louargant, M, Jones, G, Faroux, R, Paoli, JN, Maillot, T, Gée, C, Villette, S (2018) Unsupervised classification algorithm for early weed detection in row-crops by combining spatial and spectral information. Remote Sens 10:761–779
Mahlein, AK, Rumpf, T, Welke, P, Dehne, HW, Plümer, L, Steiner, U, Oerke, EC (2013) Development of spectral indices for detecting and identifying plant diseases. Remote Sens Environ 128:21–30
Mahlein, AK, Steiner, U, Dehne, HW, Oerke, EC (2010) Spectral signatures of sugar beet leaves for the detection and differentiation of diseases. Precis Agric 11:413–431
Manevski, K, Manakos, I, Petropoulos, GP, Kalaitzidis, C (2011) Discrimination of common Mediterranean plant species using field spectroradiometry. Int J Appl Earth Obs Geoinf 13:922–933
Nagasubramanian, K, Jones, S, Singh, AK, Sarkar, S, Singh, A, Ganapathysubramanian, B (2019) Plant disease identification using explainable 3D deep learning on hyperspectral images. Plant Methods 15:110
Nguyen, C, Sagan, V, Maimaitiyiming, M, Maimaitijiang, M, Bhadra, S, Kwasniewski, MT (2021) Early detection of plant viral disease using hyperspectral imaging and deep learning. Sensors 21:742
Nidamanuri, RR (2020) Hyperspectral discrimination of tea plant varieties using machine learning, and spectral matching methods. Remote Sens Appl 19:100350
Noble, SD, Brown, RB (2009) Plant species discrimination using spectral/spatial descriptive statistics. Pages 82–92 in Proceedings of the 1st International Workshop on Computer Image Analysis in Agriculture, August 27–28. Potsdam, Germany
Osorio, K, Puerto, A, Pedraza, C, Jamaica, D, Rodríguez, L (2020) A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering 2:471–488
Özlüoymak, ÖB (2020) Determination of plant height for crop and weed discrimination by using stereo vision system. Tekirdağ Ziraat Fakültesi Dergisi 17:97–107
Pignatti, S, Casa, R, Harfouche, A, Huang, W, Palombo, A, Pascucci, S (2019) Maize crop and weeds species detection by using UAV VNIR hyperpectral data. Pages 7235–7238 in IEEE International Geoscience and Remote Sensing Symposium, 28 July 2019 – 02 August 2019. Yokohama, Japan: IEEE
Prospere, K, McLaren, K, Wilson, B (2014) Plant species discrimination in a tropical wetland using in situ hyperspectral data. Remote Sens 6:8494–8523
Putra, BTW (2020) New low-cost portable sensing system integrated with on-the-go fertilizer application system for plantation crops. Measurement 155:107562
Rasmussen, J, Azim, S, Nielsen, J (2021) Pre-harvest weed mapping of Cirsium arvense L. based on free satellite imagery—the importance of weed aggregation and image resolution. Eur J Agron 130:126373
Reiser, D, Vázquez-Arellano, M, Paraforos, DS, Garrido-Izard, M, Griepentrog, HW (2018) Iterative individual plant clustering in maize with assembled 2D LiDAR data. Comput Ind 99:42–52
Revanasiddappa, B, Arvind, CS, Swamy, S (2020) Real-time early detection of weed plants in pulse crop field using drone with IoT. Int J Agric Technol 16:1227–1242
Rock, G, Gerhards, M, Schlerf, M, Hecker, C, Udelhoven, T (2016) Plant species discrimination using emissive thermal infrared imaging spectroscopy. Int J Appl Earth Obs Geoinf 53:16–26
Sabzi, S, Abbaspour-Gilandeh, Y (2018) Using video processing to classify potato plant and three types of weed using hybrid of artificial neural network and particle swarm algorithm. Measurement 126:22–36
Sandoval, JD, Gor, SR, Ramallo, J, Sfer, A, Colombo, E, Vilaseca, M, Pujol, J (2012) Spectral signatures: a way to identify species and conditions of vegetables. Pages 308–321 in Caivano, JL, Buera, MD, eds. Color in Food. Boca Raton, FL: CRC Press
Sayed Yones, M, Amin Aboelghar, M, Ali Khdery, G, Massoud Ali, A, Hussien Salem, N, Farag, E, Ahmed Mahmoud Mamon, S (2019) Spectral measurements for monitoring of sugar beet infestation and its relation with production. Asian J Agric Biol 7:386–395
Shafri, HZ, Anuar, MI, Seman, IA, Noor, NM (2011) Spectral discrimination of healthy and Ganoderma-infected oil palms from hyperspectral data. Int J Remote Sens 32:7111–7129
Shahbazi, N, Ashworth, MB, Callow, JN, Mian, A, Beckie, HJ, Speidel, S, Nicholls, E, Flower, KC (2021) Assessing the capability and potential of LiDAR for weed detection. Sensors 21:2328
Shanmugam, S, Assunção, E, Mesquita, R, Veiros, A, Gaspar, PD (2020) Automated weed detection systems: a review. KnE Eng 5:271–284
Shendryk, Y, Rossiter-Rachor, NA, Setterfield, SA, Levick, SR (2020) Leveraging high-resolution satellite imagery and gradient boosting for invasive weed mapping. IEEE J Sel Top Appl Earth Obs Remote Sens 13:4443–4450
Shirzadifar, A, Bajwa, S, Mireei, SA, Howatt, K, Nowatzki, J (2018) Weed species discrimination based on SIMCA analysis of plant canopy spectral data. Biosyst Eng 171:143–154
Silalahi, DD, Reaño, CE, Lansigan, FP, Panopio, RG, Bantayan, NC (2016) Using genetic algorithm neural network on near infrared spectral data for ripeness grading of oil palm (Elaeis guineensis Jacq.) fresh fruit. Inf Process Agric 3:252–261
Slaven, MJ, Koch, M, Borger, CP (2023) Exploring the potential of electric weed control: a review. Weed Sci 71:403–421
Sodjinou, SG, Mohammadi, V, Mahama, ATS, Gouton, P (2021) A deep semantic segmentation-based algorithm to segment crops and weeds in agronomic color images. Inf Process Agric 9:355–364
Strothmann, W, Ruckelshausen, A, Hertzberg, J, Scholz, C, Langsenkamp, F (2017) Plant classification with in-field-labeling for crop/weed discrimination using spectral features and 3d surface features from a multi-wavelength laser line profile system. Comput Electron Agric 134:79–93
Su, J, Yi, D, Coombes, M, Liu, C, Zhai, X, McDonald-Maier, K, Chen, WH (2022) Spectral analysis and mapping of blackgrass weed by leveraging machine learning and UAV multispectral imagery. Comput Electron Agric 192:106621
Su, WH (2020) Advanced machine learning in point spectroscopy, RGB-and hyperspectral-imaging for automatic discriminations of crops and weeds: a review. Smart Cities 3:767–792
Su, WH, Fennimore, SA, Slaughter, DC (2019) Fluorescence imaging for rapid monitoring of translocation behaviour of systemic markers in snap beans for automated crop/weed discrimination. Biosyst Eng 186:156–167
Subeesh, A, Bhole, S, Singh, K, Chandel, NS, Rajwade, YA, Rao, KVR, Kumar, SP, Jat, D (2022) Deep convolutional neural network models for weed detection in polyhouse grown bell peppers. Artif Intell Agric 6:47–54
Sulaiman, N, Che’Ya, NN, Mohd-Roslim, MH, Juraimi, AS, Mohd-Noor, N, Fazlil-Ilahi, WF (2022) The application of Hyperspectral Remote Sensing Imagery (HRSI) for weed detection analysis in rice fields: a review. Appl Sci 12:2570
Sunil, GC, Koparan, C, Ahmed, MR, Zhang, Y, Howatt, K, Sun, X (2022) A study on deep learning algorithm performance on weed and crop species identification under different image background. Artif Intell Agric 6:242–256
Symonds, P, Paap, A, Alameh, K, Rowe, J, Miller, C (2015) A real-time plant discrimination system utilising discrete reflectance spectroscopy. Comput Electron Agric 117:57–69
Talaviya, T, Shah, D, Patel, N, Yagnik, H, Shah, M (2020) Implementation of artificial intelligence in agriculture for optimisation of irrigation and application of pesticides and herbicides. Artif Intell Agric 4:58–73
Tao, T, Wei, X (2022) A hybrid CNN–SVM classifier for weed recognition in winter rape field. Plant Methods 18:29
Türker-Kaya, S, Huck, CW (2017) A review of mid-infrared and near-infrared imaging: principles, concepts and applications in plant tissue analysis. Molecules 22:168
Ullah, S, Schlerf, M, Skidmore, AK, Hecker, C (2012) Identifying plant species using mid-wave infrared (2.5–6 μm) and thermal infrared (8–14 μm) emissivity spectra. Remote Sens Environ 118:95–102
Vaiphasa, C, Skidmore, AK, de Boer, WF, Vaiphasa, T (2007) A hyperspectral band selector for plant species discrimination. ISPRS J Photogramm Remote Sens 62:225–235
Wang, J, Chen, G, Ju, J, Lin, T, Wang, R, Wang, Z (2023) Characterization and classification of urban weed species in northeast China using terrestrial hyperspectral images. Weed Sci 71:353–368
Xi, QI, Li, YZ, Su, GY, Tian, HK, Zhang, S, Sun, ZY, Long, YA, Wan, FH, Qian, WQ (2020) MmNet: identifying Mikania micrantha Kunth in the wild via a deep Convolutional Neural Network. J Integr Agric 19:1292–1300
Yang, W, Yang, C, Hao, Z, Xie, C, Li, M (2019) Diagnosis of plant cold damage based on hyperspectral imaging and convolutional neural network. IEEE Access 7:118239–118248
Yu, J, Schumann, AW, Sharpe, SM, Li, X, Boyd, NS (2020) Detection of grassy weeds in bermudagrass with deep convolutional neural networks. Weed Sci 68:545–552
Zarco-Tejada, PJ, Camino, C, Beck, PSA, Calderon, R, Hornero, A, Hernández-Clemente, R, Kattenborn, T, Montes-Borrego, M, Susca, L, Morelli, M, Gonzalez-Dugo, V (2018) Previsual symptoms of Xylella fastidiosa infection revealed in spectral plant-trait alterations. Nat Plants 4:432–439