Abstract
Precision agriculture, or precision farming, is a scientific management strategy based on the spatial and temporal variability of soil, crops, and the environment, and crop sensing is an effective technology for understanding that variability. In the past decades, a number of crop sensors and instruments based on spectroscopy have been developed and applied to meet the requirements of PA and solve detection problems in the field. These instruments can be deployed in multiple ways, such as handheld detection, vehicle-mounted diagnosis, and remote sensing by UAVs or satellites. Typical sensors and specific applications are summarized to explain the fundamentals and potential of crop sensing. These spectral sensors include hyper-spectrometers, multiband sensors for vegetation indices, and imaging instruments using visible or extended spectral bands. Frontier research areas in sensor development are also introduced, involving wireless sensor networks, integrated sensors for data fusion, and different methods of spectral image collection. In addition, applications of different sensors are reviewed, including the recognition of crops and weeds, estimation of nutrient and growth status, and identification of diseases and pests. A variable-rate fertilizer system controlled by crop sensors is also demonstrated to show how crop sensing technology can support precision management in the field. Crop sensing sensors and instruments will promote reliable predictions and operations in agriculture.
8.1 Introduction
Precision agriculture (PA) or precision farming contributes to improving agronomic performance, saving resources, and protecting the environment. It is established as a management strategy that employs detailed, site-specific information to precisely manage production inputs based on variability, replacing uniform average inputs across the field. Besides 3S technology, including the Global Navigation Satellite System (GNSS), Geographic Information Systems (GIS), and remote sensing (RS), many other technologies such as proximal imaging, spectroscopy, and wireless sensor networks (WSNs) are applied in PA (Jawad et al. 2017). They provide more efficient ways to manage crops, including real-time detection of crop growth, targeted analysis and decision-making for precision operations, and reduction of manual labor through optimized tools.
In general, there are three main steps in precision crop management: soil and crop sensing, decision-making, and variable-rate application. One of the critical issues in precision agriculture is how to measure crop growth data noninvasively and efficiently. In the past decades, significant progress on optical instruments has been made in crop monitoring (Pallottino et al. 2019). A number of crop sensors and instruments have been developed and used to meet the requirements of PA and solve detection problems in the field, as shown in Fig. 8.1. They include RGB (red, green, blue) cameras, multispectral image sensors, hyper-spectrometers, unmanned aerial vehicles (UAVs), satellite remote sensors, thermographic imagers, and light detection and ranging (LiDAR). Most of them are developed based on the combination of spectroscopy, optical principles, and photodetectors to measure reflected electromagnetic energy. Typically, data are represented as energy intensity in line plots, two-dimensional (2D) and three-dimensional (3D) images, or even data cubes.
In most cases, crop information is captured by instruments through line scanning or digital photography. The data are then analyzed using specialized software, such as spectroscopy analysis and digital image processing tools. Spectroscopy uses the interaction of electromagnetic waves with an object to perform analyses related to crop nutrients or biomass. Digital image processing is a set of computational techniques for analyzing, enhancing, compressing, and reconstructing crop images. Both methods are widely used in crop recognition and parameter estimation.
Researchers have developed new instruments and extended their applications to many scenarios, including handheld detection, vehicle-mounted diagnosis, and remote sensing by UAVs or satellites, to build reliable prediction models of complex and uncertain phenomena in agriculture. To explain the fundamentals and potential of crop sensing instruments in precision agriculture, sensors and specific models from spectral and image sensing technologies are examined. Applications of crop sensing involve the recognition of crops and weeds, estimation of nutrient and growth status, identification of diseases and pests, and detection of specialty crop fruits.
8.2 Spectroscopy-Based Sensing Instruments for Crop Monitoring
8.2.1 Foundation of Spectral Sensing and Vegetation Indices in Crop Sensing
According to the range of electromagnetic radiation at specific nanometer (nm) wavelengths, crop sensing generally covers the ultraviolet (UV, 200–400 nm), visible (Vis, 400–760 nm), and near-infrared and shortwave infrared (NIR and SWIR, 760–2500 nm) regions (Toth and Joźkow 2016). A green plant typically displays low reflectance in the visible region, especially in the red band close to 650 nm, due to strong absorbance by photosynthetic and accessory plant pigments. By contrast, reflectance is usually high from the red edge (680–760 nm) into the NIR region because there is very little absorbance by subcellular particles or pigments and considerable scattering at mesophyll cell wall interfaces.
Since changes in leaf pigments and biochemical components caused by nutrient stress or biotic damage can influence the spectral characteristics of leaves, spectral analysis can be used to monitor growing crops (Narvaez et al. 2017). Zhang et al. (2019a) reviewed how the spectral features used in plant monitoring are affected by disease and pests. As shown in Table 8.1, spectral features of infected or damaged plants include VIS-NIR spectral reflectance, fluorescence, and thermal features. Among these, band reflectance is the simplest form and can be transformed in different ways, such as spectral derivatives, continuum removal transformation, and continuous wavelet transformation.
Spectral characteristics of vegetation can be analyzed based on sensitive reflectance bands and vegetation indices (VIs). Sensitive wavelengths related to crop parameters are selected from the hyperspectra to evaluate vegetation vigor. A VI is a spectral transformation of two or more wavebands designed to enhance the contribution of vegetation properties; it allows reliable spatial and temporal intercomparisons of terrestrial photosynthetic activity and canopy structural variations. Some VIs measured at specific bands are listed in Table 8.2, in which Ri is the reflectance at i nm or in band i, such as green, red, red edge, or NIR. Among these VIs, the Normalized Difference Vegetation Index (NDVI) is the most commonly used in crop monitoring. A large body of literature indicates that quantitative spectral parameters and vegetation indices are common tools in vegetation recognition, crop classification, and biomass estimation. Therefore, sensors have been developed to measure VIs based on spectroscopy, given their great potential for field applications.
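As a concrete illustration, common indices of the kind listed in Table 8.2 can be computed directly from band reflectance. The functions below follow the standard definitions of NDVI, the Ratio Vegetation Index (RVI), and the Normalized Difference Red Edge index (NDRE); the sample reflectance values are illustrative, not measured data.

```python
# Common vegetation indices computed from band reflectance (Ri is the
# reflectance at band i). Formulas follow the standard definitions.

def ndvi(r_nir, r_red):
    """Normalized Difference Vegetation Index."""
    return (r_nir - r_red) / (r_nir + r_red)

def rvi(r_nir, r_red):
    """Ratio Vegetation Index."""
    return r_nir / r_red

def ndre(r_nir, r_red_edge):
    """Normalized Difference Red Edge index."""
    return (r_nir - r_red_edge) / (r_nir + r_red_edge)

# Illustrative reflectance of a healthy canopy: low in red, high in NIR.
r_red, r_red_edge, r_nir = 0.05, 0.30, 0.50
print(round(ndvi(r_nir, r_red), 3))   # high NDVI indicates dense green vegetation
print(round(rvi(r_nir, r_red), 1))
print(round(ndre(r_nir, r_red_edge), 3))
```

Because NDVI normalizes the red–NIR contrast by total brightness, it is less sensitive to overall illumination changes than the raw band values, which is one reason it dominates field applications.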
8.2.2 Spectral Sensing in Crop Monitoring
Spectral instruments with optical sensors are fundamental tools for assessing vegetation status. Three kinds of spectral instruments have been used: continuous spectrometers, vegetation index sensors, and imaging spectrometers. In general, spectrometers measure continuous spectral reflectance over a specific portion of the electromagnetic spectrum, whereas vegetation index sensors measure reflectance in two or several discrete bands. Many commercially available portable sensors exist for crop monitoring. They are generally classified as passive or active according to the lighting strategy used during measurement. For example, the Soil Plant Analysis Development (SPAD) meter is a compact device with active lighting that measures transmittance at 650 and 940 nm to determine the amount of chlorophyll in plant leaves. The GreenSeeker (Trimble Agriculture, Sunnyvale, CA, USA) measures NDVI using an active lighting module. Some applications of hyperspectral sensing and vegetation index sensing in crop monitoring are listed in Table 8.3, including detection of chlorophyll, nitrogen, and sugar content, estimation of growth stages and yields, and even weed identification.
8.2.2.1 Hyper-Spectrometers for Crop Sensing
Hyper-spectrometers are the most frequently used instruments, especially for theoretical or mechanism analysis in crop sensing. Most instruments used in chemical detection are designed based on UV-Vis spectroscopy, which works on the principle of the Beer-Lambert law. Zhang et al. (2015) used a UV-2450 spectrophotometer to measure the visible and NIR spectral reflectance of apple leaf samples within the 300–900 nm band; such spectral information of apple tree leaves in different phenological phases could be used to predict fruit sugar content. According to the two-dimensional correlation spectroscopic analysis of apple leaf reflectance with fruit sugar content as the perturbation, the autocorrelation peaks all appeared in the 530–570 nm and 700–720 nm wavebands of the synchronization spectrogram. The contribution of each growth period to fruit sugar content was investigated, and a support vector machine (SVM) model was established. The determination coefficient of calibration (Rc²) of the SVM model reached 0.89, and the determination coefficient of validation (Rv²) reached 0.88.
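The Beer-Lambert law mentioned above relates absorbance linearly to concentration, A = ε·l·c, with transmittance T = 10^(−A). A minimal numeric sketch, with illustrative (not measured) values for the molar absorptivity, path length, and concentration:

```python
# Beer-Lambert law: absorbance A = epsilon * l * c, where epsilon is the
# molar absorptivity (L mol^-1 cm^-1), l the optical path length (cm), and
# c the concentration (mol/L). Transmittance T = 10**(-A).

def absorbance(epsilon, path_cm, conc_mol_l):
    return epsilon * path_cm * conc_mol_l

def transmittance(a):
    return 10 ** (-a)

# Illustrative values only: epsilon and c chosen so that A = 0.6.
a = absorbance(epsilon=15000.0, path_cm=1.0, conc_mol_l=4e-5)
print(round(a, 2))                  # absorbance
print(round(transmittance(a), 3))   # fraction of light transmitted
```

The linearity of A in c is what lets calibration models map measured spectra back to pigment or nutrient concentrations.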
Compared with laboratory instruments, portable sensors are more flexible in the field. Devices can be selected by spectral range, resolution, usage requirements, and so on. For example, the ASD FieldSpec HandHeld is a 512-element photodiode-array spectroradiometer with a 325–1075 nm wavelength range. It uses a fixed concave holographic reflective grating that disperses the light onto a fixed photodiode array of 512 individual detection points, or "elements," in a line. Associated with each element is a distinct signal whose magnitude is determined by the total integrated light energy falling on that element. Each element is assigned to a position among the 512 points, and in this way the analog signal is converted into a digital signal. The instrument can be set to view traceable wavelength references such as emission sources, reflectance standards, and the output of a triple monochromator. The output results are data points with known element-position and wavelength-channel coordinates.
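The element-to-wavelength assignment described above can be sketched as a mapping from array index to nominal center wavelength. Real instruments derive a polynomial calibration from known emission lines; the linear mapping below is a simplification for illustration, using the FieldSpec HandHeld's 512 elements and 325–1075 nm range.

```python
# Sketch of assigning wavelengths to the 512 elements of a photodiode-array
# spectroradiometer. A linear index-to-wavelength mapping is assumed here;
# real calibrations are polynomial fits to reference emission lines.

N_ELEMENTS = 512
LAMBDA_MIN, LAMBDA_MAX = 325.0, 1075.0   # nm

def element_to_wavelength(i):
    """Map element index 0..511 to its nominal center wavelength in nm."""
    step = (LAMBDA_MAX - LAMBDA_MIN) / (N_ELEMENTS - 1)
    return LAMBDA_MIN + i * step

print(element_to_wavelength(0))              # 325.0
print(round(element_to_wavelength(511), 1))  # 1075.0
```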
Many current studies on crop monitoring involve portable spectrometers. The operation flow generally involves control parameter setting, storage directory setting, dark noise measurement, reference calibration, sample detection, spectrum calculation, and display. The measured data are used to analyze and establish a specific model for crop monitoring purposes. Liu et al. (2018a, b) measured the spectral reflectance of maize canopy using an ASD FieldSpec HandHeld to estimate chlorophyll content. The data were processed with wavelet denoising and multiplicative scatter correction (MSC) to reduce noise. Three spectral ranges were then extracted by interval partial least squares (IPLS): 525–549 nm, 675–749 nm, and 850–874 nm. The chlorophyll content estimation model was developed using support vector regression (SVR). The calibration Rc² of the model was 0.831 with an RMSEC of 1.3852 mg/L, and the validation Rv² was 0.809 with an RMSEP of 0.8664 mg/L. Using the same spectrometer, Sun et al. (2019a) explored optimized spectral features to identify the growth stages of potato plants. In general, the canopy spectral reflectance varied with the growth stages in the bands of 400–500 nm, 530–640 nm, 740–880 nm, and 910–960 nm. The classification accuracies of the SVM models were 100% in the training set and 94.59% in the testing set.
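One of the preprocessing steps in such pipelines, multiplicative scatter correction (MSC), regresses each spectrum against the mean spectrum and corrects it as (x − a)/b, removing additive and multiplicative scatter effects. A minimal sketch on synthetic spectra (the data below are invented for illustration, not from the cited study):

```python
import numpy as np

# Multiplicative scatter correction (MSC): regress each spectrum against the
# mean spectrum (x ≈ a + b * ref) and correct it as (x - a) / b.

def msc(spectra):
    """spectra: (n_samples, n_bands) array. Returns MSC-corrected spectra."""
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra, dtype=float)
    for i, x in enumerate(spectra):
        b, a = np.polyfit(ref, x, deg=1)   # slope b, intercept a
        corrected[i] = (x - a) / b
    return corrected

# Two synthetic spectra: the same underlying shape with different
# multiplicative and additive scatter distortions.
base = np.linspace(0.1, 0.6, 50)
spectra = np.array([1.2 * base + 0.05, 0.8 * base - 0.02])
out = msc(spectra)
print(np.allclose(out[0], out[1], atol=1e-8))   # scatter effects removed
```

After MSC, both distorted spectra collapse onto the common reference shape, which is why the correction is applied before band selection and regression modeling.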
8.2.2.2 Portable Sensors Used in Crop Monitoring
According to the specific features and VIs used in crop monitoring, some sensors are designed as portable devices with only a few sensitive bands to reduce spectral redundancy. These sensors are generally developed based on the red and NIR bands. Besides red and NIR, a red-edge band in the range of 700–760 nm is often included to increase the number of variables in detection models. An instrument can be designed to measure the transmitted or reflected light from leaves and the crop canopy. The light source can be natural light (sunlight) or an artificial source (lamp), defining passive or active lighting, respectively. Instruments with active lighting are more robust in field applications, improving performance under weather or time-window limitations. Portable (handheld) instruments can be used to evaluate chlorophyll or nitrogen content, LAI, and yield using the calculated VIs. Farmers can use them in precision agriculture according to the application case, leaf or canopy measurement.
8.2.2.2.1 Portable Sensors for Leaf Measurement
Portable or handheld instruments for leaf measurement have the advantages of compact size and light weight. Most are designed to measure transmittance with an active light source. One of the most widely used leaf chlorophyll meters is the Soil Plant Analysis Development (SPAD) chlorophyll meter, such as the SPAD-502 Plus (Konica Minolta Inc., Japan).
Uddling et al. (2007) reported that readings from the SPAD-502 Plus could provide not only measurements of chlorophyll content but also information for estimating nitrogen status and photosynthetic pigment content. Schepers et al. (1992) compared corn leaf disk N concentrations and SPAD-502 chlorophyll meter readings from N-rate studies at the silking stage for a variety of hybrids. The data indicated that chlorophyll meter readings correlated well with leaf N concentrations for a given hybrid and location. Netto et al. (2005) established correlations between the photosynthetic pigment content extracted in dimethylsulfoxide, the total nitrogen content, and chlorophyll fluorescence variables with SPAD-502 readings in Coffea canephora Pierre leaves. SPAD-502 readings lower than 40 indicated impairment of the photosynthetic process. In that study, total N concentration increased linearly with SPAD-502 readings. Meanwhile, the relationship between SPAD-502 values and the chlorophyll fluorescence variables (F0, Fm, and Fv/Fm) showed that the maximum quantum efficiency of photosystem II, indicated by the Fv/Fm ratio, started to fall at readings around 40.
Measured SPAD values have also been used to guide fertilization. Zhao et al. (2007) studied the relationship between SPAD chlorophyll meter readings and leaf nitrogen content in order to determine the amount of nitrogen fertilization. Field experiments were conducted over three wheat growing seasons from 2003 to 2006. Grain yields and soil NO3-N contents were measured in all plots. The results indicated that fertilizer application guided by the meter values reduced the spatial variability of wheat yield and had the benefits of low residual soil NO3-N content and low NO3-N leaching potential.
Gholizadeh et al. (2017) focused on the relationship between SPAD chlorophyll meter readings and leaf N content during different growth stages, identifying the most suitable stage for assessing crop N and predicting rice yield. Results implied that SPAD readings at the panicle formation stage related better to rice leaf N content (R² = 0.93) and yield (R² = 0.81). Therefore, SPAD-based evaluation of N status and prediction of rice yield is more reliable at this stage than at the booting stage.
Although SPAD readings have been widely used to measure chlorophyll content, Xiong et al. (2015) indicated that the relationship between chlorophyll content and leaf N content per leaf area, as well as the relationship between SPAD readings and leaf N content per leaf area, varied widely among species groups. A significant impact of light-dependent chloroplast movement on SPAD readings was observed under low leaf N supplementation in both rice and soybean, but not under high N supplementation. Furthermore, the allocation of leaf N to chlorophyll was strongly influenced by short-term changes in growth light. This demonstrates that the relationship between SPAD readings and leaf N content per leaf area is profoundly affected by environmental factors and leaf features of crop species, which should be accounted for when using a chlorophyll meter to guide N management in agricultural systems.
8.2.2.2.2 Portable Sensors for Canopy Measurement
Instruments for canopy monitoring are generally designed to measure the reflected light related to typical VIs. Portable instruments, such as GreenSeeker, Crop Circle, and N-Sensor, are commonly used to get the NDVI in the field. For on-the-go applications, these sensors can also be mounted to vehicles to remotely sense plants while driving through a field.
According to the concept of the active crop canopy monitoring, an instrument emits a brief burst of red and infrared light and then measures the amount of each type of light that is reflected back from the plant. GreenSeeker sensors (Trimble Navigation Limited, Sunnyvale, CA, USA) are designed based on modulated red (650–670 nm) and NIR (755–785 nm) LEDs (light-emitting diode). Crop Circle devices (Holland Scientific Inc., Lincoln, Nebraska, USA) are equipped with multispectral active sensors. The Crop Circle ACS-430 incorporates three optical measure channels, so that the sensor simultaneously measures crop/soil reflectance at 670, 730, and 780 nm. Moreover, the Crop Circle ACS-470 has six bands (450, 550, 650, 670, 730, 760 nm) and three of these bands can be used at one time to measure the radiative transfer and biophysical characteristics of plant canopies. Yara N-Sensor (Yara International ASA, Germany) is different from the active optical sensors mentioned above. It has a xenon flashlamp, which provides high-intensity multispectral light, so that it can measure and record the crop light reflectance in a waveband between 450 and 900 nm (Munoz-Huerta et al. 2013).
Several studies were conducted to detect crops based on portable sensors mentioned above. Cao et al. (2012) found that GreenSeeker-NDVI was exponentially related to N uptake in winter wheat, whereas the correlation between N uptake and RVI was linear. Zhang et al. (2019b) intended to expand the applicability of GreenSeeker in monitoring the growth status and predicting the grain yield of winter wheat (Triticum aestivum L.). Four field experiments with multiple wheat cultivars and N treatments were conducted during 2013–2015 to obtain NDVI and RVI synchronized with four agronomic parameters: LAI, leaf dry matter (LDM), leaf nitrogen concentration (LNC), and leaf nitrogen accumulation (LNA). Duration models indicated that NDVI and RVI explained 80%, 68–70%, 10–12%, and 67–73% of the variability in LAI, LDM, LNC, and LNA, respectively. Considering the variation among different wheat cultivars, the newly normalized VIs rNDVI (NDVI vs. the NDVI for the highest N rate) and rRVI (RVI vs. the RVI for the highest N rate) were calculated to predict the relative grain yield (RY, the yield vs. the yield for the highest N rate). rNDVI and rRVI explained 77–85% of the variability in RY.
In order to determine which VIs calculated from the Crop Circle sensor can perform the best estimation of rice N status, Cao et al. (2013) compared six VIs based on the green (550 ± 20 nm), red-edge (730 ± 10 nm), and NIR (>760 nm) bands. The results indicated that using the Normalized Difference Red Edge (NDRE) to predict plant N uptake had the highest coefficient of determination (R2, 0.76) and the lowest root mean square error (RMSE, 17.00 kg N/ha). The second best-performing vegetation index was the Red-Edge Chlorophyll Index (CIRE), which performed similarly to NDRE. Crop Circle ACS-210 and ACS-430 (red at 630 nm, red edge at 730 nm, and NIR at 780 nm) were compared and different NDVI values were analyzed in each individual waveband (Taskos et al. 2015). The results demonstrated that ACS-430 and red-edge-based indices were more strongly correlated with leaf chlorophyll of vineyards.
Regarding the Yara N-Sensor (Yara International ASA, Germany), Singh et al. (2015) investigated the tractor-mounted N-Sensor to predict nitrogen (N) content for wheat crop under different nitrogen levels. It was observed that there was a strong correlation among sensor attributes (sensor value and sensor NDVI) and different N-levels. The Yara N-Sensor/FieldScan (Yara International ASA, Germany) was used to assess the status of N in spring wheat and corn (Zea mays L.) at specific growth stages (Tremblay et al. 2009). It was found that the Yara N-Sensor/FieldScan should be used before growth stage V5 in corn during the season if NDVI was used to derive crop N requirements. Yara N-Sensor/FieldScan can also record spectral information from wavebands other than red and NIR, and more VIs can be derived that might relate better to the nitrogen status than NDVI.
Besides the instruments introduced above, there are similar systems such as the OptRx crop sensor (Holland Scientific, USA), the CropSpec sensor (Topcon Positioning Systems, USA), and the CCM-200 and CCM-300 (Edaphic Scientific, Australia). They are also widely used in nitrogen and chlorophyll measurements (Serrano et al. 2016; Sharabian et al. 2013). Published reports indicate that each sensor has its own sensitivity characteristics, and wavelengths around 550, 650, 766, and 850 nm are most commonly selected depending on the application (Tremblay et al. 2009; Cao et al. 2017; Taskos et al. 2015). Meanwhile, algorithms must be developed to establish estimation models so that modeling results can guide field-management operations.
8.2.3 Development of Spectroscopy-Based Systems for Crop Detection
The current trend in crop sensing is to integrate compact sensors with detection models. To this end, several studies have been conducted to develop new systems that support field management.
8.2.3.1 Development of Hyperspectral Sensors for Crop Monitoring
To predict the nutrient content of winter wheat nondestructively in the field, an integrated system was developed based on the STS-VIS sensor (Cheng et al. 2017). The STS-VIS (Ocean Optics Inc., USA) is a compact, grating-based device for portable applications, with an advanced CMOS (complementary metal-oxide-semiconductor) 1024-element detector array that measures wavelengths in the 350–850 nm range. Its USB output makes secondary development possible for online detection, typically through software integration of established models. As shown in Fig. 8.2, the hardware of the integrated spectrometer consists of three parts: the optical system, the data storage module, and the controller. The optical sensor with a fiber measures the reflected light from the leaf or canopy of the field crop. The controller can be connected to the sensor through USB 2.0 or a wireless network. Supporting software installed on a PC or mobile controller handles signal communication and processing. The configurable parameters include integration time, sampling frequency, and averaging number, which depend on the ambient light intensity and the sampling requirements.
A software program was also developed to collect the spectral reflectance of winter wheat canopy in the 350–820 nm range. A calibration experiment tested the performance of the sensor against a calibration board with four different gray levels; the average correlation coefficient between the sensor and an ASD FieldSpec HandHeld 2 was 0.94. Eight wavelengths (514, 527, 562, 572, 605, 705, 719, and 795 nm) were selected with the random frog (RF) algorithm after spectral curve smoothing to detect chlorophyll content. The determination coefficient of the partial least squares (PLS) regression model was 0.69.
8.2.3.2 WSN-Based Sensors for Crop Monitoring
With the development of wireless sensor networks (WSNs), a system containing one control unit and several optical sensor nodes for crop growth detection was developed by China Agricultural University (Zhong et al. 2014). The sensor nodes, organized in a ZigBee WSN, were designed to collect, amplify, and transmit the optical signals. A CS350 PDA (personal digital assistant; Cilico Microelectronics Corp., Ltd., Xi'an, China) was selected as the coordinator of the whole wireless network to receive, display, and store all the data sent from the sensor nodes. Since wireless communication was used, the PDA could be easily installed in the tractor cab or hand-held by the operator.
Each sensor node was designed with four optical channels at 550, 650, 766, and 850 nm, respectively. Since the detection system used sunlight as the light source, the sunlight intensity was also measured as a reference in addition to the light reflected from the crop canopy, as shown in Fig. 8.3. A full-function sensor node therefore contained eight optical channels: the upward four for sunlight measurement and the downward four for reflected-light measurement. A silicon photodiode converted the light signal to a current signal in each optical channel. A 4:1 time-sharing analog multiplexer chip allowed the channels to share the amplification unit, and an OPA333 amplifier was chosen for its high precision, low quiescent current, and low power consumption. The weak signals were amplified, transformed into voltage signals, and read through the A/D converters of the microcontroller unit (MCU), a JN5139 wireless module (Jennic Co., UK). The measured data were wirelessly transmitted to the coordinator via an antenna.
Once started, the sensor initialized itself and collected data automatically at a set sampling frequency. By setting the address of the analog switch, each channel was sampled ten times and the readings averaged. Sensors had different identification numbers, and the sampling frequency was adjustable according to different requirements.
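The node's sampling loop can be sketched as follows: select each optical channel through the 4:1 analog multiplexer, read it ten times, and average. The `read_adc` function here is a hypothetical stand-in for the MCU's A/D conversion, returning fixed values so the sketch runs without hardware.

```python
import statistics

def read_adc(channel):
    # Hypothetical stand-in for the hardware A/D read; real firmware would
    # set the analog-switch address and read the converter register.
    return {0: 512, 1: 530, 2: 498, 3: 505}[channel]

def sample_node(n_repeats=10):
    """Sample all four multiplexed channels, averaging n_repeats readings each."""
    readings = {}
    for channel in range(4):            # analog-switch addresses 0..3
        samples = [read_adc(channel) for _ in range(n_repeats)]
        readings[channel] = statistics.mean(samples)
    return readings

print(sample_node())   # averaged value per channel
```

Averaging repeated reads is a cheap way to suppress photodiode and amplifier noise before the values are transmitted over the ZigBee link.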
In the field experiments, the optical sensors measured the spectral reflectance of the crop canopy in the four channels at 550, 650, 766, and 850 nm. The transmission quality of the sensor nodes was evaluated at distances of 20, 40, 60, 80, and 100 m, and the signals were transmitted without packet loss in all tests. Calibration experiments showed that the accuracy of the optical components was sufficient for the application. The results of the stationary field experiments showed that the detection system was capable of monitoring the spectral characteristics of the crop canopy. The correlation between chlorophyll content and NDVI was at an acceptable level, with R² of 0.681–0.718. The system provides support for crop growth detection and a theoretical basis for further research on chlorophyll content prediction in the field.
8.2.3.3 An Integrated Sensor Based on Spectroscopy and Imagery
Furthermore, to monitor crop information more efficiently, a multi-sensor fusion device was developed based on the combination of spectroscopy and imaging technology, as shown in Fig. 8.4 (Long 2020). The device was designed to collect the spectral reflectance and images of the crop canopy. It consists of three parts: the sensors, a data processing unit, and a data transmission port. The spectral reflectance collected by an AS7263 sensor (ams AG, Premstaetten, Austria) involved six bands in the red and NIR ranges (610, 680, 730, 760, 810, and 860 nm), each with a 20 nm full-width at half-maximum. An RGB image was captured to estimate canopy coverage and help determine the location during field measurement. The data could be sent to a mobile phone remotely through a Wi-Fi module.
In application experiments, the fused spectral reflectance and image data from the sensor were used to analyze the growth status of field corn under different fertilizer levels. Adaptive boosting algorithms were used to model chlorophyll content. The determination coefficient of the model was 0.859, higher than that based on spectral data alone (0.829). The fusion of spectral reflectance and image data thus improved the prediction accuracy of crop chlorophyll content, providing a new tool for crop monitoring in the field.
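The fusion scheme can be sketched as concatenating the six-band reflectance vector with an image-derived canopy-coverage feature and fitting an AdaBoost regressor. All data below are synthetic, and the exact feature layout is an assumption; the sketch only illustrates the modeling pattern, not the cited system's implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor

# Sketch: fuse spectral reflectance with an image-derived coverage feature
# and model chlorophyll with adaptive boosting. All data are synthetic.

rng = np.random.default_rng(1)
reflectance = rng.uniform(0.05, 0.6, size=(40, 6))   # six AS7263-style bands
coverage = rng.uniform(0.2, 0.9, size=(40, 1))       # canopy coverage from RGB
X = np.hstack([reflectance, coverage])               # fused feature vector
y = 30 + 20 * coverage[:, 0] + reflectance @ rng.normal(size=6)

model = AdaBoostRegressor(n_estimators=50, random_state=0)
model.fit(X, y)
print(model.score(X, y) > 0.5)   # training-set fit; field accuracy will differ
```

The coverage feature carries structural information the six reflectance bands lack, which is the intuition behind the reported accuracy gain from fusion.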
8.3 Image Sensing for Crop Detection
8.3.1 Foundation of Crop Imaging and Feature Extraction
Optical imaging is one of the noninvasive methods for crop sensing. Similar to spectrometers, optical imaging uses the special properties of light and electromagnetic waves to obtain detailed images of leaves and plants, as well as canopy and even ecosystems. In a typical way, the data are represented with energy intensity in a line plot or 2D images. Recently, image sensing has resulted in many developments in agricultural information acquisition. A variety of imaging instruments are available such as monochrome and color digital cameras (RGB), depth and time-of-flight (ToF) cameras, multispectral and hyperspectral cameras, thermography, fluorescence sensors, and others (Yang et al. 2017; Li et al. 2014a, b).
New data sources and processing methods for 2D and 3D images and spectral data cubes have significantly boosted research on crop recognition, plant positioning, and phenotype measurement. Many features can be extracted from images, as shown in Table 8.4, including color features, texture representation, shape and spatial descriptors, Vis-NIR spectral features, fluorescence, and thermal parameters (Mavridou et al. 2019; Ali et al. 2017).
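As an example of the color features in Table 8.4, the excess-green index ExG = 2G − R − B is a widely used transform for separating green vegetation from soil in RGB images (it is offered here as a representative color feature, not one named in the table). The tiny "image" below is synthetic.

```python
import numpy as np

# Excess-green index ExG = 2G - R - B, a common color feature for
# segmenting green vegetation from soil background in RGB imagery.

def excess_green(rgb):
    """rgb: float array (..., 3) with channels scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2 * g - r - b

# Synthetic 2x2 image: two green (vegetation) and two dull (soil) pixels.
img = np.array([[[0.2, 0.6, 0.1], [0.4, 0.4, 0.3]],
                [[0.5, 0.3, 0.2], [0.1, 0.7, 0.1]]])
exg = excess_green(img)
mask = exg > 0.2          # threshold chosen for illustration only
print(mask.astype(int))   # 1 where the pixel is classified as vegetation
```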
In the last two decades, extensive research has been reported on image feature extraction and object analysis. High-level image visuals are represented as feature vectors of numerical values. Research shows that there is a significant gap between image feature representation and human visual understanding (Latif et al. 2019). Thus, feature selection in imaging systems depends on the requirements of crop monitoring, while feature representation remains a research task in its own right.
Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large amount of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping (Waldchen and Mader 2018). ML approaches can be deployed in identification, classification, and prediction, such as SVM, neural networks (NNs), kernel methods, and instance-based approaches (Singh et al. 2016). Recently, deep learning (DL), a subset of ML approaches, has emerged as a versatile tool to assimilate large amounts of heterogeneous data and provide reliable predictions of complex and uncertain phenomena (Liu et al. 2017). These tools are increasingly being used in extracting crop features and identifying symptoms of crop growth status (Singh et al. 2018).
8.3.2 Imaging Technologies Used in Crop Detection
Imaging technologies play an important role in crop sensing. The great majority of the sensors are designed based on either solid-state technology, such as CCD (charge-coupled device) and CMOS (complementary metal-oxide-semiconductor) chips used in optical imagers, or avalanche photodiodes, like InGaAs (indium gallium arsenide) and single-photon avalanche diodes (Toth and Joźkow 2016). An appropriate device should be selected to satisfy the needs of each application. In general, the most important factors to consider are sensor resolution, frame rate, and price (Pajares et al. 2016).
Considering the diverse cameras available on the market, several imagers in common use are listed in Table 8.5. Imaging technologies used in near-ground crop detection can be divided into four types: digital color imaging to capture RGB images, 3D imaging to measure depth or spatial distribution, spectral imaging, and thermal imaging. Color cameras are simple and affordable, so RGB images are extensively used in crop sensing tasks such as recognizing weeds, measuring plants, and detecting diseases and pests in the field (Garcia et al. 2017; Yang et al. 2014; Jiang et al. 2018; Ferreira et al. 2019; Zong et al. 2019; Knoll et al. 2019).
Although a stereovision system can measure 3D data, ToF imaging methods such as LiDAR and photonic mixer devices (PMD) are popular because of their robustness to environmental influences (Knoll et al. 2016a). A typical LiDAR sensor emits pulsed light waves into the surrounding environment. These pulses bounce off surrounding objects and return to the sensor, and the time for each pulse to return is measured. The sensor uses this time to calculate the distance between itself and the object. Repeating this process millions of times per second creates a precise, real-time 3D map of the environment. LiDAR sensors are used in phenotype measurement, for instance of height and biomass (Tilly et al. 2015). Moreover, some low-cost 3D cameras are also applied in crop sensing, such as the Kinect (Microsoft, USA) and RealSense (Intel, USA). They provide flexible tools for weed identification and fruit recognition (Sa et al. 2016; Kang and Chen 2020).
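The distance computation a LiDAR sensor performs for each returned pulse can be sketched in a few lines; the pulse timing below is an illustrative value, not from any particular sensor:

```python
# Time-of-flight ranging: the pulse travels to the target and back,
# so distance = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a target given the round-trip time of one light pulse."""
    return C * round_trip_time_s / 2.0

# A pulse returning after ~66.7 ns corresponds to a target about 10 m away.
print(round(tof_distance(66.7e-9), 2))
```

Repeating this per pulse while scanning the beam direction yields the point cloud from which height and biomass metrics are derived.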
Imaging spectrometers collect images as well as spectra from the observed crop. Nowadays, a wide range of imaging spectrometers are used on different platforms, including stationary or handheld near-ground platforms and unmanned aerial vehicle (UAV) platforms. Imaging spectral instruments have been widely used in crop detection for crop classification, disease identification, nutrient estimation (chlorophyll, water, nitrogen content), and so on (Zhu et al. 2019; Yue et al. 2018; Huang et al. 2017; Liu et al. 2018a, b; Zheng et al. 2018). In addition, thermal sensors are used in drought estimation because of the close relationships among temperature, water stress, and the environment (Maimaitijiang et al. 2020).
8.3.3 Development of Imaging Systems for Crop Detection
The results of previous research have provided basic principles for the development of optical sensing to acquire spectral information in the field. Spectroscopy analysis and image processing are applied as rapid, convenient, and nondestructive techniques for crop growth monitoring. The Research Center for Precision Agriculture at China Agricultural University (CAU) has developed three kinds of multispectral imagery systems for crop monitoring (Wu et al. 2015; Sun et al. 2019b; Liu et al. 2020). In general, each system includes a multispectral camera device and controlling software. The multispectral camera is designed to measure multispectral images of the crop canopy in three visible bands (red [R], green [G], and blue [B]) and a NIR band. The software is developed to control the camera system. Furthermore, estimating models of crop parameters are embedded in the system, so that it can provide an online device and method for crop sensing.
8.3.3.1 A Two-CCD-Based Imaging System for Crop Measurement
A two-CCD-based imaging system was designed for crop measurement, which included a multispectral image acquisition device, a communication protocol converter, and a controlling platform (Wu et al. 2015). A multispectral two-channel CCD camera (JAI Ltd., Denmark) was used, which included a splitter prism with two reflecting mirrors to split input light into visible and NIR bands. Two CCD sensors could obtain four images in three visible bands (400–700 nm, R, G, and B) and one NIR band (760–1000 nm) at the same time. The Camera Link communication protocol standard was adopted to output RGB and NIR images with 1024(h) × 768(v) active pixels per channel. Communication between the camera and computer was conducted by a QuadMVCL2GE converter (Beijing Microview Science and Technology Co., Ltd., China), which converted the camera output from Camera Link into the GigE Vision standard. The highest output bandwidth was 960 Mbps. In the research, a panel industrial control computer (PPC-3708, Beijing Weidatong Co., China) was used as the system platform. The main functions included a multispectral camera control module, an image acquisition module, and a multispectral image processing module. When the system was connected, it worked through image acquisition, data conversion, and image display and storage. The multispectral image could be displayed and stored in RAW, BMP, and JPG formats.
An image processing model was developed with three main functions: image enhancement, image segmentation, and parameter calculation (Sun et al. 2013). The developed system was applied in the chlorophyll content estimation of tomato. Multispectral images were collected and the SPAD values of tomato leaves were measured. More than 80 pairs of RGB and NIR images were acquired in the experiment. They were first processed by the median filtering algorithm to eliminate the noise and then segmented from the background. Figure 8.5a, b shows a pair of RGB and NIR images, and the segmented results are shown in Fig. 8.5c, d, respectively. The average gray values of each image were calculated to get the VIs of tomato canopy. The correlation analysis results indicated that the highest correlation coefficient was 0.7514 between RVI and SPAD values.
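As a sketch of the VI computation described above, the canopy-level RVI can be taken as the ratio of mean NIR to mean red gray values inside the segmented canopy mask; the arrays below are synthetic stand-ins for real images:

```python
import numpy as np

def canopy_rvi(red_img: np.ndarray, nir_img: np.ndarray, mask: np.ndarray) -> float:
    """RVI (NIR/red) from average gray values inside the canopy mask."""
    return float(nir_img[mask].mean() / red_img[mask].mean())

# Synthetic 4x4 "images" with a full canopy mask.
mask = np.ones((4, 4), dtype=bool)
red = np.full((4, 4), 50.0)
nir = np.full((4, 4), 150.0)
print(canopy_rvi(red, nir, mask))  # -> 3.0
```

The RVI computed this way per image pair is what was correlated with the measured SPAD values.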
8.3.3.2 A Portable Binocular Sensor for Crop Monitoring
The NDVI calculated from spectral reflectance has proved to be one of the most important parameters for estimating crop growth parameters quickly and nondestructively. Thus, measuring the NDVI distribution is an important research direction for sensor development (Sun et al. 2019b). Unlike the two-CCD-based imaging system, which acquires RGB and NIR images synchronously, some low-cost binocular vision systems can also be used to collect RGB and NIR images. The biggest challenge in using such binocular systems is image matching; once the RGB and NIR images are accurately registered, the NDVI distribution and dynamics of crops can be monitored with high accuracy.
In order to develop a portable multispectral imaging system for crop monitoring, an FM830-5 M device (Shanghai Percipio Information Technology Co., Ltd., China), which had an RGB camera and two NIR cameras, was used to acquire RGB and NIR images of corn. The RGB and one of the NIR cameras were used to develop a binocular sensor for crop monitoring. Images of RGB and NIR were processed following preprocessing, image matching, segmentation, and image reflectance correction. The flowchart is shown in Fig. 8.6.
The acquired images were calibrated and preprocessed. First, the RGB image was preprocessed: its edges and texture were enhanced by the Laplace transform, and the light saturation removal (LSR) algorithm was used to improve image quality. Second, a median filter was used to eliminate the salt-and-pepper noise of the images.
In order to compare the performances of different image-matching methods, images of 51 maize plants were collected synchronously by the binocular vision system at viewing angles of 90°, 54°, and 35°. Three algorithms, SURF (Speeded-Up Robust Features), SIFT (Scale-Invariant Feature Transform), and ORB (Oriented FAST and Rotated BRIEF), were applied and compared for RGB-NIR image matching. The optimal method, judged by matching time, PSNR (peak signal-to-noise ratio), MI (mutual information), and SSIM (structural similarity index), was SURF.
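Of the comparison metrics named above, PSNR is the simplest to reproduce. A minimal implementation for 8-bit images might look like this (a sketch, not the authors' code):

```python
import numpy as np

def psnr(img1: np.ndarray, img2: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two images, in dB."""
    mse = np.mean((img1.astype(float) - img2.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

a = np.zeros((8, 8), dtype=np.uint8)
b = a + 10  # uniform offset of 10 gray levels -> MSE = 100
print(round(psnr(a, b), 2))  # -> 28.13
```

Higher PSNR between the warped NIR image and the RGB reference indicates a better-aligned match.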
The crops were segmented from the background using the ExG (Excess Green) index and Otsu's maximum between-class variance algorithm. The R, G, B, and NIR components of the segmented images were extracted. Then, the NDVI of each pixel was calculated, and the spatial distribution map of the crop VI was drawn. SPAD values were estimated at pixel level. The regression model of SPAD values against NDVI had a determination coefficient of 0.619. The sensor application and results are demonstrated in Fig. 8.7.
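A numpy-only sketch of this segmentation-then-NDVI step, assuming normalized reflectance values and replacing Otsu's automatic threshold with a fixed one for brevity:

```python
import numpy as np

def exg(r, g, b):
    """Excess Green index: 2G - R - B."""
    return 2.0 * g - r - b

def ndvi(red, nir, eps=1e-9):
    """Per-pixel NDVI = (NIR - red) / (NIR + red)."""
    red, nir = red.astype(float), nir.astype(float)
    return (nir - red) / (nir + red + eps)

# Two example pixels: one vegetation-like, one soil-like.
r = np.array([[0.2, 0.5]]); g = np.array([[0.6, 0.4]])
b = np.array([[0.1, 0.5]]); nir = np.array([[0.8, 0.5]])
veg_mask = exg(r, g, b) > 0.2   # fixed stand-in for Otsu's threshold
ndvi_img = ndvi(r, nir)
print(veg_mask.tolist(), np.round(ndvi_img, 3).tolist())
```

The NDVI map masked by `veg_mask` gives the per-pixel distribution from which the SPAD regression is built.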
8.3.3.3 A Portable Multispectral Sensor for Crop Measurement
A 25-wavelength spectral imaging sensor (model: XIMEAI-5 × 5-CMOS, Shanghai Branch of IMEC Microelectronics Co., Ltd., China) was used to develop a multispectral system for crop measurement, as shown in Fig. 8.8a (Liu et al. 2020). The filter of this sensor was processed on the wafer of a commercial CMOS image capture chip in a mosaic layout. A specific spectral filter covers each pixel, and the 25 wavelengths are placed on the CMOSIS CMV2000 sensor with two million pixels. The sensor obtains spectral information at the following 25 wavelengths: 666, 681, 706, 720, 732, 746, 759, 772, 784, 796, 816, 827, 837, 849, 859, 869, 887, 888, 902, 910, 920, 926, 935, 940, and 945 nm. It has a field of view (FOV) of 50°. The image size at each wavelength is 409 pixels × 217 pixels, and the grayscale resolution is 10 bits.
In order to realize the real-time detection of SPAD values of potato plants in the field, a control software program was developed based on the Qt Creator 4.9.1 platform under a Windows operating environment. The user interface shown in Fig. 8.8b was designed based on the Qt Widgets application. The image processing functions were realized by calling the OpenCV libraries. The main functions of the software included the following: spectral image collection, exposure time adjustment, spectral image correction, SPAD value pseudo-color expression, SPAD value statistics, and image saving.
The spectral sensor and control software together comprised the real-time SPAD value detection system. The reflectance of potato plants was extracted using segmented mask images. Partial least squares (PLS) regression was employed to establish the SPAD value detection model based on sensitive variables selected with the uninformative variable elimination (UVE) algorithm. Finally, the visualized distribution map of SPAD values was drawn using pseudo-color processing.
8.4 Remote Sensing Platforms for Crop Monitoring
8.4.1 Remote Sensing Instruments Used in Crop Monitoring
Unlike the spectral sensors introduced above, remote sensing spectrometers usually operate in Earth observation, capturing images as well as spectra from the observed materials. Images in wavebands make it possible to locate and extract plants from the background by image processing and to derive numerous VIs. Great efforts have been made over the past decades to produce high-quality remote sensing data by developing a wide range of imaging spectrometers on aerial/satellite platforms (Paoletti et al. 2019). Compared with near-ground platforms such as UAVs and stationary or handheld devices, which focus on specific fields or plants in small areas (Wang et al. 2019; Han et al. 2019), aerial and satellite remote sensing for Earth observation is suitable for monitoring large farmland and ecosystems.
These instruments can be classified as multispectral or hyperspectral devices according to the number of bands. A multispectral image contains from several to about a dozen bands, while a hyperspectral image (HSI) contains hundreds to thousands of contiguous wavelengths (Mishra et al. 2017). Several systems, shown in Table 8.6, are most used in aerial and satellite remote sensing. Similar to the spectral images mentioned before, the features extracted from remote sensing data include color features, texture presentation, shape and spatial description, and Vis-NIR spectral features.
8.4.2 Application of Multispectral Remote Sensing
Traditional satellite sensors such as SPOT and Landsat have long been used in crop sensing. The SPOT Vegetation sensor was carried aboard SPOT 4 and 5, launched in 1998 and 2002, respectively. It was capable of imaging the entire Earth each day with an IFOV of 1.15 km (https://eos.com/landsat-5-tm/). SPOT Vegetation collected data in four spectral bands: 0.43–0.47 μm, 0.61–0.68 μm, 0.78–0.89 μm, and 1.58–1.75 μm (Cayrol et al. 2000).
Landsat Thematic Mapper (TM) was a multispectral scanning radiometer carried on board Landsat 4 and 5. The TM sensors provided nearly continuous coverage from July 1982 to June 2013. A TM scene had an instantaneous field of view (IFOV) of 30 m × 30 m in the visible (0.45–0.52 μm, 0.52–0.60 μm, 0.63–0.69 μm), NIR (0.76–0.90 μm), and SWIR (2.08–2.35 μm) bands, while the 10.41–12.5 μm band had an IFOV of 120 m × 120 m on the ground. The Landsat Enhanced Thematic Mapper Plus (ETM+) was introduced with Landsat 7 (https://eos.com/landsat-7/) and was built by Raytheon SBRS (Santa Barbara Remote Sensing), Goleta, CA. In addition to the visible and NIR bands of TM, ETM+ also scans SWIR (1.57–1.75 μm, 2.09–2.35 μm), thermal infrared (10.40–12.50 μm), and panchromatic (PAN) (0.52–0.90 μm) bands.
The Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) are instruments onboard the Landsat 8 satellite (https://eos.com/landsat-8/). OLI, built by the Ball Aerospace & Technologies Corporation, measures in the visible, NIR, and SWIR portions of the spectrum. Landsat 8 thus has nine spectral bands at 30-m spatial resolution, including a PAN band: visible (0.43–0.45 μm, 0.45–0.51 μm, 0.53–0.59 μm), red (0.64–0.67 μm), NIR (0.85–0.88 μm), SWIR (1.57–1.65 μm, 2.11–2.29 μm), panchromatic (PAN) (0.50–0.68 μm), and cirrus (1.36–1.38 μm). It also has two thermal infrared bands, 10.6–11.19 μm and 11.5–12.51 μm, at 100-m spatial resolution.
Using satellite remote sensing to understand maize yield gaps in the North China Plain, with Quzhou County as an example, Zhao et al. (2015) used Landsat 5 TM, Landsat 7 ETM+, and SPOT 4 satellite data during the summer maize growing seasons from 2007 to 2013, except 2008 and 2011 when high-quality cloud-free images were lacking. To resolve the spatial differences between SPOT 4 and Landsat data, Landsat images were resampled to 20-m resolution using the nearest-neighbor method. The results indicate that remote sensing can provide reasonably reliable estimates of maize yields in this region. In addition, the majority of the yield gap is dominated by transient factors, and shrinking this gap may require high-quality forecasts to inform optimal management decisions.
Satellite remote sensing has also been used in crop classification and disease monitoring. Zhong et al. (2019) used data from Landsat 7 ETM+ and Landsat 8 OLI at 30-m resolution to classify summer crops. Two types of deep learning models were designed using Landsat Enhanced Vegetation Index (EVI) time series. Three widely used classifiers were also tested for comparison: a gradient boosting machine called XGBoost, Random Forest, and SVM. Among the non-deep-learning classifiers, XGBoost achieved the best result, with 84.17% accuracy and an F1 score of 0.69. The deep learning models examine the shapes of the EVI time series at various scales in a hierarchical manner. Ma et al. (2019) discriminated winter wheat powdery mildew and aphid infestations during a co-epidemic outbreak of the disease and the insect pest in northeast China based on temporal Landsat 8 imagery integrated with crop growth and environmental parameters.
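The EVI time series underlying these classifiers follows the standard Landsat/MODIS formula on surface reflectance; a minimal implementation (the reflectance values below are illustrative):

```python
import numpy as np

def evi(nir, red, blue):
    """Enhanced Vegetation Index: 2.5 * (NIR - R) / (NIR + 6R - 7.5B + 1)."""
    nir, red, blue = (np.asarray(a, dtype=float) for a in (nir, red, blue))
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

print(round(float(evi(0.5, 0.1, 0.05)), 3))  # -> 0.58
```

Computed per acquisition date, such values form the per-pixel time series the classifiers consume.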
Satellite monitoring systems can notify users of critical changes in vegetation, send real-time weather risk alerts, and automate prioritization within field work planning. Together, these capabilities make it possible not to miss important points in the treatment of fields and to respond in a timely manner to any changes. So far, researchers have implemented agricultural projects for monitoring fields, classifying crops, identifying growth and stress status, and forecasting crop yields (Zhou et al. 2019; Shen et al. 2019).
8.4.3 Application of Hyperspectral Remote Sensing
Advances in sensing and computer technologies have greatly improved hyperspectral image data acquisition. A number of HSI missions for Earth observation have been launched and provide new tools for satellite remote sensing, such as the NASA Hyperspectral Infrared Imager (HyspIRI), the Environmental Mapping and Analysis Program (EnMAP), and the Precursore IperSpettrale della Missione Applicativa (PRISMA) program (Paoletti et al. 2019). Meanwhile, several instruments are used to capture great volumes of HSI data by airborne remote sensing. As shown in Table 8.3, some of the best-known spectrometers are available for crop sensing.
The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), developed by the Jet Propulsion Laboratory (JPL) (Pasadena, California, USA), was a hyperspectral imaging sensor that delivered calibrated images of upwelling spectral radiance in 224 contiguous spectral bands with wavelengths from 400 to 2500 nm (http://aviris.jpl.nasa.gov/). Moreover, the Airborne Visible Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) sensor samples 430 contiguous bands between 380 nm and 2510 nm at approximately 5-nm spectral resolution.
Nagasubramanian et al. (2017) identified the disease charcoal rot in soybean crops using AVIRIS hyperspectral data. In the range of 383–1032 nm, they developed a 3D convolutional neural network (CNN) model for soybean charcoal rot disease identification. The classification accuracy was 95.73%, and the infected-class F1 score was 0.87. Salas et al. (2020) derived a set of narrow-/broadband indices from AVIRIS-NG imagery to represent spectral variations and identify target classes and their distribution patterns. The results showed that the maximum entropy (MaxEnt) and generalized linear model (GLM) had strong discriminatory image classification abilities, with area under the curve (AUC) values ranging between 0.75 and 0.93 for MaxEnt and between 0.73 and 0.92 for GLM. It was also found that the Photochemical Reflectance Index (PRI) and Moment Distance Ratio Right/Left (MDRRL) were important predictors for target classes such as wheat, legumes, and eggplant.
The Compact Airborne Spectrographic Imager 1500 (CASI 1500), designed by ITRES Research Ltd. (Calgary, Alberta, Canada), acquires data in the 380–1050 nm range, splitting light into 288 discrete bands. It was used to obtain images over a field set up to study the effects of various nitrogen application rates and weed control on corn (Goel et al. 2003). The results indicated that the reflectance of corn was significantly influenced (α = 0.05) at certain wavelengths by the presence of weeds, the nitrogen rates, and their interaction. Differences in response due to nitrogen stress were most evident at 498 nm and 671 nm.
In addition, the HyMap scanner, built by Integrated Spectronics Pty Ltd. of Sydney, Australia, has four spectrometers covering 450–2450 nm, excluding the two major atmospheric water absorption windows. Research was conducted on estimating foliage nitrogen concentration from HyMap data using continuum-removal analysis (Huang et al. 2004), which identified the known nitrogen absorption features. The coefficient of determination increased from 0.65, using standard derivative analysis, to 0.85 with continuum-removal analysis. Mewes et al. (2011) indicated the potential to detect wheat disease induced by a pathogen infection. At the original spectral resolution of HyMap, the highest classification accuracy was obtained using 13 spectral bands, with a Kappa coefficient of 0.59.
In summary, imaging spectrometers are of increasing importance for agricultural applications, particularly for the support of crop sensing that increases the productivity of crop stands (Zhou et al. 2019; Shen et al. 2019). However, to define an optimal sensor-based system or a data product designed for crop detection, it is necessary to know which spectral wavelengths are representative and which spectral resolution is needed. The methods of data processing also face the challenges from different instruments and requirements. Hence, research may involve data fusion and modelling supported by machine learning and even deep learning.
8.5 Precision Crop Management Based on Sensing Instruments
Spectroscopy and imaging sensors have been widely used to support precision agriculture by providing information for crop management (Zhang et al. 2018). They offer automated solutions for object recognition and detection in crop production, combining machine vision with machine learning and deep learning algorithms (Gomes and Leta 2012; Kamilaris and Prenafeta-Boldú 2018). More and more agricultural robots have been developed based on crop sensing instruments and processing methods. They are used in tasks traditionally performed manually, where manual methods are tedious and error-prone. Some recent advances in crop sensors are applied to precision management in the field, including variable-rate sprayers for fertilizers and weed control and field-based crop phenotyping (Patricio and Rieder 2018).
8.5.1 Applications of Spectroscopy-Based Crop Sensors
8.5.1.1 Classification of Weeds and Damage Caused by Disease and Pests
Since the reflectance of crops, weeds, and soil differs in the visible and NIR wavelengths, there is potential to distinguish them by spectral reflectance at different wavelengths. Vrindts et al. (2002) measured canopy reflectance of sugar beet, maize, and weeds with a line spectrograph (480–820 nm). Four wavelengths (572.7, 676.1, 801.4, and 814.6 nm) were selected to separate sugar beet and weed plants. The overall classification accuracy was over 90%, whereas classification of maize and weeds was poor, with only 15% accuracy. Shirzadifar et al. (2018) selected bands around 1078, 1435, 1490, and 1615 nm to identify kochia, water hemp, and lamb's-quarters.
In order to design an optical weed sensor, sensitive wavelengths within the visible and NIR bands (496, 546, 614, 676, and 752 nm) were selected based on the spectral differences between stems and leaves of various crops and weeds (Wang et al. 2001). A partial least-squares (PLS) calibration model was established from these wavelengths and their VIs. The designed instrument with the embedded model identified wheat, bare soil, and weeds with classification rates of 100%, 100%, and 71.6%, respectively, for the training data set when the weed density was above 0.02 plants/cm². Sui et al. (2008) developed a ground-based weed mapping system to measure weed intensity and distribution in a cotton field; connected to a WeedSeeker sensor, it directly output canopy coverage and intensity ratio.

The changes in leaf pigments and biochemical components caused by fungal infection or pest damage influence the spectral characteristics of leaves, so the spectral differences between healthy and damaged leaves can be used to identify plant health status. Various VIs are used in monitoring plant diseases and pests, such as the NDVI, GNDVI, and OSAVI (Zhang et al. 2012, 2019a). Based on fluorescence spectra, some studies applied the ratio of fluorescence amplitudes at the fluorescence peaks (e.g., F686/F740) to achieve presymptomatic detection of some pathogens (Bürling et al. 2012). Parameters associated with the saturation pulse method can be used to evaluate changes in the affected pigments, such as the maximum quantum efficiency of photosystem II (PSII) primary photochemistry (Fv/Fm), the maximum efficiency of PSII photochemistry in light-adapted material (Fv′/Fm′), and non-photochemical quenching (NPQ). Besides Vis-NIR and fluorescence spectroscopy, thermal observation provides an indicator of the temperature changes of stressed plant canopies.
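The band-ratio indices mentioned above have standard reflectance-band definitions; a minimal sketch (OSAVI is written with its usual soil-adjustment constant of 0.16, and the reflectance values are illustrative):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI: the green band substitutes for red."""
    return (nir - green) / (nir + green)

def osavi(nir, red, soil_adj=0.16):
    """Optimized Soil-Adjusted Vegetation Index."""
    return (nir - red) / (nir + red + soil_adj)

nir, red, green = 0.45, 0.05, 0.10
print(round(ndvi(nir, red), 3),
      round(gndvi(nir, green), 3),
      round(osavi(nir, red), 3))
```

Monitoring applications track how such index values shift between healthy and stressed canopies.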
Sensitive features are important for the detection of diseases or pests. Naidu et al. (2009) discussed the spectral characteristics of grapevines infected by grapevine leafroll disease (GLD). The spectral differences between healthy and infected leaves are located around the green (near 550 nm), NIR (near 900 nm), and SWIR (near 1600 and 2200 nm) bands. Classification models were built on the sensitive wavelengths (531, 570, 752 nm, etc.) and VIs (NDVI, RVSI, PRI, etc.). Moreover, compared with the linear regression result of 0.72 from RVSI alone, the accuracy increased to 0.78 when RVSI was combined with the reflectance in the blue band (470–490 nm) and at 526 nm. In the same study, a classification accuracy of 0.75 was obtained with variables combining PRI with the 765–830, 970, and 684 nm bands.
Similarly, Annamalai and Lee (2004) investigated the spectral signatures of immature green citrus fruit and leaves in order to develop a spectrally based fruit identification and early yield mapping system. Diffuse reflectance of fruit and leaf samples was measured in the range of 400–2500 nm, and two important wavelengths, 815 and 1190 nm, were selected. The ratio of these two wavelengths was used to distinguish immature green fruit from leaves. Other researchers studying leaf miner damage, bacterial spots, and yellow rust of crop leaves have examined the sensitivity of spectral responses and characteristics and established identification models by partial least squares (PLS) regression, stepwise multiple linear regression (SMLR), support vector machines (SVM), and so on (Moshou et al. 2014). Recently, more and more statistical analysis and machine learning modeling methods have been applied. Deep learning, a subset of machine learning, has also been applied to select features or to build end-to-end architectures for discriminant analysis.
8.5.1.2 Monitoring of Nutrient Content and Biomass Status
Crop growth status is generally evaluated by nutrient content and biomass level: the contents of chlorophyll, nitrogen, and water relate to the nutrient level, and biomass is generally estimated by the leaf area index (LAI), the one-sided leaf area per unit ground area. Estimating crop growth parameters by spectroscopy helps to guide fertilizer and irrigation management and to predict yield in the field.
Chlorophyll measurement has always been a priority of considerable research because chlorophyll is the organic molecule of plant leaves responsible for photosynthesis and is highly related to leaf nitrogen in the 400–700 nm spectral range (Ulissi et al. 2011). Using the spectral features shown in Table 8.1, many researchers estimate chlorophyll content from sensitive wavelengths, VIs, red-edge location, and other features. Ciganda et al. (2009) constructed a red-edge chlorophyll index with red-edge (720–730 nm) and NIR (770–800 nm) spectral reflectance. Chen et al. (2010) proposed a new spectral indicator, the Double-peak Canopy Nitrogen Index (DCNI), for maize nitrogen estimation. Schlemmer et al. (2013) indicated that chlorophyll content could be accurately retrieved using green and red-edge chlorophyll indices from bands located in the NIR (780–800 nm) and either the green (540–560 nm) or red edge (730–750 nm). Rossini et al. (2012) estimated chlorophyll using a suite of VIs and found a high correlation of over 0.8 between leaf chlorophyll content and narrowband spectral indices. Sonobe et al. (2018) showed that shading treatment of a crop lowered reflectance near the wavelengths of 550 and 740 nm. Two methods, machine learning algorithms and the inversion of a radiative transfer model, were evaluated using measurements from tea leaves. Overall, the kernel-based extreme learning machine had the highest performance, with a root mean square error (RMSE) of 3.04 ± 0.52 μg cm−2 and ratio of performance to deviation (RPD) values from 3.38 to 5.92 for the test set.
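The red-edge chlorophyll index of Ciganda et al. is commonly written as NIR reflectance divided by red-edge reflectance, minus one; a sketch with illustrative reflectance values:

```python
def ci_red_edge(nir_refl: float, red_edge_refl: float) -> float:
    """Red-edge chlorophyll index: (NIR / red-edge) - 1."""
    return nir_refl / red_edge_refl - 1.0

# Mean reflectance in 770-800 nm (NIR) and 720-730 nm (red edge).
print(ci_red_edge(0.50, 0.25))  # -> 1.0
```

Higher chlorophyll deepens red-edge absorption relative to the NIR plateau, raising the index.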
The molecular absorption of hydrogen-containing groups (O-H, N-H, C-H) makes it possible to measure moisture content nondestructively (Cheng et al. 2011). Although water absorption in the infrared region has spectral centers at 970, 1200, 1440, and 1950 nm (Palmer and Williams 1974), various researchers have proposed different wavelengths owing to the influences of species, phenology, environmental stress, and so on. Dejonge et al. (2016) established a diagnosis model of corn water content to guide field irrigation using the NDVI, OSAVI, and GNDVI. Among these VIs, the NDVI showed the best performance, with the highest R² and a slope almost equal to 1. Thus, the ratio of water-stressed to non-stressed NDVI was set as an irrigation trigger with a threshold value of 0.93.
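The trigger logic described above reduces to a threshold test on the stressed-to-reference NDVI ratio (a sketch; the 0.93 threshold comes from the cited study, while the NDVI values are illustrative):

```python
def needs_irrigation(ndvi_stressed: float, ndvi_reference: float,
                     threshold: float = 0.93) -> bool:
    """Trigger irrigation when the NDVI ratio drops below the threshold."""
    return (ndvi_stressed / ndvi_reference) < threshold

# 0.60/0.70 ~= 0.86 (below threshold), 0.68/0.70 ~= 0.97 (above).
print(needs_irrigation(0.60, 0.70), needs_irrigation(0.68, 0.70))  # -> True False
```

The reference NDVI would come from a well-watered plot or model, so the ratio normalizes out weather and growth-stage effects.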
In addition, spectroscopy methods can be used to invert some biomass parameters and indirectly calculate LAI. Besides the NDVI, other spectral indices have also been presented in recent research. Ray et al. (2006) found that the VIs NDVI and SAVI (Soil-Adjusted Vegetation Index), calculated from bands in the 680–780 nm range, produced the highest correlation coefficients with LAI. Han et al. (2016) built a model to predict the LAI of apple tree canopies by comparing SVM and random forest (RF) algorithms. Some VIs used in the RF regression model agreed with LAI in the full fruit period, including the GNDVI, NDIVI, RVI, and GRVI. Besides the Vis and NIR regions, Neinavaz et al. (2016) conducted research in the thermal infrared region (TIR) and found that canopy emissivity spectra increased with rising LAI.
In particular, LAI can also be measured with a dedicated optical instrument, the LAI-2000 Plant Canopy Analyzer (LI-COR Biosciences, USA). It estimates LAI from canopy gap fraction measurements, which can be overestimated if taken when foliage is brightly lit (Han et al. 2016).
According to the studies mentioned above, the main methods comprise data preprocessing, sensitive parameter selection, and estimation modelling. The capabilities and performance of spectroscopy have been explored for crop sensing. However, the data processing methods and results differed between studies, indicating that the sensors and algorithms used may influence the application significantly. Researchers will face further challenges in sensor integration and data fusion.
8.5.2 Applications of Imaging-Based Crop Sensors
8.5.2.1 Application of Ground-Based Imaging Instruments
8.5.2.1.1 Classification of Crops and Weeds
Focusing on the recognition of field weeds by different imaging sensors, Knoll et al. (2016a) used a time-of-flight (ToF) sensor, the CamCube 3, to create depth images with a resolution of 204 × 204 pixels. In addition, more sensors were mounted on a field robot named BoniRob, including a bispectral JAI camera, a Nikon D5300 camera, a Kinect II, and a laser scanner (Knoll et al. 2016b, c). The bispectral JAI camera (JAI Ltd., Denmark) uses one lens for two cameras (RGB and IR) with 1296 × 966 pixels. The Nikon D5300 captures RGB images with a resolution of 6000 × 4000 pixels. The Kinect II records a color image of 1920 × 1080 pixels and an infrared image of 512 × 424 pixels; its ToF technology also provides a depth image of 512 × 424 pixels. In this research, the best performances were provided by the JAI and Nikon cameras. As a result, two VI determination methods based on RGB images were proposed by extracting color features in the RGB and HSV (hue, saturation, value) spaces (Knoll et al. 2016b).
However, the results are disturbed by influences such as weather, the various stages of growth, the large number of different weeds, and differing soil conditions. To eliminate these influences, a self-learning convolutional neural network was used for weed recognition in the field; this deep-learning approach achieved an accuracy of over 98% (Knoll et al. 2018). Similarly, a classification model for weeds in organic carrot production was proposed using a convolutional neural network (CNN) to help in weed management (Knoll et al. 2019). Several other proposed methods also indicated that deep learning can extract high-level features from images to improve classification accuracy (Asad and Bais 2019; Peng et al. 2019).
8.5.2.1.2 Identification of Specialty Fruits
Harvesting of specialty fruits such as apples, citrus, cherries, and pears is highly labor intensive and is becoming less sustainable with increasing cost and decreasing availability of a skilled labor force (Gongal et al. 2015).
To help harvesting and yield prediction of specialty fruits, Li et al. (2014a) used a digital SLR camera (EOS Rebel T2i, Canon Inc., Japan) with an 18–55 mm lens to collect RGB images of field blueberry with 3648 × 2736 pixels. They selected three color components, red (R), blue (B), and hue (H), to separate fruits of four maturity stages from the background with different classifiers. Performance was compared among the K-nearest neighbor (KNN), naïve Bayesian classification (NBC), and supervised K-means clustering (SK-means) classifiers. In this work, the KNN classifier yielded the highest classification accuracy (85–98%) on the validation set.
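A KNN classifier over such (R, B, H) color features can be sketched in a few lines; the feature vectors below are hypothetical, not taken from the study:

```python
from collections import Counter
import math

def knn_classify(sample, train, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance). `train` is a list of
    (features, label) pairs - a minimal sketch of a KNN classifier
    over (R, B, H) color features."""
    nearest = sorted(train, key=lambda t: math.dist(sample, t[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical (R, B, H) feature vectors for two maturity classes:
train = [((200, 40, 0.02), "mature"), ((190, 50, 0.03), "mature"),
         ((90, 140, 0.60), "green"), ((80, 150, 0.55), "green"),
         ((95, 135, 0.58), "green")]
print(knn_classify((85, 145, 0.57), train))  # green
```

In practice the color channels would be scaled to comparable ranges before computing distances, since hue (0–1) is otherwise dominated by the 8-bit R and B channels.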
For immature green citrus fruit detection, Gan et al. (2018) built an imaging system to provide valuable information for yield estimation at earlier stages. The system consisted of two color cameras (USB 3.0, The Imaging Source, Charlotte, NC, USA) and a thermal camera (A655sc, FLIR, Wilsonville, OR, USA). Images from all three cameras had the same spatial resolution, 640 × 480 pixels, and very similar diagonal fields of view of about 30°. A new Color-Thermal Combined Probability (CTCP) algorithm was created to fuse information from the color and thermal images and classify potential image regions into fruit and non-fruit classes. The results showed that the fusion of color and thermal images effectively improved the accuracy of immature green citrus fruit detection. With the same aim, Okamoto and Lee (2009) used a hyperspectral camera covering 369–1042 nm to acquire images of green fruits of three citrus varieties (tangelo, Valencia, and Hamlin). Spatial image processing steps (noise reduction filtering, labeling, and area thresholding) were applied, and pixel identification tests showed detection success rates of 70–85%, depending on the variety.
8.5.2.1.3 Measurement of Crop Growth Status
Three-dimensional cameras have been used to obtain depth or position information. The Kinect camera has a normal webcam and a depth sensor, which together provide RGB-D images. The depth sensor consists of an infrared laser projector combined with a monochrome CMOS sensor and covers a range of 0.8–4.0 m. Sa et al. (2016) used a Kinect camera to capture RGB and NIR images, and a Faster Region-based CNN (Faster R-CNN) model was established to detect sweet peppers, improving the F1 score, which accounts for both precision and recall, from 0.807 to 0.838. Kang and Chen (2020) used a RealSense D-435 camera to collect RGB and depth images for apple detection in the orchard. In their experiments, DaSNet-v2 with ResNet-101 achieved 0.868 recall and 0.880 precision in fruit detection and 0.873 accuracy in fruit instance segmentation. In addition, it reached 0.794 accuracy in branch segmentation.
Although the measurement of plants is traditionally based on RGB images, plant appearance, especially geometry and topology, is more accurately represented in 3D space. Thus, 3D imaging instruments are increasingly used in crop phenotyping. LiDAR or laser sensors have been used to measure plant height and biomass because they adapt well to varying illumination and provide dense data. LiDAR has been adopted to measure the height and biomass of rice, oilseed rape, winter rye, winter wheat, and grassland (Tilly et al. 2015).
To estimate nutrient content, an AD-130 GE bispectral camera (JAI Ltd., Denmark) was also used to capture multispectral RGB and NIR images. Fifteen image parameters were extracted, including the average gray values of the images, VIs (NDVI, NDGI, RVI, DVI), and image texture parameters (energy, moment of inertia, correlation, entropy, etc.). An SVM model was established to support corn nutritional diagnosis and fertilization management decisions.
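The texture parameters named above are derived from the gray-level co-occurrence matrix (GLCM). A minimal single-offset sketch, not the cited implementation (production code would typically use a library such as scikit-image):

```python
import math

def glcm_features(img, levels=4, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset
    (dx, dy), plus three texture parameters named in the text: energy,
    moment of inertia (contrast), and entropy. `img` is a 2-D list of
    gray levels in [0, levels)."""
    glcm = [[0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for y in range(h - dy):          # count co-occurring gray-level pairs
        for x in range(w - dx):
            glcm[img[y][x]][img[y + dy][x + dx]] += 1
    total = sum(map(sum, glcm)) or 1
    p = [[v / total for v in row] for row in glcm]  # joint probabilities
    energy = sum(v * v for row in p for v in row)
    contrast = sum(p[i][j] * (i - j) ** 2
                   for i in range(levels) for j in range(levels))
    entropy = -sum(v * math.log2(v) for row in p for v in row if v > 0)
    return energy, contrast, entropy

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
energy, contrast, entropy = glcm_features(img)
print(round(energy, 3), round(contrast, 3), round(entropy, 3))  # 0.167 0.333 2.585
```

Each feature summarizes the co-occurrence distribution differently: energy rewards uniform texture, contrast weights gray-level differences, and entropy measures randomness.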
To evaluate the nitrogen content in oilseed rape (Brassica napus L.), Zhu et al. (2019) collected spectral images in the 900–1700 nm wavebands using a hyperspectral camera, the ImSpector N17E (Spectral Imaging Ltd., Oulu, Finland). A fast nitrogen content grade classification method for the oilseed rape canopy was established by employing a deep learning algorithm named stacked auto-encoders (SAEs). In this study, the SAE algorithm was introduced for dimensionality reduction and feature extraction from the hyperspectral images, and multiple classification models were then applied to test and validate the features obtained under different camera angles and feature units. Results showed that the best accuracy was achieved with data captured at a 25° angle.
The HyperSIS (Zolix Instruments Co., Ltd., Beijing, China) is a hyperspectral camera measuring the 369–988 nm band with a spectral resolution of 1.2 nm. It was used for nitrogen detection in longan plants (Yue et al. 2018). Initial features were extracted using principal component analysis (PCA) to identify a number of potential characteristic wavelengths (483, 518, 625, 631, 642, and 675 nm). Texture features based on the gray-level co-occurrence matrix (GLCM) were then extracted from the images. Combined with state-of-the-art deep learning, a distribution model of chlorophyll content for longan leaves based on convolutional neural networks (CNNs) and deep neural networks (DNNs) was proposed. As a result, the R2 values of the calibration and validation sets were 0.84 and 0.82, respectively.
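PCA-based selection of characteristic wavelengths can be sketched by ranking bands by the magnitude of their loadings on the first principal component. The power-iteration implementation and the toy spectra below are illustrative assumptions; the cited study's exact preprocessing and selection criterion are not given here:

```python
def first_pc(X, iters=200):
    """First principal component (unit loading vector) of a
    samples-by-bands matrix via power iteration on the covariance
    matrix (pure Python)."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]  # mean-center
    cov = [[sum(r[i] * r[j] for r in Xc) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):  # converges to the dominant eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical 3-band spectra in which band 0 carries most of the
# variance; its loading dominates the first PC, so it is "selected".
X = [[0.1, 0.50, 0.30], [0.4, 0.52, 0.31], [0.7, 0.49, 0.29],
     [0.2, 0.51, 0.30], [0.6, 0.50, 0.31]]
v = first_pc(X)
print(max(range(3), key=lambda j: abs(v[j])))  # 0
```

Bands with the largest absolute loadings contribute most to the dominant direction of spectral variation, which is one common heuristic for shortlisting characteristic wavelengths.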
For the detection of rice panicle blast disease, Huang et al. (2017) measured images in the 400–1000 nm band using a GaiaField-F-V10 (Spectral Imaging Ltd., Finland) spectrometer. A deep convolutional neural network model, GoogLeNet, was used to learn representations of the hyperspectral image data, and the proposed method achieved a classification accuracy of 92.0%. The same hyperspectral camera has also been used for nutrient monitoring of water or chlorophyll content (Liu et al. 2018a, b; Zheng et al. 2018).
8.5.2.2 Application of UAV-Based Imaging Instruments
With the development of remote sensing technology, UAVs can acquire farmland images quickly and conveniently, and UAV-based acquisition is gradually becoming an important means of, and research hotspot for, farmland information collection (Yang et al. 2014, 2017).
Different types of spectroscopic and image sensors for UAV have been developed, such as digital color sensors and multispectral/hyperspectral imaging sensors, further extending UAV-based remote sensing to various applications (Lu et al. 2019).
8.5.2.2.1 Crop Classification
Yang et al. (2014) designed a multispectral imaging system for agricultural remote sensing based on two identical consumer-grade digital color cameras. Each camera is equipped with a full-frame CMOS sensor with 5616 × 3744 pixels. One camera captures normal color images, while the other is modified to obtain NIR images. Images are stored as 14-bit RAW and 8-bit JPEG files on CompactFlash cards. The system has been applied in practice to estimate crop canopy cover, detect cotton root rot, and map henbit and giant reed infestations.
To classify different crops, Ferreira et al. (2019) used a DJI Phantom 3 Professional drone (DJI Technology Co. Ltd., China) to collect RGB images with 4000 × 3000 pixels to train a model, achieving 97% accuracy in discriminating grass from broadleaf weeds. With the same goal, Wang et al. (2019) mounted a multispectral camera on a composite-wing UAV to collect images of cotton, corn, and squash. The Micro MCA12 Snap (Tetracam, CA, USA) acquires images in 12 bands at 470, 515, 550, 610, 656, 710, 760, 800, 830, 860, 900, and 950 nm, with 1280 × 1024 pixels in each band. A CNN was designed to extract features and classify the crops. Compared with an SVM based on a radial basis kernel function and a backpropagation neural network, the optimized CNN had the best effect, with the highest classification accuracy of 97.75%.
8.5.2.2.2 Crop Detection
To meet the requirement of nutrient estimation, Qiao et al. (2019) estimated the chlorophyll content of maize. RGB images with a resolution of 7360 × 4912 pixels were collected using an ILCE-7R camera (Sony Corporation, Japan) mounted on a DJI M600 UAV platform. Parameters related to color and texture features were extracted from the images after canopy segmentation to reduce background influences. The established model had a determination coefficient of 0.76. A distribution map of chlorophyll content in the field maize canopy was drawn using a pseudo-color technique, providing a tool to visually distinguish field roads from the canopy area and showing the difference in chlorophyll distribution across the plot.
Potgieter et al. (2017) assessed the seasonal leaf area dynamics of sorghum breeding lines by using multispectral imaging from a UAV. A RedEdge™ narrowband multispectral camera (MicaSense Inc., USA) capturing five bands at specific wavelength peaks was fitted to the UAV platform. The bands captured were blue (B: 475 nm center wavelength, 20 nm bandwidth), green (G: 560 nm, 20 nm), red (R: 668 nm, 10 nm), red edge (RE: 717 nm, 10 nm), and near-infrared (NIR: 840 nm, 40 nm). The horizontal field of view was 47.2° with a 5.5 mm focal length, producing an image resolution of 1280 × 960 pixels. Good correlations were found between each VI (NDVI and EVI) and each growth parameter, such as plant number per plot, canopy cover, and LAI, both during the vegetative growth phase (pre-anthesis) and at maximum canopy cover shortly after anthesis. The NDRE, which is used to estimate leaf chlorophyll content, was also the most useful in characterizing the leaf area dynamics/senescence patterns of contrasting genotypes.
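The narrowband indices mentioned here follow standard definitions (EVI with the usual MODIS coefficients); the reflectance values below are hypothetical examples at the RedEdge band centers:

```python
def ndre(nir, rededge):
    """Normalized Difference Red Edge index."""
    return (nir - rededge) / (nir + rededge)

def evi(nir, red, blue):
    """Enhanced Vegetation Index with the standard coefficients
    G=2.5, C1=6, C2=7.5, L=1."""
    return 2.5 * (nir - red) / (nir + 6 * red - 7.5 * blue + 1)

# Hypothetical reflectances at the camera's band centers
# (B 475 nm, R 668 nm, RE 717 nm, NIR 840 nm):
print(round(ndre(0.45, 0.30), 3))        # 0.2
print(round(evi(0.45, 0.08, 0.04), 3))   # 0.567
```

Because the red-edge band saturates later than the red band, NDRE stays sensitive at high canopy chlorophyll levels where NDVI flattens out.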
Cen et al. (2019) discussed the use of a lightweight UAV with dual image-frame snapshot cameras to estimate the aboveground biomass (AGB) and panicle biomass (PB) of rice at different growth stages under different nitrogen (N) treatments. An RGB camera (NEX-7, Sony Corporation, Japan) with a spatial resolution of 6000 × 4000 pixels and a snapshot multispectral camera (CMV2K CMOS, IMEC, Leuven, Belgium) with a spatial resolution of 409 × 216 pixels, coupled with a three-axis gimbal, were mounted on the UAV. The multispectral camera covers 25 wavelengths in the spectral region of 600–1000 nm (679, 693, 719, 732, 745, 758, 771, 784, 796, 808, 827, 839, 84, 860, 871, 880, 889, 898, 915, 922, 931, 937, 944, 951, and 956 nm). It was found that the canopy height extracted from the crop surface model exhibited a high correlation with the ground-measured canopy height, and several VIs were highly correlated with AGB.
These applications show that imaging instruments are widely used on board UAVs to collect spectral and spatial information, enabling the generation of maps that indicate aspects of plant status. Owing to the availability of NIR wavelengths, multispectral images have also become an indispensable tool for evaluating physiological and biochemical parameters of plants, such as LAI, vegetation fraction, nitrogen (N) and chlorophyll status, net photosynthesis, and biomass.
8.5.3 Variable-Rate Fertilizer Management Based on Crop Sensors
8.5.3.1 Variable-Rate Fertilizer Mapping Based on Imaging Instruments
To further extend the functions of the crop growth detector, a WSN-based detection system was proposed to measure crop spectral characteristics on the go and in real time, as shown in Fig. 8.9. The controller was an industrial personal computer (IPC) with an attached ZigBee wireless communication module (JN5139). As the coordinator of the whole wireless network, it established the network, waited for sensor nodes to join, and received, displayed, and stored all the data from the different sensor nodes.
The measuring unit consisted of several optical sensors, each used as a sensor node in the WSN. Each sensor node consisted of an optical part and a circuit part; the optical part contained eight optical channels across four wavebands. Since the detection system used sunlight as the light source, the sunlight intensity had to be measured as a reference in addition to the light reflected from the crop canopy. Therefore, two solutions were put forward:
1. A full-function sensor node had to contain eight optical channels: four facing upward for the sunlight and four facing downward for the reflected light.
2. As shown in Fig. 8.9, one sensor node was selected to measure the sunlight (the type I sensor), and the other sensor nodes were used to measure the reflected light (the type II sensors).
As discussed above, an independent sensor node (type I) was selected to measure the sunlight, and the whole network then shared the sunlight data. Without sacrificing measurement precision, this design greatly reduced the cost of the system. Sensors and the controller can thus set up a communication network in many ways, and the networking mode can be switched between handheld and vehicle-mounted systems. The transmission distance can reach hundreds of meters, which enables real-time, continuous measurement of crops in the field and increases the flexibility of detector installation.
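The shared-sunlight scheme can be sketched as follows: the coordinator ratios each type II node's reflected-light readings against the single type I node's sunlight readings to obtain per-band reflectance, from which VIs such as NDVI follow. The band labels and sensor readings below are hypothetical:

```python
# Hypothetical four-waveband labels for the optical channels.
BANDS = ("550nm", "650nm", "766nm", "850nm")

def reflectance(reflected, sunlight):
    """Per-band reflectance for a type II node, using the sunlight
    reference shared by the single type I node."""
    return {b: reflected[b] / sunlight[b] for b in BANDS}

# Hypothetical raw intensity readings from the two node types:
sunlight = {"550nm": 820.0, "650nm": 900.0, "766nm": 760.0, "850nm": 700.0}
node2 = {"550nm": 98.0, "650nm": 54.0, "766nm": 395.0, "850nm": 385.0}

rho = reflectance(node2, sunlight)
canopy_ndvi = (rho["850nm"] - rho["650nm"]) / (rho["850nm"] + rho["650nm"])
print(round(canopy_ndvi, 3))  # 0.803
```

Sharing one sunlight reference across the network removes four upward-facing channels from every other node, which is exactly where the cost saving described above comes from.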
The new system increased the number of optical channels and enabled measurement of crop spectral characteristics on the go and in real time after being installed on an on-board mechanical structure (Zhong et al. 2014). In a field test in Shaanxi Province, China, the distribution of the chlorophyll content of wheat detected by the new system is shown in Fig. 8.10a (Sun et al. 2015). In this way, the system provides automatic mapping of comprehensive growth status in the field. Combined with a fertilizer decision strategy such as a yield prediction method, a fertilizer recommendation map can also be produced, as shown in Fig. 8.10b.
8.5.3.2 Variable-Rate Fertilizer Control Based on Crop Sensing
Variable-rate fertilization technology improves the operational efficiency and utilization rate of fertilizer and accelerates the sustainable development of modern agriculture, promoting high-yield, high-quality production while ensuring environmental protection. The crop sensors discussed in this chapter show great potential for controlling the fertilizer rate in the field, and many variable-rate fertilizer applicators and sprayers have been developed based on them.
Commercial products, such as GreenSeeker (Trimble Navigation Limited, Sunnyvale, CA, USA), Crop Circle (Holland Scientific Inc., Lincoln, NE, USA), and the Yara N-Sensor (Yara International ASA, Norway), promote solutions for variable-rate fertilization. However, the crop estimation and fertilizer decision models are fixed in such systems, which might limit applications with specific requirements, such as particular crops or regions.
To provide a more flexible system for precision fertilization, a multi-fusion sensor developed by China Agricultural University, combining spectroscopy and imagery, was applied on a fertilizer sprayer (Sun et al. 2018). The sensor was designed to measure spectral reflectance in the red and NIR ranges at 610, 680, 730, 760, 810, and 860 nm, each with a 20 nm full width at half maximum. More than ten kinds of VIs can be calculated from these data for crop monitoring, so the sensor can provide flexible, modifiable models for different crop estimation requirements. The RGB image is captured to estimate canopy coverage and to help determine the location during field measurement. The transmission method was changed from Wi-Fi to CAN-bus, which offers long transmission distance, high speed, reliable transmission, and low cost. The sensing system is shown in Fig. 8.11a: a GPS module records the detection location, one sensor calibrates for changes in sunlight, and the crop sensors transmit data to the IPC over the CAN-bus.
Generally, as shown in Fig. 8.11b, during the fertilization process the NDVI values of the crop canopy are acquired in real time by the crop sensors and transmitted to the vehicle-mounted IPC terminal through the CAN-bus cable. A variable-rate fertilization expert decision system preset in the IPC then runs its model to generate the optimal fertilizer rate in real time.
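As a minimal sketch of such a decision step (the actual expert model is preset in the IPC and is not specified here), a linear NDVI-to-rate rule with assumed thresholds might look like:

```python
def fertilizer_rate(ndvi, n_max=150.0, ndvi_lo=0.3, ndvi_hi=0.8):
    """Map a canopy NDVI reading to a nitrogen rate (kg/ha): weak
    canopies (low NDVI) receive the full rate, dense canopies none,
    with linear interpolation in between. The thresholds and the
    linear rule are illustrative assumptions, not the preset model."""
    if ndvi <= ndvi_lo:
        return n_max
    if ndvi >= ndvi_hi:
        return 0.0
    return n_max * (ndvi_hi - ndvi) / (ndvi_hi - ndvi_lo)

print(round(fertilizer_rate(0.55), 1))  # 75.0
```

On the sprayer, the output rate would drive the metering actuator for the section of field currently under the sensor, closing the sense-decide-apply loop in real time.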
In this chapter, sensing principles and applied sensors based on spectroscopy and imagery have been reviewed. Some newly developed sensors were introduced and demonstrated to show the frontier research in this area. Numerous researchers in the cited literature have documented the practical application of these sensors in many scenarios, including handheld detection, vehicle-mounted diagnosis, and remote sensing by UAVs or satellites, to build reliable prediction models of complex and uncertain phenomena in agriculture. With the integration of variable-rate technology, more and more precision management measures can be taken based on crop sensing methods. Recently, more and more new sensors and machine learning methods have been applied in crop monitoring, showing new trends for crop sensing. Smart crop sensors with artificial intelligence (AI) processors or embedded deep learning models should emerge soon to improve sensing accuracy and broaden applications in the future.
Notes
1. Disclaimer: Commercial products are referred to solely for the purpose of clarification and should not be construed as being endorsed by the authors or the institution with which the authors are affiliated.
References
Ali H, Lali MI, Nawaz MZ, Sharif M, Saleem BA (2017) Symptom based automated detection of citrus diseases using color histogram and textural descriptors. Comput Electron Agric 138:92–104
Annamalai P, Lee WS (2004) Identification of green citrus fruits using spectral characteristics, ASAE paper no. FL04-1001. ASAE, St. Joseph, Mich
Asad MH, Bais A (2019) Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Inform Process Agric. https://doi.org/10.1016/j.inpa.2019.12.002
Barbedo JG (2016) A review on the main challenges in automatic plant disease identification based on visible range images. Biosyst Eng 144:52–60
Bauriegel E, Brabandt H, Garber U, Herppicha WB (2014) Chlorophyll fluorescence imaging to facilitate breeding of Bremia lactucae-resistant lettuce cultivars. Comput Electron Agric 105:74–82
Bürling K, Hunsche M, Noga G (2012) Presymptomatic detection of powdery mildew infection in winter wheat cultivars by laser-induced fluorescence. Appl Spectrosc 66(12):1411–1419
Calderón R, Navas-Cortés JA, Lucena C, Zarco-Tejada PJ (2013) High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens Environ 139:231–245
Cao Q, Cui Z, Chen X, Khosla R, Dao TH, Miao Y (2012) Quantifying spatial variability of indigenous nitrogen supply for precision nitrogen management in small scale farming. Precis Agric 13:45–61
Cao Q, Miao Y, Huang S, Wang H, Khosla R, Jiang R (2013) Estimating rice nitrogen status with the crop circle multispectral active canopy sensor. In: Stafford JV (ed) Precision agriculture ‘13. Wageningen Academic Publishers, Wageningen, pp 95–101
Cao Q, Miao Y, Li F, Gao X, Liu B, Lu D, Chen X (2017) Developing a new crop circle active canopy sensor-based precision nitrogen management strategy for winter wheat in North China plain. Precis Agric 18(1):2–18
Cayrol P, Chehbouni A, Kergoat L, Dedieu G, Mordelet P, Nouvellon Y (2000) Grassland modeling and monitoring with SPOT-4 VEGETATION instrument during the 1997-1999 SALSA experiment. Agric For Meteorol 105(1–3):91–115
Cen H, Wan L, Zhu J, Li Y, Li X, Zhu Y, Weng H, Wu W, Yin W, Xu C, Bao Y, Feng L, Shou J, He Y (2019) Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods 15(1). https://doi.org/10.1186/s13007-019-0418-8
Chen D, Huang J, Jackson TJ (2005) Vegetation water content estimation for corn and soybeans using spectral indices derived from MODIS near- and short-wave infrared bands. Remote Sens Environ 98(2):225–236
Chen PF, Haboudance D, Tremblay N, Wang JH, Vigneault P, Li BG (2010) New spectral indicator assessing the efficiency of crop nitrogen treatment in corn and wheat. Remote Sens Environ 114(9):1987–1997
Cheng T, Rivard B, Sanchez-Azofeifa A (2011) Spectroscopic determination of leaf water content using continuous wavelet analysis. Remote Sens Environ 115(2):659–670
Cheng M, Zhang J, Li M, Liu H, Sun H, Zheng T (2017) Chlorophyll content diagnosis model of winter wheat at heading stage applied in miniature spectrometer. Trans Chinese Soc Agric Eng 33(z1):157–163
Ciganda V, Gitelson A, Schepers J (2009) Non-destructive determination of maize leaf and canopy chlorophyll content. J Plant Physiol 166(2):157–167
Cui D, Zhang Q, Li M, Zhao Y, Hartman GL (2009) Detection of soybean rust using a multispectral image sensor. Sens & Instrumen Food Qual 3(1):49–56
Dejonge KC, Mefford BS, Chavez JL (2016) Assessing corn water stress using spectral reflectance. J Remote Sens 37(10):2294–2312
Ferreira AD, Freitas DM, da Silva GG, Pistori H, Folhes MT (2019) Unsupervised deep learning and semi-automatic data labeling in weed discrimination. Comput Electron Agric 165:104963
Gan H, Lee WS, Alchanatis V, Ehsani R, Schueller JK (2018) Immature green citrus fruit detection using color and thermal images. Comput Electron Agric 152:117–125
Garcia JA, Pope C, Altimiras F (2017) A distributed K-means segmentation algorithm applied to Lobesia botrana recognition. Complexity. https://doi.org/10.1155/2017/5137317
Gholizadeh A, Saberioon M, Borůvka L, Wayayok A, Soom MAM (2017) Leaf chlorophyll and nitrogen dynamics and their relationship to lowland rice yield for site-specific paddy management. Inform Process Agric 4(4):259–268
Goel PK, Prasher SO, Landry JA, Patel RM, Bonnell RB, Viau AA, Miller JR (2003) Potential of airborne hyperspectral remote sensing to detect nitrogen deficiency and weed infestation in corn. Comput Electron Agric 38(2):99–124
Gomes JFS, Leta FR (2012) Applications of computer vision techniques in the agriculture and food industry: a review. Eur Food Res Technol 235(6):989–1000
Gongal A, Amatya S, Karkee M, Zhang Q, Lewis K (2015) Sensors and systems for fruit detection and localization: a review. Comput Electron Agric 116:8–19
Han Z, Zhu X, Fang X, Wang Z, Wang L, Zhao G, Jiang Y (2016) Hyperspectral estimation of apple tree canopy LAI based on SVM and RF regression. Spectrosc Spectr Anal 36(3):800–805
Han L, Zhang Y, Qin Q (2019) Endmember extraction of farmland hyperspectral image using deep learning autoencoder and shuffled frog leaping algorithm. Trans Chinese Soc Agric Eng 35(6):167–173
Huang Z, Turner BJ, Dury SJ, Wallis IR, Foley WJ (2004) Estimating foliage nitrogen concentration from HYMAP data using continuum removal analysis. Remote Sens Environ 93(1):18–29
Huang J, Liao H, Zhu Y, Sun J, Sun Q, Liu X (2012) Hyperspectral detection of rice damaged by rice leaf folder (Cnaphalocrocis medinalis). Comput Electron Agric 82:100–107
Huang S, Sun C, Qi L, Ma X, Wang W (2017) Rice panicle blast identification method based on deep convolution neural network. Trans Chinese Soc Agric Eng 33(20):169–176
Jawad HM, Nordin R, Gharghan SK, Jawad AM, Ismail M (2017) Energy-efficient wireless sensor networks for precision agriculture: a review. Sensors 17(8):1781
Jiang H, Wang P, Zhang Z, Mao W, Zhao B, Qi P (2018) Fast identification of field weeds based on deep convolutional network and binary hash code. Trans Chinese Soc Agric Machin 49(11):30–38
Kamilaris A, Prenafeta-Boldú FX (2018) Deep learning in agriculture: a survey. Comput Electron Agric 147(1):70–90
Kang H, Chen C (2020) Fruit detection, segmentation and 3D visualisation of environments in apple orchards. Comput Electron Agric 171:105302
Kebapci H, Yanikoglu B, Unal G (2011) Plant image retrieval using color, shape and texture features. Comput J 54(9):1475–1490
Knoll FJ, Holtorf T, Hussmann S (2016a) Investigation of different plant root exit point vector search algorithms in organic farming. IEEE Trans Instrum Meas 65(5):1035–1041
Knoll FJ, Holtorf T, Hussmann S (2016b) Vegetation index determination method based on color room processing for weed control applications in organic farming. In: 2016 IEEE international instrumentation and measurement technology conference proceedings. https://doi.org/10.1109/I2MTC.2016.7520508
Knoll FJ, Holtorf T, Hussmann S (2016c) Investigation of different sensor systems to classify plant and weed in organic farming applications. SAI Comput Conference. https://doi.org/10.1109/SAI.2016.7556004
Knoll FJ, Czymmek V, Poczihoski S, Holtorf T, Hussmann S (2018) Improving efficiency of organic farming by using a deep learning classification approach. Comput Electron Agric 153:347–356
Knoll FJ, Czymmek V, Harders LO, Hussmann S (2019) Real-time classification of weeds in organic carrot production using deep learning algorithms. Comput Electron Agric 167:105097
Kuckenberg J, Tartachnyk I, Noga G (2009) Detection and differentiation of nitrogen-deficiency, powdery mildew and leaf rust at wheat leaf and canopy level by laser-induced chlorophyll fluorescence. Biosyst Eng 103(2):121–128
Latif A, Rasheed A, Sajid U, Ahmed J, Ali N, Ratyal NI, Zafar B, Dar SH, Sajid M, Khalil T (2019) Content-based image retrieval and feature extraction: a comprehensive review. Math Probl Eng 2019. https://doi.org/10.1155/2019/9658350
Li H, Lee WS, Wang K (2014a) Identifying blueberry fruit of different growth stages using natural outdoor color images. Comput Electron Agric 106:91–101
Li L, Zhang Q, Huang D (2014b) A review of imaging techniques for plant phenotyping. Sensors 14(11):20078–20111
Liu B, Zhang Y, He D, Li Y (2017) Identification of apple leaf diseases based on deep convolutional neural networks. Symmetry 10(1). https://doi.org/10.3390/sym10010011
Liu H, Li M, Zhang J, Gao D, Sun H, Yang L (2018a) Estimation of chlorophyll content in maize canopy using wavelet denoising and SVR method. Int J Agric Biol Eng 11(6):132–137
Liu N, Wu L, Chen L, Sun H, Dong Q, Wu J (2018b) Spectral characteristics analysis and water content detection of potato plants leaves. IFAC-PapersOnLine 51(17):541–546
Liu N, Liu G, Sun H (2020) Real-time detection on SPAD value of potato plant using an in-field spectral imaging sensor system. Sensors 20(12):3430. https://doi.org/10.3390/s20123430
Long Y (2020) Development of crop condition detection system based on spectrum fusion. Master’s thesis, China Agricultural University, Beijing, China
Lowe DG (2004) Distinctive image features from scale-invariant keypoints. Int J Comput Vis 60(2):91–110
Lu N, Zhou J, Han Z, Li D, Cao Q, Yao X, Tian Y, Zhu Y, Cao W (2019) Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 15(1):17. https://doi.org/10.1186/s13007-019-0402-3
Luo J, Huang W, Zhao J, Zhang J, Zhao C, Ma R (2013) Detecting aphid density of winter wheat leaf using hyperspectral measurements. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 6(2):690–698
Ma H, Huang W, Jing Y, Yang C, Han L, Dong Y, Ye H, Shi Y, Zheng Q, Liu L, Ruan C (2019) Integrating growth and environmental parameters to discriminate powdery mildew and aphid of winter wheat using bi-temporal Landsat-8 imagery. Remote Sens 11(7):846
Maimaitijiang M, Sagan V, Sidike P, Hartling S, Esposito F, Fritschi FB (2020) Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens Environ 237(2):111599
Mavridou E, Vrochidou E, Papakostas GA, Pachidis T, Kaburlasos VG (2019) Machine vision systems in precision agriculture for crop farming. J Imag 5(12):89
Mewes T, Franke J, Menz G (2011) Spectral requirements on airborne hyperspectral remote sensing data for wheat disease detection. Precis Agric 12:795–812
Mishra P, Asaari MSM, Herrero-Langreo A, Lohumi S, Diezma B, Scheunders P (2017) Close range hyperspectral imaging of plants: a review. Biosyst Eng 164:49–67
Moshou D, Pantazi X, Kateris D, Gravalos I (2014) Water stress detection based on optical multisensor fusion with a least squares support vector machine classifier. Biosyst Eng 117:15–22
Munoz-Huerta RF, Guevara-Gonzalez RG, Contreras-Medina LM, Torres-Pacheco I, Prado-Olivarez J, Ocampo-Velazquez RV (2013) A review of methods for sensing the nitrogen status in plants: advantages, disadvantages and recent advances. Sensors 13(8):10823–10843
Nagasubramanian K, Jones S, Singh AK, Singh A, Ganapathysubramanian B, Sarkar S (2017) Explaining hyperspectral imaging based plant disease identification: 3D CNN and saliency maps. 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA
Naidu RA, Perry EM, Pierce FJ, Mekuria T (2009) The potential of spectral reflectance technique for the detection of grapevine leafroll-associated virus-3 in two red-berried wine grape cultivars. Comput Electron Agric 66(1):38–45
Narvaez FY, Reina G, Torres-Torriti M, Kantor G, Cheein FA (2017) A survey of ranging and imaging techniques for precision agriculture phenotyping. IEEE-ASME Trans Mechatron 22(6):2428–2439
Neinavaz E, Darvishzadeh R, Skidmore AK, Groen TA (2016) Measuring the response of canopy emissivity spectra to leaf area index variation using thermal hyperspectral data. Int J Appl Earth Obs Geoinform 53:40–47
Netto AT, Campostrini E, De Oliveira JG, Bressan-Smith RE (2005) Photosynthetic pigments, nitrogen, chlorophyll a fluorescence and SPAD-502 readings in coffee leaves. Sci Hortic 104(2):199–209
Okamoto H, Lee WS (2009) Green citrus detection using hyperspectral imaging. Comput Electron Agric 66(2):201–208
Oppenheim D, Edan Y, Shani G (2017) Detecting tomato flowers in greenhouses using computer vision. World Acad Sci Eng Technol/Int J Comput Electric Autom Contr Inform Eng 11(1):104–109
Pajares G, Garcia-Santillan I, Campos Y, Montalvo M, Guerrero JM, Emmi L, Romeo J, Guijarro M, Gonzalez-de-Santos P (2016) Machine-vision systems selection for agricultural vehicles: a guide. J Imag 2(4):34
Pallottino F, Antonucci F, Costa C, Bisaglia C, Figorilli S, Menesatti P (2019) Optoelectronic proximal sensing vehicle-mounted technologies in precision agriculture: a review. Comput Electron Agric 162:859–873
Palmer KF, Williams D (1974) Optical properties of water in the near infrared. J Opt Soc Am 64(8):1107–1110
Paoletti ME, Haut JM, Plaza J, Plaza A (2019) Deep learning classifiers for hyperspectral imaging: a review. ISPRS J Photogramm Remote Sens 158:279–317
Patricio DI, Rieder R (2018) Computer vision and artificial intelligence in precision agriculture for grain crops: a systematic review. Comput Electron Agric 153:69–81
Peng M, Xia J, Peng H (2019) Efficient recognition of cotton and weed in field based on faster R-CNN by integrating FPN. Trans Chinese Soc Agric Eng 35(20):202–209
Potgieter A, George-Jaeggli B, Chapman SC, Laws K, Suárez Cadavid LA, Wixted J, Watson J, Eldridge M, Jordan DR, Hammer GL (2017) Multi-spectral imaging from an unmanned aerial vehicle enables the assessment of seasonal leaf area dynamics of sorghum breeding lines. Front Plant Sci 8:1532
Priyankara HA, Withanage DK (2015) Computer assisted plant identification system for android. Moratuwa Engineering Research Conference. https://doi.org/10.1109/MERCon.2015.7112336
Qiao L, Zhang Z, Chen L, Sun H, Li M, Li L, Ma J (2019) Detection of chlorophyll content in maize canopy from UAV imagery. IFAC-PapersOnLine 52(30):330–335
Ray SS, Das G, Singh JP, Panigrahy S (2006) Evaluation of hyperspectral indices for LAI estimation and discrimination of potato crop under different irrigation treatments. Int J Remote Sens 27(24):5373–5387
Rossini M, Cogliati S, Meroni M, Migliavacca M, Galvagno M, Busetto L, Cremonese E, Julitta T, Siniscalco C, Morra di Cella U, Colombo R (2012) Remote sensing-based estimation of gross primary production in a subalpine grassland. Biogeosciences 9(7):2565–2584
Sa I, Ge ZY, Dayoub F, Upcroft B, Perez T, McCool C (2016) DeepFruits: a fruit detection system using deep neural networks. Sensors 16(8):1222
Salas EAL, Subburayalu SK, Slater B, Zhao K, Bhattacharya B, Tripathy R, Das A, Nigam R, Dave R, Parekh P (2020) Mapping crop types in fragmented arable landscapes using AVIRIS-NG imagery and limited field data. Int J Image Data Fusion 11(1):33–56
Sankaran S, Mishra A, Ehsani R, Davis C (2010) A review of advanced techniques for detecting plant diseases. Comput Electron Agric 72(1):1–13
Schepers JS, Francis DD, Vigil M, Below FE (1992) Comparison of corn leaf nitrogen concentration and chlorophyll meter readings. Commun Soil Sci Plant Anal 23(17–21):2173–2187
Schlemmer M, Gitelson A, Schepers J, Ferguson R, Peng Y, Shanahan J, Rundquist D (2013) Remote estimation of nitrogen and chlorophyll contents in maize at leaf and canopy levels. Int J Appl Earth Obs Geoinf 25:47–54
Serrano JM, Shahidian S, da Silva JRM (2016) Monitoring pasture variability: optical OptRx® crop sensor versus Grassmaster II capacitance probe. Environ Monit Assess 188(2):117
Sharabian VR, Noguchi N, Hanya I, Ishii K (2013) Evaluation of an active remote sensor for monitoring winter wheat growth status. Eng Agric Environ Food 6(3):118–127
Shen R, Huang A, Li B, Guo J (2019) Construction of a drought monitoring model using deep learning based on multi-source remote sensing data. Int J Appl Earth Obs Geoinf 79(7):48–57
Shirzadifar A, Bajwa S, Mireei SA, Howatt K, Nowatzki J (2018) Weed species discrimination based on SIMCA analysis of plant canopy spectral data. Biosyst Eng 171:143–154
Singh M, Kumar R, Sharma A, Singh B, Thind SK (2015) Calibration and algorithm development for estimation of nitrogen in wheat crop using tractor mounted N-sensor. Sci World J:163968. https://doi.org/10.1155/2015/163968
Singh A, Ganapathysubramanian B, Singh AK, Sarkar S (2016) Machine learning for high-throughput stress phenotyping in plants. Trends Plant Sci 21(2):110–124
Singh AK, Ganapathysubramanian B, Sarkar S, Singh A (2018) Deep learning for plant stress phenotyping: trends and future perspectives. Trends Plant Sci 23(10):883–898
Sonobe R, Sano T, Horie H (2018) Using spectral reflectance to estimate leaf chlorophyll content of tea with shading treatments. Biosyst Eng 175:168–182
Stoll M, Schultz HR, Baecker G, Berkelmann-Loehnertz B (2008) Early pathogen detection under different water status and the assessment of spray application in vineyards through the use of thermal imagery. Precis Agric 9(6):407–417
Sui R, Thomasson JA, Hanks J, Wooten J (2008) Ground-based sensing system for weed mapping in cotton. Comput Electron Agric 60(1):31–38
Sun H, Wu Q, Li M, Zhao R, Zheng L (2013) Development of crop monitoring system using 2-channel CCD image sensor. 2013 ASABE annual international meeting, Kansas City, Missouri, July 21, 2013, paper number: 131620020
Sun H, Zhang M, Pei X, Yang W, Wen Y, Zhao Y, Li M (2015) Development of a spectral measurement system for crop detection. 2015 ASABE annual international meeting, New Orleans, Louisiana, July 26-29, 2015, paper number: 152188957
Sun Z, Sun H, Liu H, Zhang J, Che L, Li M, Zheng L, Wang X (2018) Performance test and parameter optimization of variable spraying liquid fertilizer machine. IFAC-PapersOnLine 51(17):118–123
Sun H, Liu N, Xing Z, Zhang Z, Li M, Wu J (2019a) Parameter optimization of potato spectral response characteristics and growth stage identification. Spectrosc Spectr Anal 39(6):1870–1877
Sun H, Xing Z, Zhang Z, Ma X, Long Y, Liu N, Li M (2019b) Visualization analysis of crop spectral index based on RGB-NIR image matching. Spectrosc Spectr Anal 39(11):3493–3500
Tartachnyk II, Rademacher I, Kuhbauch W (2006) Distinguishing nitrogen deficiency and fungal infection of winter wheat by laser-induced fluorescence. Precis Agric 7(4):281–293
Taskos DG, Koundouras S, Stamatiadis S, Zioziou E, Nikolaou N, Karakioulakis K, Theodorou N (2015) Using active canopy sensors and chlorophyll meters to estimate grapevine nitrogen status and productivity. Precis Agric 16(1):77–98
Tilly N, Aasen H, Bareth G (2015) Fusion of plant height and vegetation indices for the estimation of barley biomass. Remote Sens 7(9):11449–11480
Toth C, Jóźkow G (2016) Remote sensing platforms and sensors: a survey. ISPRS J Photogramm Remote Sens 115:22–36
Tremblay N, Wang Z, Ma B, Belec C, Vigneault P (2009) A comparison of crop data measured by two commercial sensors for variable-rate nitrogen application. Precis Agric 10(2):145–161
Uddling J, Gelang-Alfredsson J, Piikki K, Pleijel H (2007) Evaluating the relationship between leaf chlorophyll concentration and SPAD-502 chlorophyll meter readings. Photosynth Res 91(1):37–46
Ulissi V, Antonucci F, Benincasa P, Farneselli M, Tosti G, Guiducci M, Tei F, Costa C, Pallottino F, Pari L, Menesatti P (2011) Nitrogen concentration estimation in tomato leaves by VIS-NIR non-destructive spectroscopy. Sensors 11(6):6411–6424
Vrindts E, De Baerdemaeker J, Ramon H (2002) Weed detection using canopy reflection. Precis Agric 3(1):63–80
Waldchen J, Mader P (2018) Plant species identification using computer vision techniques: a systematic literature review. Arch Comput Methods Eng 25(2):507–543
Wang N, Zhang N, Dowell FE, Sun Y, Peterson DE (2001) Design of an optical weed sensor using plant spectral characteristics. Trans ASAE 44(2):409–419
Wang C, Zhao Q, Ma Y, Ren Y (2019) Crop identification of drone remote sensing based on convolutional neural network. Trans Chinese Soc Agric Machin 50(11):161–168
Wu Q, Sun H, Li M, Song Y, Zhang Y (2015) Research on maize multispectral image accurate segmentation and chlorophyll index estimation. Spectrosc Spectr Anal 35(1):178–183
Xiong D, Chen J, Yu T, Gao W, Ling X, Li Y, Peng S, Huang J (2015) SPAD-based leaf nitrogen estimation is impacted by environmental factors and crop leaf characteristics. Sci Rep 5(1):13389
Xu H, Ying Y, Fu X, Zhu S (2007) Near-infrared spectroscopy in detecting leaf miner damage on tomato leaf. Biosyst Eng 96(4):447–454
Yang C, Westbrook JK, Suh C, Martin DE, Hoffmann WC, Lan Y, Fritz BK, Goolsby JA (2014) An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing. Remote Sens 6(6):5257–5278
Yang G, Liu J, Zhao C, Li Z, Huang Y, Yu H, Xu B, Yang X, Zhu D, Zhang X, Zhang R, Feng H, Zhao X, Li Z, Li H, Yang H (2017) Unmanned aerial vehicle remote sensing for field-based crop phenotyping: current status and perspectives. Front Plant Sci 8:1111. https://doi.org/10.3389/fpls.2017.01111
Yue X, Ling K, Hong T, Gan M, Liu Y, Wang L (2018) Distribution model of chlorophyll content for Longan leaves based on hyperspectral imaging technology. Trans Chinese Soc Agric Machin 49(8):18–25
Zhang J, Pu R, Wang J, Huang W, Yuan L, Luo J (2012) Detecting powdery mildew of winter wheat using leaf level hyperspectral measurements. Comput Electron Agric 85:13–23
Zhang J, Yuan L, Pu R, Loraamm RW, Yang G, Wang J (2014) Comparison between wavelet spectral features and conventional spectral features in detecting yellow rust for winter wheat. Comput Electron Agric 100:79–87
Zhang Y, Zheng L, Li M, Deng X, Ji R (2015) Predicting apple sugar content based on spectral characteristics of apple tree leaf in different phenological phases. Comput Electron Agric 112:20–27
Zhang J, Li M, Sun Z, Liu H, Sun H, Yang W (2018) Chlorophyll content detection of field maize using RGB-NIR camera. IFAC-PapersOnLine 51(17):700–705
Zhang J, Huang Y, Pu R, Gonzalez-Moreno P, Yuan L, Wu K, Huang W (2019a) Monitoring plant diseases and pests through remote sensing technology: a review. Comput Electron Agric 165:104943
Zhang J, Liu X, Liang Y, Cao Q, Tian Y, Zhu Y, Cao W, Liu X (2019b) Using a portable active sensor to monitor growth parameters and predict grain yield of winter wheat. Sensors 19(5):1108
Zhao C, Jiang A, Huang W, Liu K, Liu L, Wang J (2007) Evaluation of variable-rate nitrogen recommendation of winter wheat based on SPAD chlorophyll meter measurement. N Z J Agric Res 50(5):735–741
Zhao Y, Chen X, Cui Z, Lobell DB (2015) Using satellite remote sensing to understand maize yield gaps in the North China plain. Field Crop Res 183:31–42
Zheng T, Liu N, Wu L, Li M, Sun H, Zhang Q, Wu J (2018) Estimation of chlorophyll content in potato leaves based on spectral red edge position. IFAC-PapersOnLine 51(17):602–606
Zhong Z, Sun H, Li M, Zhang F, Li X (2014) Development of a vehicle-mounted crop detection system. J Integr Agric 13(6):1284–1292
Zhong L, Hu L, Zhou H (2019) Deep learning based multi-temporal crop classification. Remote Sens Environ 221:430–443
Zhou L, Mu H, Ma H, Chen G (2019) Remote sensing estimation on yield of winter wheat in North China based on convolutional neural network. Trans Chinese Soc Agric Eng 35(15):119–128
Zhu Y, Cen H, El-manawy AI, Weng H, He Y (2019) A feature extraction method based on deep learning using hyperspectral imaging for the evaluation of oilseed rape canopy nitrogen content grades. 2019 ASABE annual international meeting, Boston, Massachusetts, July 7–10, 2019, paper number: 1900541
Zong Z, Zhao S, Liu G (2019) Coronal identification and centroid location of maize seedling stage. Trans Chinese Soc Agric Machin 50(S1):27–33
Copyright information
© 2022 Springer Nature Switzerland AG
Sun, H., Li, M., Zhang, Q. (2022). Crop Sensing in Precision Agriculture. In: Li, M., Yang, C., Zhang, Q. (eds) Soil and Crop Sensing for Precision Crop Production. Agriculture Automation and Control. Springer, Cham. https://doi.org/10.1007/978-3-030-70432-2_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-70431-5
Online ISBN: 978-3-030-70432-2