
8.1 Introduction

Precision agriculture (PA), or precision farming, contributes to improving agronomic performance, saving resources, and protecting the environment. It is established as a management strategy that employs detailed, site-specific information to manage production inputs precisely according to in-field variability, replacing uniform average applications. Besides the 3S technologies, namely the Global Navigation Satellite System (GNSS), Geographic Information Systems (GIS), and remote sensing (RS), many other technologies such as proximal imaging, spectroscopy, and wireless sensor networks (WSN) are applied in PA (Jawad et al. 2017). They enable more efficient crop management, including real-time detection of crop growth, targeted analysis and decision-making for precision operations, and reduction of manual labor through optimized tools.

In general, precision crop management involves three main steps: soil and crop sensing, decision-making, and variable-rate application. One of the critical issues in precision agriculture is how to measure crop growth data noninvasively and efficiently. In the past decades, significant progress on optical instruments has been made in crop monitoring (Pallottino et al. 2019). A number of crop sensors and instruments have been developed and used to meet the requirements of PA and solve detection problems in the field, as shown in Fig. 8.1. They include RGB (red, green, blue) cameras, multispectral image sensors, hyper-spectrometers, unmanned aerial vehicles (UAVs), satellite remote sensors, thermographic imagers, and light detection and ranging (LiDAR). Most of them combine spectroscopy, optical principles, and photodetectors to measure the reflected electromagnetic energy. Typically, the measured energy intensity is represented as line plots, two-dimensional (2D) or three-dimensional (3D) images, or even data cubes.

Fig. 8.1 Crop sensing instruments in precision agriculture (Disclaimer: Commercial products are referred to solely for the purpose of clarification and should not be construed as being endorsed by the authors or the institution with which the authors are affiliated)

In most cases, crop information is captured by instruments through line scanning or digital photography. The data are then analyzed using various specialized software applications for spectroscopy analysis and digital image processing. Spectroscopy uses the interaction of electromagnetic waves with an object to perform analyses related to crop nutrients or biomass. Digital image processing is a set of computational techniques for analyzing, enhancing, compressing, processing, and reconstructing crop images. Both methods are widely used in crop recognition and parameter estimation.

Researchers have developed new instruments and extended their applications to many scenarios, including handheld detection, vehicle-mounted diagnosis, and remote sensing by UAVs or satellites, to build reliable prediction models of complex and uncertain phenomena in agriculture. To explain the fundamentals and application potential of crop sensing instruments in precision agriculture, sensors and specific models based on spectral and image sensing technologies are examined below. Applications of crop sensing involve the recognition of crops and weeds, estimation of nutrient and growth status, identification of diseases and pests, and detection of specific crop fruits.

8.2 Spectroscopy-Based Sensing Instruments for Crop Monitoring

8.2.1 Foundation of Spectral Sensing and Vegetation Indices in Crop Sensing

According to the wavelength range of the electromagnetic radiation used, measured in nanometers (nm), crop sensing generally covers the ultraviolet (UV, 200–400 nm), visible (Vis, 400–760 nm), and near-infrared and shortwave infrared (NIR and SWIR, 760–2500 nm) regions (Toth and Joźkow 2016). Green plants typically display low reflectance in the visible region, especially in the red band close to 650 nm, due to strong absorbance by photosynthetic and accessory plant pigments. By contrast, the reflectance is usually high from the red edge (680–760 nm) to the NIR (780–2500 nm) region because there is very little absorbance by subcellular particles or pigments and considerable scattering at mesophyll cell wall interfaces.

Since the changes in leaf pigments and biochemical components caused by nutrient stress or biotic damage can influence the spectral characteristics of leaves, spectral analysis can be used to monitor growing crops (Narvaez et al. 2017). Zhang et al. (2019a) reviewed the spectral features used to monitor plants affected by diseases and pests. As shown in Table 8.1, the spectral features of infected or damaged plants include Vis-NIR spectral reflectance, fluorescence, and thermal features. Among these spectral features, band reflectance is the simplest form and can be transformed in different ways, such as spectral derivatives, continuum removal transformation, and continuous wavelet transformation.

Table 8.1 Spectral features for monitoring plant diseases and pests

Spectral characteristics of vegetation can be analyzed based on sensitive reflectance bands and vegetation indices (VIs). Sensitive wavelengths related to crop parameters are selected from the hyperspectra to evaluate vegetation vigor. A VI is a spectral transformation of two or more wavebands designed to enhance the contribution of vegetation properties; it allows reliable spatial and temporal intercomparisons of terrestrial photosynthetic activity and canopy structural variations. Some VIs computed from specific bands are listed in Table 8.2, in which Ri is the reflectance at i nm or in band i, such as green, red, red edge, or NIR. Among these VIs, the Normalized Difference Vegetation Index (NDVI) is the most commonly used in crop monitoring. A large body of literature indicates that quantitative spectral parameters and vegetation indices are common tools in vegetation recognition, crop classification, and biomass estimation. Therefore, several sensors have been developed to measure VIs based on spectroscopy, given their great potential for field applications.

Table 8.2 Vegetation indices measured by spectral reflectance
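As a concrete illustration of the index formulas in Table 8.2, the sketch below implements a few standard VIs that recur in this chapter (NDVI, the ratio vegetation index RVI, the Normalized Difference Red Edge index NDRE, and the red-edge chlorophyll index CIRE) from band reflectances; the reflectance values in the example are hypothetical, not measured data.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red)

def rvi(nir, red):
    """Ratio Vegetation Index: NIR / R."""
    return nir / red

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

def ci_red_edge(nir, red_edge):
    """Red-edge chlorophyll index (CIRE): NIR / RE - 1."""
    return nir / red_edge - 1.0

# Hypothetical reflectance values for a dense green canopy
r_red, r_red_edge, r_nir = 0.05, 0.30, 0.50
print(f"NDVI = {ndvi(r_nir, r_red):.3f}")   # high for vigorous vegetation
print(f"RVI  = {rvi(r_nir, r_red):.3f}")
print(f"NDRE = {ndre(r_nir, r_red_edge):.3f}")
print(f"CIRE = {ci_red_edge(r_nir, r_red_edge):.3f}")
```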

8.2.2 Spectral Sensing in Crop Monitoring

Spectral instruments with optical sensors are the fundamental tools to assess vegetation status. Three kinds of spectral instruments have been used: continuous spectrometers, vegetation index sensors, and imaging spectrometers. In general, spectrometers measure continuous spectral reflectance over a specific portion of the electromagnetic spectrum, while vegetation index sensors compute indices from two or several spectral bands. Many portable sensors for crop monitoring are commercially available. They are generally classified as passive or active according to the lighting strategy used during measurement. For example, the Soil Plant Analysis Development (SPAD) meter is a compact device with active lighting that measures transmittance at 650 and 940 nm to determine the amount of chlorophyll in plant leaves. The GreenSeeker (Trimble Agriculture, Sunnyvale, CA, USA) measures NDVI using an active lighting module. Some applications of hyperspectral sensing and vegetation index sensing in crop monitoring are listed in Table 8.3, including detection of chlorophyll, nitrogen, and sugar content, estimation of growth stages and yields, and even weed identification.

Table 8.3 Spectral sensing applications in crop monitoring

8.2.2.1 Hyper-Spectrometers for Crop Sensing

Hyper-spectrometers are the most frequently used instruments in crop sensing, especially for theoretical or mechanism analysis. Most of the instruments used in chemical detection are designed based on UV-Vis spectroscopy and work under the principle of the Beer–Lambert law. Zhang et al. (2015) used a UV-2450 spectrophotometer to measure the visible and NIR spectral reflectance of apple leaf samples within the 300–900 nm band, and such spectral information of apple tree leaves in different phenological phases could be used to predict fruit sugar content. According to the two-dimensional correlation spectroscopic analysis of apple leaf reflectance with fruit sugar content as the perturbation, the autocorrelation peaks all appeared in the 530–570 nm and 700–720 nm wavebands of the synchronization spectrogram. The contribution of different growth periods to fruit sugar content was investigated, and a support vector machine (SVM) model was established. The determination coefficient of calibration (Rc2) of the SVM model reached 0.89, and the determination coefficient of validation (Rv2) reached 0.88.

Compared with laboratory instruments, portable sensors are more flexible in the field. Devices can be selected by spectral range, resolution, usage requirements, and so on. For example, the ASD FieldSpec HH is a 512-element photodiode array spectroradiometer with a 325–1075 nm wavelength range. It uses a fixed concave holographic reflective grating that disperses the light onto a fixed photodiode array with 512 individual detection points, or "elements," in a line. Associated with each of these elements is a distinct signal whose magnitude is determined by the total integrated amount of light energy falling on that element. Each element is assigned to one of 512 positions, and in this way the analog signal is converted into a digital signal. The instrument can be set to view traceable wavelength references such as an emission source, reflectance standards, or the output of a triple monochromator. The output results are data points with known element-position and wavelength-channel coordinates.

Many of the current studies on crop monitoring involve portable spectrometers. The operation flow generally involves the control parameter setting, storage directory setting, dark noise measuring, reference calibration, sample detection, spectrum calculation, and display. The measured data are used to analyze and establish a specific model for crop monitoring purposes. Liu et al. (2018a, b) measured the spectral reflectance of maize canopy by using ASD FieldSpecHH to estimate the chlorophyll content. The data were processed following wavelet denoising and multivariate scatter correction (MSC) to reduce the noise influence. Then three spectral ranges were extracted by interval partial least squares (IPLS), including 525–549 nm, 675–749 nm, and 850–874 nm. The chlorophyll content estimation model was developed by using support vector regression (SVR). The calibration Rc2 of the model was 0.831, the RMSEC was 1.3852 mg/L, the validation Rv2 was 0.809, and the RMSEP was 0.8664 mg/L. Using the same spectrometer, Sun et al. (2019a) explored the optimizing spectral features to identify the growth stages of potato plants. In general, the canopy spectral reflectance varied with the growth stages in the bands of 400–500 nm, 530–640 nm, 740–880 nm, and 910–960 nm. The classification accuracies of SVM models were 100% in the training set and 94.59% in the testing set, respectively.
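The dark/white calibration and scatter-correction steps in this flow can be sketched in a few lines. The snippet below is a minimal illustration under stated assumptions: the counts are synthetic rather than real sensor readings, and the MSC follows its textbook definition (regressing each spectrum against the mean spectrum), not the exact implementation used in the studies above.

```python
import numpy as np

def reflectance(raw, dark, white):
    """Standard dark/white-reference calibration: R = (S - D) / (W - D)."""
    return (raw - dark) / (white - dark)

def msc(spectra, reference=None):
    """Multiplicative scatter correction.

    Each spectrum is regressed against a reference (by default the mean
    spectrum) and corrected as (x - intercept) / slope.
    """
    ref = spectra.mean(axis=0) if reference is None else reference
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        slope, intercept = np.polyfit(ref, x, 1)
        corrected[i] = (x - intercept) / slope
    return corrected

# Hypothetical canopy measurements: 10 spectra x 512 wavelength channels
rng = np.random.default_rng(0)
raw = rng.uniform(1000, 4000, size=(10, 512))
dark = np.full(512, 900.0)      # dark-noise reading
white = np.full(512, 4200.0)    # white reference panel reading
spectra = reflectance(raw, dark, white)
spectra_msc = msc(spectra)      # scatter-corrected spectra for modeling
```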

8.2.2.2 Portable Sensors Used in Crop Monitoring

According to the specific features and VIs used in crop monitoring, some sensors are designed to be portable, with only a few sensitive bands to reduce spectral redundancy. These specific sensors are generally developed based on the red and NIR bands. Besides red and NIR, a red-edge band located in the range of 700–760 nm is often included to add variables to the detection models. An instrument can be designed to measure the transmitted or reflected light from leaves and crop canopy. The light source can be natural light (sunlight) or an artificial source (lamp), defining passive and active lighting, respectively. Instruments with active lighting are more robust in field applications and perform better under limiting weather or time windows. A few portable (handheld) instruments can be used to evaluate chlorophyll or nitrogen content, leaf area index (LAI), and yield using the calculated VIs. Farmers can choose among them in precision agriculture according to the application case: leaf or canopy measurement.

8.2.2.2.1 Portable Sensors for Leaf Measurement

Portable or handheld instruments for leaf measurement have the advantages of compact size and light weight. Most of them are designed to measure transmittance with an active light source. One of the most widely used leaf chlorophyll meters is the Soil Plant Analysis Development (SPAD) chlorophyll meter, such as the SPAD-502 Plus (Konica Minolta Inc., Japan).

Uddling et al. (2007) reported that the readings from the SPAD-502 Plus could provide not only a measurement of chlorophyll content but also information for estimating nitrogen status and photosynthetic pigment content. Schepers et al. (1992) compared corn leaf disk N concentrations with SPAD-502 chlorophyll meter readings from N rate studies at the silking stage for a variety of hybrids. The data indicated that chlorophyll meter readings correlated well with leaf N concentrations for a given hybrid and location. Netto et al. (2005) established correlations between the photosynthetic pigment content extracted in dimethylsulfoxide, the total nitrogen content, and chlorophyll fluorescence variables with the SPAD-502 readings in Coffea canephora Pierre leaves. SPAD-502 readings lower than 40 indicated impairment of the photosynthetic process. In that study, total N concentration increased linearly with SPAD-502 readings. Meanwhile, the relationship between SPAD-502 values and the chlorophyll fluorescence variables (F0, Fm, and Fv/Fm) showed that the maximum quantum efficiency of photosystem II, indicated by the Fv/Fm ratio, started to fall at readings around 40.

The values measured by SPAD meters have also been used to guide fertilization. Zhao et al. (2007) studied the relationship between SPAD chlorophyll meter readings and leaf nitrogen content in order to determine the amount of nitrogen fertilization. Field experiments were conducted over three wheat growing seasons from 2003 to 2006. Grain yields and soil NO3-N contents were measured in all plots. The results indicated that fertilizer application guided by the meter values reduced the spatial variability of wheat yield and had the benefits of low soil residual NO3-N content and low NO3-N leaching potential.

Gholizadeh et al. (2017) focused on the relationship between SPAD chlorophyll meter readings and leaf N content during different growth stages. The research identified the most suitable stage for assessing crop N and predicting rice yield. The results implied that SPAD readings at the panicle formation stage related better to rice leaf N content (R2 = 0.93) and yield (R2 = 0.81). Therefore, SPAD-based evaluation of N status and prediction of rice yield are more reliable at this stage than at the booting stage.
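Such SPAD-to-N calibrations are typically simple linear regressions. The sketch below fits one on hypothetical paired readings (the numbers are illustrative, not the published data of the studies above) and reports the coefficient of determination.

```python
import numpy as np

# Hypothetical paired measurements: SPAD readings and lab-measured
# leaf N content (g/m^2) at a single growth stage
spad = np.array([28.5, 31.2, 34.8, 37.1, 39.6, 42.3, 44.0])
leaf_n = np.array([0.92, 1.05, 1.21, 1.33, 1.46, 1.58, 1.66])

# Least-squares line: leaf_n = a * spad + b
a, b = np.polyfit(spad, leaf_n, 1)
pred = a * spad + b

# Coefficient of determination (R^2)
ss_res = np.sum((leaf_n - pred) ** 2)
ss_tot = np.sum((leaf_n - leaf_n.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"leaf N = {a:.4f} * SPAD + {b:.4f}, R^2 = {r2:.3f}")
```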

Although SPAD readings have been widely used to measure chlorophyll content, Xiong et al. (2015) indicated that the relationships between chlorophyll content and leaf N content per leaf area, and between SPAD readings and leaf N content per leaf area, varied widely among species groups. A significant impact of light-dependent chloroplast movement on SPAD readings was observed under low leaf N supplementation in both rice and soybean, but not under high N supplementation. Furthermore, the allocation of leaf N to chlorophyll was strongly influenced by short-term changes in growth light. This demonstrates that the relationship between SPAD readings and leaf N content per leaf area is profoundly affected by environmental factors and by the leaf features of crop species, which should be accounted for when using a chlorophyll meter to guide N management in agricultural systems.

8.2.2.2.2 Portable Sensors for Canopy Measurement

Instruments for canopy monitoring are generally designed to measure the reflected light related to typical VIs. Portable instruments, such as GreenSeeker, Crop Circle, and N-Sensor, are commonly used to get the NDVI in the field. For on-the-go applications, these sensors can also be mounted to vehicles to remotely sense plants while driving through a field.

According to the concept of active crop canopy monitoring, an instrument emits brief bursts of red and infrared light and then measures the amount of each type of light that is reflected back from the plant. GreenSeeker sensors (Trimble Navigation Limited, Sunnyvale, CA, USA) are designed based on modulated red (650–670 nm) and NIR (755–785 nm) LEDs (light-emitting diodes). Crop Circle devices (Holland Scientific Inc., Lincoln, Nebraska, USA) are equipped with multispectral active sensors. The Crop Circle ACS-430 incorporates three optical measurement channels, so the sensor simultaneously measures crop/soil reflectance at 670, 730, and 780 nm. Moreover, the Crop Circle ACS-470 has six bands (450, 550, 650, 670, 730, and 760 nm), three of which can be used at one time to measure the radiative transfer and biophysical characteristics of plant canopies. The Yara N-Sensor (Yara International ASA, Germany) differs from the active optical sensors mentioned above: it has a xenon flashlamp providing high-intensity multispectral light, so it can measure and record crop light reflectance in a waveband between 450 and 900 nm (Munoz-Huerta et al. 2013).

Several studies have been conducted to detect crops with the portable sensors mentioned above. Cao et al. (2012) found that GreenSeeker NDVI was exponentially related to N uptake in winter wheat, whereas the correlation between N uptake and RVI was linear. Zhang et al. (2019b) aimed to expand the applicability of GreenSeeker to monitoring the growth status and predicting the grain yield of winter wheat (Triticum aestivum L.). Four field experiments with multiple wheat cultivars and N treatments were conducted during 2013–2015 to obtain NDVI and RVI synchronized with four agronomic parameters: LAI, leaf dry matter (LDM), leaf nitrogen concentration (LNC), and leaf nitrogen accumulation (LNA). Models across the growth duration indicated that NDVI and RVI explained 80%, 68–70%, 10–12%, and 67–73% of the variability in LAI, LDM, LNC, and LNA, respectively. Considering the variation among wheat cultivars, the newly normalized VIs rNDVI (NDVI relative to the NDVI for the highest N rate) and rRVI (RVI relative to the RVI for the highest N rate) were calculated to predict the relative grain yield (RY, the yield relative to the yield for the highest N rate). rNDVI and rRVI explained 77–85% of the variability in RY.

To determine which VIs calculated from the Crop Circle sensor give the best estimates of rice N status, Cao et al. (2013) compared six VIs based on the green (550 ± 20 nm), red-edge (730 ± 10 nm), and NIR (>760 nm) bands. The results indicated that using the Normalized Difference Red Edge (NDRE) index to predict plant N uptake gave the highest coefficient of determination (R2 = 0.76) and the lowest root mean square error (RMSE = 17.00 kg N/ha). The second best-performing vegetation index was the Red-Edge Chlorophyll Index (CIRE), which performed similarly to NDRE. The Crop Circle ACS-210 and ACS-430 (red at 630 nm, red edge at 730 nm, and NIR at 780 nm) were compared, and NDVI values from the individual wavebands were analyzed (Taskos et al. 2015). The results demonstrated that the ACS-430 and red-edge-based indices correlated more strongly with leaf chlorophyll in vineyards.

Regarding the Yara N-Sensor (Yara International ASA, Germany), Singh et al. (2015) investigated a tractor-mounted N-Sensor to predict nitrogen (N) content of wheat under different nitrogen levels. A strong correlation was observed between the sensor attributes (sensor value and sensor NDVI) and the different N levels. The Yara N-Sensor/FieldScan (Yara International ASA, Germany) was used to assess N status in spring wheat and corn (Zea mays L.) at specific growth stages (Tremblay et al. 2009). It was found that the Yara N-Sensor/FieldScan should be used before growth stage V5 in corn if NDVI is used to derive crop N requirements. The Yara N-Sensor/FieldScan can also record spectral information from wavebands other than red and NIR, so more VIs can be derived that might relate better to nitrogen status than NDVI.

Besides the instruments introduced above, there are similar systems such as the OptRx crop sensor (Holland Scientific, USA), the CropSpec sensor (Topcon Positioning Systems, USA), and the CCM-200 and CCM-300 (Edaphic Scientific, Australia). They are also widely used in nitrogen and chlorophyll measurements (Serrano et al. 2016; Sharabian et al. 2013). Published reports indicate that each sensor has its own sensitivity characteristics, and wavelengths around 550, 650, 766, and 850 nm are most often selected according to the application (Tremblay et al. 2009; Cao et al. 2017; Taskos et al. 2015). Meanwhile, appropriate algorithms should be developed to establish estimation models so that the modeling results can guide field management operations.

8.2.3 Development of Spectroscopy-Based Systems for Crop Detection

The current trend in crop sensing is to integrate compact sensors and detecting models. In this sense, certain studies have been conducted to develop new systems to provide support in field management.

8.2.3.1 Development of Hyperspectral Sensors for Crop Monitoring

To predict the nutrient content of winter wheat nondestructively in the field, an integrated system was developed based on an STS-VIS sensor (Cheng et al. 2017). The STS-VIS (Ocean Optics Inc., USA) is a compact sensor for portable applications. It is a grating-based device with an advanced CMOS (complementary metal-oxide-semiconductor) 1024-element detector array measuring wavelengths in the 350–850 nm range. Its USB output makes secondary development possible for online detection, typically by integrating established models into the software. As shown in Fig. 8.2, the hardware of the integrated spectrometer consists of three parts: the optical system, the data storage module, and the controller. The optical sensor with a fiber measures the reflected light from the leaf or canopy of the field crop. The controller can be connected to the sensor through USB 2.0 or a wireless network. The supporting software installed on the PC or mobile controller handles signal communication and processing. The parameters to be set include integration time, sampling frequency, and averaging number, which depend on the ambient light intensity and the sampling requirements.

Fig. 8.2 Mechanical structure of the integrated spectrometer. (Cheng et al. 2017)

A software program was also developed to collect the spectral reflectance of winter wheat canopy in the 350–820 nm range. A calibration experiment was carried out to test the performance of the sensor using a calibration board with four different gray levels. The average correlation coefficient between the sensor and an ASD FieldSpec HandHeld 2 was 0.94. Eight wavelengths, namely 514, 527, 562, 572, 605, 705, 719, and 795 nm, were selected with the random frog (RF) algorithm after spectral curve smoothing to detect chlorophyll content. The determination coefficient of the partial least squares (PLS) regression model was 0.69.

8.2.3.2 WSN-Based Sensors for Crop Monitoring

With the development of wireless sensor networks (WSNs), a novel system containing one control unit and several optical sensor nodes for crop growth detection was developed by China Agricultural University (Zhong et al. 2014). The sensors, organized in a ZigBee WSN, were designed to collect, amplify, and transmit the optical signals. A CS350 (Cilico Microelectronics Corp., Ltd., Xi'an, China) PDA (personal digital assistant) was selected as the coordinator of the wireless network to receive, display, and store all the data sent from the sensor nodes. Since wireless communication was applied, the PDA could be installed in the tractor cab or hand-held by the operator.

Each sensor node was designed with four optical channels at the wavebands of 550, 650, 766, and 850 nm. Since the detection system used sunlight as the light source, the sunlight intensity was measured as a reference in addition to the light reflected from the crop canopy, as shown in Fig. 8.3. A full-function sensor node therefore contained eight optical channels: the upward four for sunlight measurement and the downward four for reflected light measurement. A silicon photodiode converted the light signal to a current signal in each optical channel. A 4:1 time-sharing analog multiplexer chip was applied to share the amplification unit, and an OPA333 amplifier was chosen for its high precision, low quiescent current, and low power consumption. The weak signals were amplified, transformed to voltage signals, and read through the A/D converters in the microcontroller unit (MCU), a JN5139 wireless module (Jennic Co., UK). The measured data were wirelessly transmitted to the coordinator via an antenna.

Fig. 8.3 Structure of the WSN-based sensor for crop monitoring. (Zhong et al. 2014)

Once started, the sensor initialized and collected data automatically at a set sampling frequency. By setting the address of the analog switch, each channel was sampled ten times and the readings were averaged. Sensors had different identification numbers, and the sampling frequency was adjustable according to different requirements.
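The per-channel averaging and the ratio of downward (canopy-reflected) to upward (sunlight reference) readings can be sketched as follows. This is an illustrative approximation only: the channel counts are synthetic, and the plain ratio omits the per-channel calibration coefficients a real node would apply.

```python
import numpy as np

BANDS_NM = (550, 650, 766, 850)

def channel_reflectance(down_counts, up_counts):
    """Canopy reflectance per band from paired photodiode readings.

    down_counts, up_counts: arrays of shape (n_samples, 4) holding repeated
    A/D readings of the downward (canopy) and upward (sunlight reference)
    channels. Each channel is averaged over the repeats, then ratioed to
    cancel sunlight fluctuations.
    """
    down = np.asarray(down_counts).mean(axis=0)
    up = np.asarray(up_counts).mean(axis=0)
    return down / up

def ndvi_from_channels(refl):
    """NDVI from the 650 nm (red) and 850 nm (NIR) channels."""
    red = refl[BANDS_NM.index(650)]
    nir = refl[BANDS_NM.index(850)]
    return (nir - red) / (nir + red)

# Hypothetical ten repeated readings per channel, as in the node firmware
rng = np.random.default_rng(1)
up = rng.normal(3000, 20, size=(10, 4))
down = up * np.array([0.10, 0.05, 0.45, 0.50]) + rng.normal(0, 5, (10, 4))
refl = channel_reflectance(down, up)
print(f"NDVI = {ndvi_from_channels(refl):.3f}")
```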

In the field experiments, the optical sensors measured the spectral reflectance of the crop canopy with the four channels at 550, 650, 766, and 850 nm. The transmission quality of the sensor nodes was evaluated at distances of 20, 40, 60, 80, and 100 m, and the signals were transmitted without packet loss in all tests. Calibration experiments showed that the accuracy of the optical components was high enough for the application. The results of stationary field experiments showed that the detection system was capable of monitoring the spectral characteristics of the crop canopy. The correlation between chlorophyll content and NDVI was at an acceptable level, with R2 of 0.681–0.718. The system provides support for crop growth detection and a theoretical basis for further research on chlorophyll content prediction in the field.

8.2.3.3 An Integrated Sensor Based on Spectroscopy and Imagery

Furthermore, to monitor crop information more efficiently, a multi-fusion sensor was developed based on the combination of spectroscopy and imaging technology, as shown in Fig. 8.4 (Long 2020). The sensor was designed to collect the spectral reflectance and images of the crop canopy. It consists of three parts: sensors, a data processing unit, and a data transmission port. The spectral reflectance collected by an AS7263 sensor (ams AG, Premstaetten, Austria) involved six bands in the red and NIR ranges (610, 680, 730, 760, 810, and 860 nm), each with a 20-nm full width at half maximum. The RGB image was captured to estimate canopy coverage and help determine the location during field measurement. The data could be sent to a mobile phone remotely through a Wi-Fi module.

Fig. 8.4 Structure of the integrated sensor based on spectroscopy and imagery

Application experiments were performed with the sensor. The fused spectral reflectance and image data were used to analyze the growth status of field corn under different fertilizer levels. An adaptive boosting (AdaBoost) algorithm was used to model the chlorophyll content. The determination coefficient of the model was 0.859, higher than that based on spectral data alone (0.829). The fusion of spectral reflectance and image data thus improved the prediction accuracy of crop chlorophyll content and provides a new tool for crop monitoring in the field.
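A minimal sketch of this kind of fusion modeling is shown below, assuming synthetic six-band reflectances and an image-derived coverage fraction as inputs to scikit-learn's AdaBoostRegressor; it illustrates the approach, not the authors' exact pipeline or data.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
n = 120

# Hypothetical features: six-band reflectance (AS7263-style bands) plus
# an image-derived canopy coverage fraction per sampling point
spectral = rng.uniform(0.05, 0.6, size=(n, 6))
coverage = rng.uniform(0.2, 0.9, size=(n, 1))
X = np.hstack([spectral, coverage])

# Synthetic chlorophyll target loosely driven by NIR-red contrast and coverage
y = 30 + 40 * (X[:, 5] - X[:, 1]) + 10 * X[:, 6] + rng.normal(0, 1.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = AdaBoostRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on test set: {r2_score(y_te, model.predict(X_te)):.3f}")
```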

8.3 Image Sensing for Crop Detection

8.3.1 Foundation of Crop Imaging and Feature Extraction

Optical imaging is one of the noninvasive methods for crop sensing. Similar to spectrometers, optical imaging uses the special properties of light and electromagnetic waves to obtain detailed images of leaves and plants, as well as canopy and even ecosystems. In a typical way, the data are represented with energy intensity in a line plot or 2D images. Recently, image sensing has resulted in many developments in agricultural information acquisition. A variety of imaging instruments are available such as monochrome and color digital cameras (RGB), depth and time-of-flight (ToF) cameras, multispectral and hyperspectral cameras, thermography, fluorescence sensors, and others (Yang et al. 2017; Li et al. 2014a, b).

New data sources and processing methods for 2D and 3D images and spectral data cubes have significantly boosted research on crop recognition, plant positioning, and phenotype measurement. Many features can be extracted from images, as shown in Table 8.4, including color features, texture presentation, shape and spatial description, Vis-NIR spectral features, fluorescence, and thermal parameters (Mavridou et al. 2019; Ali et al. 2017).

Table 8.4 Image features in crop monitoring

In the last two decades, extensive research has been reported for image feature extraction and objective analysis. High-level image visuals are represented in the form of feature vectors that consist of numerical values. Research shows that there is a significant gap between image feature representation and human visual understanding (Latif et al. 2019). Thus, the feature selection in imaging systems is dependent on the requirements of crop monitoring; meanwhile, feature representation is another task in research.

Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large amount of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping (Waldchen and Mader 2018). ML approaches can be deployed in identification, classification, and prediction, such as SVM, neural networks (NNs), kernel methods, and instance-based approaches (Singh et al. 2016). Recently, deep learning (DL), a subset of ML approaches, has emerged as a versatile tool to assimilate large amounts of heterogeneous data and provide reliable predictions of complex and uncertain phenomena (Liu et al. 2017). These tools are increasingly being used in extracting crop features and identifying symptoms of crop growth status (Singh et al. 2018).

8.3.2 Imaging Technologies Used in Crop Detection

Imaging technologies play an important role in crop sensing. The great majority of the sensors are designed based on either solid-state technology, such as the CCD (charge-coupled device) and CMOS chips used in optical imagers, or avalanche photodiodes, such as InGaAs (indium gallium arsenide) and single-photon avalanche diodes (Toth and Joźkow 2016). Appropriate equipment should be selected to satisfy the needs of each application. In general, the most important factors to consider are the sensor resolution, frame rate, and price (Pajares et al. 2016).

Considering the diverse cameras available on the market, several image types are listed in Table 8.5. Imaging technologies used in near-ground crop detection can be divided into four types: digital color imaging to capture RGB images, 3D imaging to measure depth or spatial distribution, spectral imaging, and thermal imaging. Color cameras are simple and affordable, so RGB images are extensively used in crop sensing tasks such as recognizing weeds, measuring plants, and detecting diseases and pests in the field (Garcia et al. 2017; Yang et al. 2014; Jiang et al. 2018; Ferreira et al. 2019; Zong et al. 2019; Knoll et al. 2019).

Table 8.5 Several images used in near-ground crop detection

Although a stereovision system can measure 3D data, ToF imaging methods such as LiDAR and photonic mixer devices (PMD) are popular due to their robustness against environmental influences (Knoll et al. 2016a). A typical LiDAR sensor emits pulsed light waves into the surrounding environment. The pulses bounce off surrounding objects and return to the sensor, and the time for each pulse to return is measured. The sensor uses this time to calculate the distance between the sensor and the object. Repeating this process millions of times per second creates a precise, real-time 3D map of the environment. LiDAR sensors are used in phenotype measurements such as height and biomass (Tilly et al. 2015). Moreover, some low-cost 3D cameras are also applied in crop sensing, such as the Kinect (Microsoft, USA) and RealSense (Intel, USA). They provide flexible tools for weed identification and fruit recognition (Sa et al. 2016; Kang and Chen 2020).
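The ranging step described above reduces to a one-line computation: the pulse covers the sensor-to-target distance twice, so range is half the round-trip time multiplied by the speed of light. The sketch below converts a hypothetical round-trip time into range.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Range from a pulsed ToF/LiDAR return: distance = c * t / 2,
    because the pulse travels to the target and back."""
    return C * round_trip_time_s / 2.0

# A return received 66.7 ns after emission corresponds to ~10 m
print(f"{tof_distance(66.7e-9):.2f} m")
```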

Imaging spectrometers collect images as well as spectra from the observed crop. Nowadays, a wide range of imaging spectrometers are used on different platforms, including stationary or handheld near-ground platforms and unmanned aerial vehicle (UAV) platforms. Imaging spectral instruments have been widely used in crop detection tasks such as crop classification, disease identification, and nutrient estimation (chlorophyll, water, and nitrogen content) (Zhu et al. 2019; Yue et al. 2018; Huang et al. 2017; Liu et al. 2018a, b; Zheng et al. 2018). In addition, thermal sensors are used in drought estimation because of the close relationships among temperature, water stress, and environment (Maimaitijiang et al. 2020).

8.3.3 Development of Imaging Systems for Crop Detection

The results of previous research studies have provided basic principles for the development of optical sensing to acquire the spectral information in the field. Spectroscopy analysis and image processing are applied as rapid, convenient, and nondestructive techniques for crop growth monitoring. The Research Center for Precision Agriculture at China Agricultural University (CAU) has developed three kinds of multispectral imagery systems for crop monitoring (Wu et al. 2015; Sun et al. 2019b; Liu et al. 2020). In general, each system includes a multispectral camera device and controlling software. The multispectral camera is designed with the capability to measure multispectral images of crop canopy in three visible bands (red [R], green [G], and blue [B]) and a NIR band. The software is developed to control the camera system. Furthermore, the estimating models of crop parameters should be embedded in the system. This way, it could provide an online device and method for crop sensing.

8.3.3.1 A Two-CCD-Based Imaging System for Crop Measurement

A two-CCD-based imaging system was designed for crop measurement, which included a multispectral image acquisition device, a communication protocol converter, and a controlling platform (Wu et al. 2015). A multispectral two-channel CCD camera (JAI Ltd., Denmark) was used, which included a splitter prism with two reflecting mirrors to split the input light into visible and NIR bands. The two CCD sensors could obtain four images in three visible bands (400–700 nm; R, G, and B) and one NIR band (760–1000 nm) at the same time. The Camera Link communication protocol standard was adopted to output RGB and NIR images with 1024(h) × 768(v) active pixels per channel. Communication between the camera and the computer was handled by a QuadMVCL2GE converter (Beijing Microview Science and Technology Co., Ltd., China), which converted the Camera Link output into the GigE Vision standard with a maximum output bandwidth of 960 Mbps. A panel industrial control computer (PPC-3708, Beijing Weidatong Co., China) was used as the system platform. The main functions included a multispectral camera control module, an image acquisition module, and a multispectral image processing module. Once connected, the system worked through image acquisition, data conversion, and image display and storage. The multispectral images could be displayed and stored in RAW, BMP, and JPG formats.

An image processing model was developed with three main functions: image enhancement, image segmentation, and parameter calculation (Sun et al. 2013). The developed system was applied to chlorophyll content estimation of tomato. Multispectral images were collected, and the SPAD values of tomato leaves were measured. More than 80 pairs of RGB and NIR images were acquired in the experiment. They were first processed with a median filtering algorithm to eliminate noise and then segmented from the background. Figure 8.5a, b shows a pair of RGB and NIR images, and the segmented results are shown in Fig. 8.5c, d, respectively. The average gray values of each image were calculated to obtain the VIs of the tomato canopy. The correlation analysis indicated that the highest correlation coefficient, 0.7514, was between RVI and SPAD values.

Fig. 8.5 Multispectral images of tomato canopy (Sun et al. 2013). (a) Original RGB image (b) Original NIR image (c) Segmented RGB image (d) Segmented NIR image
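A compact sketch of this flow (median filtering, canopy segmentation, mean gray values, and RVI) is given below using OpenCV. Otsu thresholding on the NIR band stands in for the segmentation step, whose exact method is not detailed above, and the file paths are placeholders.

```python
import cv2
import numpy as np

def canopy_rvi(rgb_path: str, nir_path: str) -> float:
    """Mean-canopy RVI from a registered RGB/NIR image pair."""
    rgb = cv2.imread(rgb_path)                      # OpenCV loads as B, G, R
    nir = cv2.imread(nir_path, cv2.IMREAD_GRAYSCALE)

    # Median filtering to suppress salt-and-pepper noise
    rgb = cv2.medianBlur(rgb, 5)
    nir = cv2.medianBlur(nir, 5)

    # Vegetation is bright in NIR, so Otsu on the NIR band separates canopy
    _, mask = cv2.threshold(nir, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    canopy = mask > 0

    # Average gray values over canopy pixels, then RVI = mean(NIR)/mean(R)
    red = rgb[:, :, 2].astype(np.float64)
    mean_red = red[canopy].mean()
    mean_nir = nir.astype(np.float64)[canopy].mean()
    return mean_nir / mean_red

# Usage (placeholder file names):
# rvi_value = canopy_rvi("tomato_rgb.bmp", "tomato_nir.bmp")
```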

8.3.3.2 A Portable Binocular Sensor for Crop Monitoring

The NDVI calculated from spectral reflectance has proved to be an important parameter for estimating crop growth parameters quickly and nondestructively. Thus, measuring the NDVI distribution is an important research direction for sensor development (Sun et al. 2019b). Unlike the two-CCD-based imaging system, which acquires RGB and NIR images synchronously, some low-cost binocular vision systems can also be used to collect RGB and NIR images. The biggest challenge in using such binocular vision systems is image matching, which determines whether the NDVI distribution and dynamics of crops can be monitored with high accuracy.

To develop a portable multispectral imaging system for crop monitoring, an FM830-5M device (Shanghai Percipio Information Technology Co., Ltd., China), which has an RGB camera and two NIR cameras, was used to acquire RGB and NIR images of corn. The RGB camera and one of the NIR cameras were used to develop a binocular sensor for crop monitoring. The RGB and NIR images were processed through preprocessing, image matching, segmentation, and image reflectance correction. The flowchart is shown in Fig. 8.6.

Fig. 8.6 Development of crop sensor using binocular stereo vision systems

The acquired images were calibrated and preprocessed. Firstly, the RGB image was preprocessed: the edges and texture were enhanced by the Laplace transform, and the light saturation removal (LSR) algorithm was used to improve image quality. Secondly, a median filter was used to eliminate salt-and-pepper noise in the images.

To compare the performance of different image matching methods, images of 51 maize plants were collected synchronously by the binocular vision system at 90°, 54°, and 35°. Three algorithms, namely SURF (Speeded-Up Robust Features), SIFT (Scale-Invariant Feature Transform), and ORB (Oriented FAST and Rotated BRIEF), were applied and compared for RGB-NIR image matching. SURF proved to be the optimal matching method, as determined by matching time, PSNR (peak signal-to-noise ratio), MI (mutual information), and SSIM (structural similarity index).

The crops were segmented from the background using the ExG (Excess Green) index and the maximum interclass variance (Otsu) algorithm. The R, G, B, and NIR components of the segmented images were extracted. Then, the NDVI of each pixel was calculated, and the spatial distribution map of the crop VI was drawn. SPAD values at the pixel level were estimated, and the regression model between SPAD values and NDVI showed a determination coefficient of 0.619. The demonstration of the sensor application and results is shown in Fig. 8.7.

Fig. 8.7 Demonstration of the sensor application and results (Sun et al. 2019b). (a) Image matching (b) NDVI mapping
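The segmentation and per-pixel NDVI mapping steps can be sketched as below with OpenCV. The ExG and Otsu steps follow their standard definitions, while the matched, registered RGB/NIR inputs are assumed to be given; this is an illustration, not the exact processing chain above.

```python
import cv2
import numpy as np

def exg_segment(rgb):
    """Canopy mask via Excess Green (ExG = 2g - r - b) plus Otsu threshold."""
    f = rgb.astype(np.float64)
    total = f.sum(axis=2) + 1e-9           # avoid division by zero
    b, g, r = f[:, :, 0] / total, f[:, :, 1] / total, f[:, :, 2] / total
    exg = 2 * g - r - b
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

def ndvi_map(red, nir, mask):
    """Per-pixel NDVI over canopy pixels; NaN outside the mask."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    ndvi = (nir - red) / (nir + red + 1e-9)
    ndvi[mask == 0] = np.nan
    return ndvi

# Usage with a registered pair (placeholder arrays):
# mask = exg_segment(rgb_image)
# ndvi = ndvi_map(rgb_image[:, :, 2], nir_image, mask)
```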

8.3.3.3 A Portable Multispectral Sensor for Crop Measurement

A 25-wavelength spectral imaging sensor (model: XIMEA I-5×5-CMOS, Shanghai Branch of IMEC Microelectronics Co., Ltd., China) was used to develop a multispectral system for crop measurement, as shown in Fig. 8.8a (Liu et al. 2020). The filter of this sensor was processed on the wafer of a commercial CMOS image capture chip in a mosaic layout: a specific spectral filter was deposited on each pixel, and 25 wavelengths were placed on the two-megapixel CMV2000 sensor. The sensor was able to obtain spectral information at the following 25 wavelengths: 666, 681, 706, 720, 732, 746, 759, 772, 784, 796, 816, 827, 837, 849, 859, 869, 887, 888, 902, 910, 920, 926, 935, 940, and 945 nm. It had a field of view (FOV) of 50°. The image size at each wavelength was 409 × 217 pixels, and the grayscale resolution was 10 bits.

Fig. 8.8 Spectral sensor and control software of the detection system (Liu et al. 2020). (a) Spectral sensor (b) Control software interface

To realize real-time detection of the SPAD values of potato plants in the field, a control software program was developed on the Qt Creator 4.9.1 platform under Windows. The user interface shown in Fig. 8.8b was designed based on the Qt Widgets application, and the image processing functions were realized by calling OpenCV libraries. The main functions of the software included spectral image collection, exposure time adjustment, spectral image correction, pseudo-color expression of SPAD values, SPAD value statistics, and image saving.

The spectral sensor and control software together formed the real-time SPAD detection system. The reflectance of potato plants was extracted using segmented mask images. Partial least squares (PLS) regression was employed to establish the SPAD detection model based on sensitive variables selected with the uninformative variable elimination (UVE) algorithm. The visualized distribution map of SPAD values was then drawn using pseudo-color processing.
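As an illustration of the modeling and visualization steps, the sketch below trains a PLS regression on hypothetical reflectance data at a set of selected bands (the UVE selection itself is omitted) and renders a pseudo-color SPAD map with matplotlib; it is a sketch of the approach, not the published system.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cross_decomposition import PLSRegression

# Hypothetical training data: reflectance at eight selected wavelengths
# (n_samples x n_selected_bands) with lab SPAD references
rng = np.random.default_rng(3)
X_train = rng.uniform(0.05, 0.6, size=(60, 8))
y_train = 25 + 30 * X_train[:, 6] - 10 * X_train[:, 0] + rng.normal(0, 1, 60)

pls = PLSRegression(n_components=4).fit(X_train, y_train)

# Apply the model pixel-wise to a masked multispectral image
h, w = 217, 409                       # image size of each wavelength band
pixels = rng.uniform(0.05, 0.6, size=(h * w, 8))
spad_map = pls.predict(pixels).reshape(h, w)

# Pseudo-color visualization of the SPAD distribution
plt.imshow(spad_map, cmap="viridis")
plt.colorbar(label="Predicted SPAD value")
plt.title("Pseudo-color SPAD distribution (sketch)")
plt.show()
```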

8.4 Remote Sensing Platforms for Crop Monitoring

8.4.1 Remote Sensing Instruments Used in Crop Monitoring

Unlike the spectral sensors introduced above, remote sensing spectrometers usually operate in Earth observation, capturing images as well as spectra from the observed materials. Images in multiple wavebands make it possible to locate and extract plants from the background by image processing and to derive numerous VIs. Great efforts have been made over the past decades to produce high-quality remote sensing data by developing a wide range of imaging spectrometers on aerial and satellite platforms (Paoletti et al. 2019). Compared with near-ground platforms such as UAVs and stationary or handheld devices, which focus on specific fields or plants in small areas (Wang et al. 2019; Han et al. 2019), aerial and satellite remote sensing for Earth observation is suitable for monitoring large farmland areas and ecosystems.

These instruments can be classified as multispectral or hyperspectral according to the number of bands. A multispectral image contains from several to about a dozen bands, while a hyperspectral image (HSI) contains hundreds to thousands of contiguous wavelengths (Mishra et al. 2017). Several systems, shown in Table 8.6, are most often used in aerial and satellite remote sensing. Similar to the spectral images mentioned before, the features extracted from remote sensing data include color features, texture presentation, shape and spatial description, and Vis-NIR spectral features.

Table 8.6 Spectral imaging used in remote sensing

8.4.2 Application of Multispectral Remote Sensing

Traditional satellite sensors such as SPOT and Landsat have long been used in crop sensing. The SPOT Vegetation sensor was carried aboard SPOT 4 and SPOT 5, launched in 1998 and 2002, respectively. It had the capability of imaging the entire Earth each day with an instantaneous field of view (IFOV) of 1.15 km (https://eos.com/landsat-5-tm/). SPOT Vegetation collected data in four spectral bands: 0.43–0.47 μm, 0.61–0.68 μm, 0.78–0.89 μm, and 1.58–1.75 μm (Cayrol et al. 2000).

The Landsat Thematic Mapper (TM) was a multispectral scanning radiometer carried on board Landsat 4 and 5. The TM sensors provided nearly continuous coverage from July 1982 to June 2013. A TM scene had an IFOV of 30 m × 30 m in the visible (0.45–0.52 μm, 0.52–0.60 μm, 0.63–0.69 μm), NIR (0.76–0.90 μm), and SWIR (2.08–2.35 μm) bands, while the 10.41–12.5 μm band had an IFOV of 120 m × 120 m on the ground. The Landsat Enhanced Thematic Mapper Plus (ETM+) was introduced with Landsat 7 (https://eos.com/landsat-7/) and was built by Raytheon SBRS (Santa Barbara Remote Sensing), Goleta, CA. In addition to the visible and NIR bands of TM, ETM+ also scanned SWIR (1.57–1.75 μm, 2.09–2.35 μm), thermal infrared (10.40–12.50 μm), and panchromatic (PAN, 0.52–0.90 μm) bands.

The Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) are instruments onboard the Landsat 8 satellite (https://eos.com/landsat-8/). OLI, built by the Ball Aerospace & Technologies Corporation, measures in the visible, NIR, and SWIR portions of the spectrum. Landsat 8 thus has nine spectral bands, most at 30-m spatial resolution: visible (0.43–0.45 μm, 0.45–0.51 μm, 0.53–0.59 μm), red (0.64–0.67 μm), NIR (0.85–0.88 μm), SWIR (1.57–1.65 μm, 2.11–2.29 μm), panchromatic (PAN, 0.50–0.68 μm, at 15 m), and cirrus (1.36–1.38 μm). It also has two thermal infrared bands at 10.6–11.19 μm and 11.5–12.51 μm with 100-m spatial resolution.

Using satellite remote sensing to understand maize yield gaps in the North China Plain, with Quzhou County as an example, Zhao et al. (2015) used Landsat 5 TM, Landsat 7 ETM+, and SPOT 4 satellite data during the summer maize growing seasons from 2007 to 2013, with the exceptions of 2008 and 2011 when high-quality cloud-free images were lacking. To resolve the spatial differences between SPOT 4 and Landsat data, the Landsat images were resampled to 20-m resolution using the nearest neighbor method. The results indicated that remote sensing can provide reasonably reliable estimates of maize yields in this region. In addition, the majority of the yield gap was dominated by transient factors, and shrinking this gap may require high-quality forecasts to support informed, optimal management decisions.

Satellite remote sensing has also been used in crop classification and disease monitoring. Zhong et al. (2019) used data from Landsat 7 ETM+ and Landsat 8 OLI at 30-m resolution to classify summer crops. Two types of deep learning models were designed using Landsat Enhanced Vegetation Index (EVI) time series, examining the shapes of the EVI curves at various scales in a hierarchical manner. Three widely used classifiers were also tested for comparison: a gradient boosting machine called XGBoost, Random Forest, and SVM. Among the non-deep-learning classifiers, XGBoost achieved the best result, with 84.17% accuracy and an F1 score of 0.69. Ma et al. (2019) discriminated winter wheat powdery mildew from aphid infestation during a co-epidemic outbreak of the disease and the insect pest in northeast China, based on temporal Landsat 8 imagery integrated with crop growth and environmental parameters.
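EVI itself is a fixed band combination. The sketch below computes it for a hypothetical pixel time series using the standard coefficients (G = 2.5, C1 = 6, C2 = 7.5, L = 1); the reflectance values are illustrative only.

```python
import numpy as np

def evi(nir, red, blue):
    """Enhanced Vegetation Index with the standard coefficients:
    EVI = 2.5 * (NIR - R) / (NIR + 6*R - 7.5*B + 1)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

# Hypothetical surface reflectance for one pixel across a season
blue = np.array([0.04, 0.05, 0.04, 0.05])
red  = np.array([0.08, 0.06, 0.05, 0.09])
nir  = np.array([0.25, 0.38, 0.45, 0.30])
print(np.round(evi(nir, red, blue), 3))  # EVI time series for classification
```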

Using satellite monitoring, such systems notify users of critical changes in vegetation, send real-time weather risk alerts, and automate prioritization within field work planning. As a result, these capabilities make it possible not to miss important points in the treatment of fields and to respond in a timely manner to any changes. So far, researchers have implemented agricultural projects for monitoring fields, classifying crops, identifying growth and stress status, and forecasting crop yields (Zhou et al. 2019; Shen et al. 2019).

8.4.3 Application of Hyperspectral Remote Sensing

Advances in sensing and computer technologies have greatly improved hyperspectral image data acquisition. A number of HSI missions for Earth observation have been launched and provide new tools for satellite remote sensing, such as the NASA Hyperspectral Infrared Imager (HyspIRI), the Environmental Mapping and Analysis Program (EnMAP), and the Precursore IperSpettrale della Missione Applicativa (PRISMA) program (Paoletti et al. 2019). Meanwhile, several instruments are used to capture great volumes of HSI data by airborne remote sensing. As shown in Table 8.6, some of the best-known spectrometers are available for crop sensing.

The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), developed by the Jet Propulsion Laboratory (JPL) (Pasadena, California, USA), was a hyperspectral imaging sensor that delivered calibrated images of upwelling spectral radiance in 224 contiguous spectral bands with wavelengths from 400 to 2500 nm (http://aviris.jpl.nasa.gov/). Moreover, the Airborne Visible Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) sensor samples 430 contiguous bands between 380 nm and 2510 nm at approximately 5-nm spectral resolution.

Nagasubramanian et al. (2017) identified charcoal rot disease in soybean crops using AVIRIS hyperspectral data. In the range of 383–1032 nm, they developed a 3D convolutional neural network (CNN) model for soybean charcoal rot disease identification, achieving a classification accuracy of 95.73% and an infected-class F1 score of 0.87. Salas et al. (2020) derived a set of narrow-/broadband indices from AVIRIS-NG imagery to represent spectral variations and identify target classes and their distribution patterns. The results showed that the maximum entropy (MaxEnt) and generalized linear model (GLM) approaches had strong discriminatory image classification abilities, with area under the curve (AUC) values ranging between 0.75 and 0.93 for MaxEnt and between 0.73 and 0.92 for GLM. The Photochemical Reflectance Index (PRI) and Moment Distance Ratio Right/Left (MDRRL) were found to be important predictors for target classes such as wheat, legumes, and eggplant.

The Compact Airborne Spectrographic Imager 1500 (CASI 1500), designed by ITRES Research Ltd. (Calgary, Alberta, Canada), is a system that acquires data in 380–1050 nm and splits light into 288 discrete bands. It was used to obtain images over a field that had been set up to study the effects of various nitrogen application rates and weed control on corn (Goel et al. 2003). The results indicated that the reflectance of corn was significantly influenced (α = 0.05) at certain wavelengths by the presence of weeds, the nitrogen rates, and their interaction. Differences in response due to nitrogen stress were most evident at 498 nm and in the band at 671 nm.

In addition, the HyMap scanner, built by Integrated Spectronics Pty Ltd. of Sydney, Australia, has four spectrometers covering the interval of 450–2450 nm, excluding the two major atmospheric water absorption windows. Research was conducted on estimating foliage nitrogen concentration from HyMap data using continuum-removal analysis (Huang et al. 2004), which identified the known nitrogen absorption features. The coefficient of determination increased from 0.65 with standard derivative analysis to 0.85 with continuum-removal analysis. Mewes et al. (2011) indicated the potential to detect a wheat disease induced by pathogen infection. With the original spectral resolution of HyMap, the highest classification accuracy was obtained using 13 spectral bands, with a Kappa coefficient of 0.59.

In summary, imaging spectrometers are of increasing importance for agricultural applications, particularly for supporting crop sensing to increase the productivity of crop stands (Zhou et al. 2019; Shen et al. 2019). However, to define an optimal sensor-based system or a data product designed for crop detection, it is necessary to know which spectral wavelengths are representative and which spectral resolution is needed. Data processing methods also face challenges from different instruments and requirements. Hence, research may involve data fusion and modeling supported by machine learning and even deep learning.

8.5 Precision Crop Management Based on Sensing Instruments

Spectroscopy and imaging sensors have been widely used to support precision agriculture by providing information for crop management (Zhang et al. 2018). Combined with machine vision, machine learning algorithms, and deep learning systems, they present automated solutions for object recognition and detection in crop production (Gomes and Leta 2012; Kamilaris and Prenafeta-Boldú 2018). More and more agricultural robots have been developed based on crop sensing instruments and processing methods. They are used for specific tasks that are traditionally performed manually and that are tedious and error-prone. Recent advances in crop sensors are applied to precision management in the field, including variable-rate sprayers for fertilization and weed control and field-based crop phenotyping (Patricio and Rieder 2018).

8.5.1 Applications of Spectroscopy-Based Crop Sensors

8.5.1.1 Classification of Weeds and Damage Caused by Disease and Pests

Since the reflectance of crops, weeds, and soil differs in the visible and NIR wavelengths, there is potential to distinguish them by spectral reflectance at different wavelengths. Vrindts et al. (2002) measured the canopy reflectance of sugar beet, maize, and weeds with a line spectrograph (480–820 nm). Four wavelengths (572.7, 676.1, 801.4, and 814.6 nm) were selected to separate sugar beet and weed plants. The overall classification accuracy was over 90%, whereas the capability to classify maize and weeds was poor, with only 15% accuracy. Shirzadifar et al. (2018) selected bands around 1078, 1435, 1490, and 1615 nm to identify kochia, waterhemp, and lamb's-quarters.

In order to design an optical weed sensor, sensitive wavelengths within the visible and NIR bands (496, 546, 614, 676, and 752 nm) were selected based on the spectral differences between stems and leaves of various crops and weeds (Wang et al. 2001). A partial least-squares (PLS) calibration model was established from the combination of these wavelengths and their VIs. The designed instrument with the embedded model could identify wheat, bare soil, and weeds with classification rates of 100%, 100%, and 71.6%, respectively, for the training data set when the weed density was above 0.02 plants/cm². Sui et al. (2008) developed a ground-based weed mapping system to measure weed intensity and distribution in a cotton field; connected to a WeedSeeker sensor, it directly output the canopy coverage and intensity ratio.

The changes in leaf pigments and biochemical components caused by fungal infection or pest damage can influence the spectral characteristics of leaves, so the spectral differences between healthy and damaged leaves can be used to identify plant health status. Various VIs are used in monitoring plant disease and pests, such as NDVI, GNDVI, and OSAVI (Zhang et al. 2012, 2019a). Based on fluorescence spectra, some studies applied the ratio of fluorescence amplitudes at fluorescence peaks (e.g., F686/F740) to achieve presymptomatic detection of some pathogens (Bürling et al. 2012). Parameters associated with the saturation pulse method can be used to evaluate the changes of affected pigments, such as the maximum quantum efficiency of photosystem II (PSII) primary photochemistry (Fv/Fm), the maximum efficiency of PSII photochemistry in light-adapted material (Fv′/Fm′), and non-photochemical quenching (NPQ). Besides VIS-NIR and fluorescence spectroscopy, thermal observation provides an indicator to detect temperature changes associated with stress symptoms in the plant canopy.
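The VIs named above are simple combinations of band reflectances. As a minimal sketch, the following Python functions compute NDVI, GNDVI, and OSAVI from band-averaged canopy reflectances; the input values are hypothetical, and OSAVI uses its standard 0.16 soil-adjustment term.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def gndvi(nir, green):
    """Green NDVI, substituting the green band for red."""
    return (nir - green) / (nir + green)

def osavi(nir, red):
    """Optimized Soil-Adjusted VI with the standard 0.16 term."""
    return (nir - red) / (nir + red + 0.16)

# hypothetical canopy reflectances in the NIR, red, and green bands
nir, red, green = 0.45, 0.06, 0.10
print(ndvi(nir, red), gndvi(nir, green), osavi(nir, red))
```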

The sensitive features are important for the detection of diseases or pests. Naidu et al. (2009) discussed the spectral characteristics of grapevines infected by grapevine leafroll disease (GLD). The spectral differences between healthy and infected leaves are located around the green (near 550 nm), NIR (near 900 nm), and shortwave infrared (near 1600 and 2200 nm) bands. Classification models were built based on the sensitive wavelengths (531, 570, 752 nm, etc.) and VIs (NDVI, RVSI, PRI, etc.). Moreover, the results showed that, compared with the linear regression result of 0.72 from RVSI alone, the accuracy increased to 0.78 when RVSI was combined with the reflectance in the blue band (470–490 nm) and at 526 nm. In the same study, a classification accuracy of 0.75 was achieved by variables that combined PRI with the 765–830, 970, and 684 nm bands.

Similarly, Annamalai and Lee (2004) investigated the spectral signatures of immature green citrus fruit and leaves for the purpose of developing a spectrally based fruit identification and early yield mapping system. Diffuse reflectance of fruit and leaf samples was measured in the range of 400–2500 nm, and two important wavelengths at 815 and 1190 nm were selected. A ratio of these two wavelengths was used to distinguish immature green fruit from leaves. Other researchers studying leaf miner damage, bacterial spots, and yellow rust of crop leaves examined the sensitivity of spectral responses and characteristics and established identification models by partial least squares (PLS) regression, stepwise multiple linear regression (SMLR), support vector machines (SVM), and so on (Moshou et al. 2014). Recently, more statistical analysis and machine learning modeling methods have been applied. Deep learning, a branch of machine learning, has also been used to select features or to build end-to-end architectures for discriminant analysis.
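A two-wavelength ratio classifier of the kind used by Annamalai and Lee (2004) can be sketched in a few lines; the decision threshold below is purely illustrative, since the published value is not reproduced here.

```python
import numpy as np

def classify_fruit(r815, r1190, threshold=1.2):
    """Label pixels as fruit where the 815/1190 nm reflectance ratio
    exceeds a threshold (threshold value is hypothetical)."""
    ratio = np.asarray(r815) / np.asarray(r1190)
    return np.where(ratio > threshold, "fruit", "leaf")

print(classify_fruit([0.55, 0.40], [0.35, 0.42]))  # ['fruit' 'leaf']
```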

8.5.1.2 Monitoring of Nutrient Content and Biomass Status

Crop growth status is generally evaluated by the nutrient content and biomass level: the contents of chlorophyll, nitrogen, and water relate to the nutrient level, while biomass is generally estimated by the leaf area index (LAI), the leaf area per unit ground area. The estimation of crop growth parameters using spectroscopy helps to guide fertilizer and irrigation management and to predict yield in the field.

Chlorophyll measurement has always been a priority of considerable research because chlorophyll is the organic molecule in plant leaves responsible for photosynthesis, absorbing in the 400–700 nm spectral range, and is highly correlated with leaf nitrogen (Ulissi et al. 2011). Using the spectral features shown in Table 8.1, a large number of researchers have estimated chlorophyll content from sensitive wavelengths, VIs, the red-edge location, and other features. Ciganda et al. (2009) constructed a red-edge chlorophyll index from red-edge (720–730 nm) and NIR (770–800 nm) spectral reflectance. Chen et al. (2010) proposed a new spectral indicator, the Double-peak Canopy Nitrogen Index (DCNI), for maize nitrogen estimation. Schlemmer et al. (2013) indicated that chlorophyll content could be accurately retrieved using green and red-edge chlorophyll indices based on bands located in the NIR (780–800 nm) and either the green (540–560 nm) or red edge (730–750 nm). Rossini et al. (2012) estimated chlorophyll using a suite of VIs and found correlations of over 0.8 between leaf chlorophyll content and narrowband spectral indices. Sonobe et al. (2018) showed that shading treatment of a crop lowered the reflectance near 550 and 740 nm. Two methods, machine learning algorithms and the inversion of a radiative transfer model, were evaluated using measurements from tea leaves. Overall, the kernel-based extreme learning machine had the highest performance, with a root mean square error (RMSE) of 3.04 ± 0.52 μg cm−2 and ratios of performance to deviation (RPD) from 3.38 to 5.92 for the test set.
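The red-edge chlorophyll index of Ciganda et al. (2009) is the ratio of NIR to red-edge reflectance minus one. The sketch below computes it from band-averaged reflectance over the 720–730 nm and 770–800 nm windows; the piecewise-linear spectrum used for the demonstration is synthetic.

```python
import numpy as np

def red_edge_ci(wl, refl):
    """Red-edge chlorophyll index: mean NIR (770-800 nm) reflectance
    divided by mean red-edge (720-730 nm) reflectance, minus one."""
    wl, refl = np.asarray(wl), np.asarray(refl)
    nir = refl[(wl >= 770) & (wl <= 800)].mean()
    red_edge = refl[(wl >= 720) & (wl <= 730)].mean()
    return nir / red_edge - 1.0

# synthetic canopy spectrum with a red edge between 680 and 750 nm
wl = np.arange(400, 1001)
refl = np.interp(wl, [400, 680, 750, 1000], [0.05, 0.05, 0.45, 0.45])
print(f"CI_red-edge = {red_edge_ci(wl, refl):.2f}")
```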

The molecular absorption of hydrogen-containing groups (O-H, N-H, C-H) makes it possible to measure moisture content nondestructively (Cheng et al. 2011). Although water absorption has been explored in the infrared region with spectral centers at 970, 1200, 1440, and 1950 nm (Palmer and Williams 1974), researchers have proposed different wavelengths owing to the influences of species, phenology, environmental stress, and so on. Dejonge et al. (2016) established a diagnosis model of corn water content to guide field irrigation using the NDVI, OSAVI, and GNDVI. Among these VIs, the NDVI showed the best performance, with the highest R² and a slope almost equal to 1. The ratio of water-stressed to non-stressed NDVI was therefore set as an irrigation trigger with a threshold value of 0.93.
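The 0.93 threshold can be expressed as a simple trigger function. The sketch below assumes a reference NDVI taken from a well-watered (non-stressed) area; the cited study's exact implementation is not reproduced here.

```python
def irrigation_needed(ndvi_measured, ndvi_reference, threshold=0.93):
    """Trigger irrigation when the ratio of measured NDVI to a
    well-watered reference NDVI falls below the 0.93 threshold."""
    return ndvi_measured / ndvi_reference < threshold

print(irrigation_needed(0.62, 0.70))  # True: ratio ~ 0.886 < 0.93
print(irrigation_needed(0.68, 0.70))  # False: ratio ~ 0.971
```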

In addition, spectroscopy methods can be used to invert some biomass parameters and indirectly calculate LAI. Besides the NDVI, other spectral indices have also been presented in recent research. Ray et al. (2006) found that VIs such as NDVI and SAVI (Soil-Adjusted Vegetation Index), calculated from bands between 680 and 780 nm, produced the highest correlation coefficients with LAI. Han et al. (2016) built a model to predict the LAI of apple tree canopies by comparing SVM and random forest (RF) algorithms. Some VIs used in the RF regression model, including GNDVI, NDVI, RVI, and GRVI, were in accordance with LAI during the full fruit period. Beyond the VIS and NIR regions, Neinavaz et al. (2016) conducted research in the thermal infrared region (TIR) and found that canopy emissivity spectra increased with rising LAI.

In particular, the value of LAI can also be measured by a dedicated optical sensor, the LAI-2000 Plant Canopy Analyzer (LI-COR Biosciences, USA). It estimates LAI from canopy gap fraction measurements; the gap fraction can be overestimated if measurements are taken when the foliage is brightly lit, so readings are best made under diffuse light conditions (Han et al. 2016).

According to the studies mentioned above, the main methods involve data preprocessing, sensitive parameter selection, and estimation modelling. The capabilities and performance of spectroscopy for crop sensing have been explored. However, the data-processing methods and the results differed between studies, indicating that the choice of sensors and algorithms may significantly influence the application. Researchers will face further challenges in sensor integration and data fusion.

8.5.2 Applications of Imaging-Based Crop Sensors

8.5.2.1 Application of Ground-Based Imaging Instruments

8.5.2.1.1 Classification of Crops and Weeds

Focusing on the recognition of field weeds by different imaging sensors, Knoll et al. (2016a) used a time-of-flight (TOF) sensor, the CamCube 3, to create depth images with a resolution of 204 × 204 pixels. In addition, more sensors were mounted on a field robot named Bonirob, including a bispectral JAI camera, a Nikon D5300 camera, a Kinect II, and a laser scanner (Knoll et al. 2016b, c). The bispectral JAI camera (JAI Ltd., Denmark) uses one lens for two cameras (an RGB camera and an IR camera) with 1296 × 966 pixels. The Nikon D5300 captures RGB images with a resolution of 6000 × 4000 pixels. The Kinect II records a color image of 1920 × 1080 pixels and an infrared image of 512 × 424 pixels, while its TOF technology provides a depth image of 512 × 424 pixels. In this research, the best performance was provided by the JAI camera and the Nikon camera. As a result, two VI determination methods based on RGB images were proposed by extracting color features in the RGB and HSV (hue, saturation, value) spaces (Knoll et al. 2016b).

However, the results are disturbed by influences such as weather, the various stages of growth, the large number of different weed species, and varying soil conditions. In order to reduce these influences, a self-learning convolutional neural network was used for weed recognition in the field. This deep-learning approach achieved an accuracy of over 98% (Knoll et al. 2018). Similarly, a classification model of weeds in organic carrot was proposed using a convolutional neural network (CNN) to assist weed management (Knoll et al. 2019). Several other proposed methods also indicated that deep learning can extract high-level features from images to improve classification accuracy (Asad and Bais 2019; Peng et al. 2019).
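As a structural illustration of such classifiers, the following PyTorch sketch defines a minimal CNN for crop/weed image patches. The layer sizes, patch resolution, and two-class output are placeholders rather than the architectures used in the cited studies.

```python
import torch
import torch.nn as nn

class WeedCNN(nn.Module):
    """Minimal convolutional classifier for 64 x 64 RGB patches."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):              # x: (batch, 3, 64, 64)
        return self.classifier(self.features(x).flatten(1))

model = WeedCNN()
logits = model(torch.randn(4, 3, 64, 64))  # four dummy patches
print(logits.shape)                        # torch.Size([4, 2])
```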

8.5.2.1.2 Identification of Specialty Fruits

Harvesting of specialty fruits such as apples, citrus, cherries, and pears is highly labor intensive and is becoming less sustainable with increasing cost and decreasing availability of a skilled labor force (Gongal et al. 2015).

In order to help harvesting and yield prediction of specialty fruits, Li et al. (2014a, b) used a digital SLR camera (EOS Rebel T2i, Canon Inc., Japan) with an 18–55 mm lens to collect RGB images of field blueberries at 3648 × 2736 pixels. Three color components, red (R), blue (B), and hue (H), were selected to separate fruits at four maturity stages from the background using different classifiers. Performance was compared among the K-nearest neighbor (KNN), naïve Bayesian classification (NBC), and supervised K-means clustering (SK-means) classifiers. In this work, the KNN classifier yielded the highest classification accuracy (85–98%) on the validation set.
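The color-feature KNN approach can be sketched with scikit-learn; the feature values and labels below are fabricated placeholders intended only to show the data layout (one row of red, blue, and hue values per pixel or region).

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# hypothetical per-pixel features: [red, blue, hue], scaled to [0, 1]
X_train = np.array([[0.70, 0.30, 0.95],   # fruit
                    [0.25, 0.20, 0.30],   # background
                    [0.55, 0.35, 0.80],   # fruit
                    [0.20, 0.25, 0.28]])  # background
y_train = ["fruit", "background", "fruit", "background"]

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print(knn.predict([[0.65, 0.32, 0.90]]))  # -> ['fruit']
```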

For immature green citrus fruit detection, Gan et al. (2018) built an imaging system to provide valuable information for yield estimation at early stages. The system consisted of two color cameras (USB 3.0, The Imaging Source, Charlotte, NC, USA) and a thermal camera (A655sc, FLIR, Wilsonville, OR, USA). Images from all three cameras had the same spatial resolution, 640 × 480 pixels, and very similar diagonal fields of view of about 30°. A new Color-Thermal Combined Probability (CTCP) algorithm was created to fuse information from the color and thermal images and classify potential image regions into fruit and non-fruit classes. The results showed that the fusion of color and thermal images effectively improved the accuracy of immature green citrus fruit detection. For the same purpose, Okamoto and Lee (2009) used a hyperspectral camera covering 369–1042 nm to acquire hyperspectral images of green fruits of three citrus varieties (tangelo, Valencia, and Hamlin). Spatial image processing steps (noise reduction filtering, labeling, and area thresholding) were applied. Pixel identification tests showed detection success rates of 70–85%, depending on the citrus variety.
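The published CTCP formulation is not reproduced here; the sketch below shows only one generic way that per-pixel fruit probabilities from a color classifier and a thermal classifier might be fused, using an illustrative geometric weighting.

```python
import numpy as np

def fuse_probabilities(p_color, p_thermal, w=0.5):
    """Weighted geometric fusion of per-pixel fruit probabilities from
    two modalities (a generic sketch, not the published CTCP rule)."""
    p_color, p_thermal = np.asarray(p_color), np.asarray(p_thermal)
    return p_color ** w * p_thermal ** (1.0 - w)

fused = fuse_probabilities([0.9, 0.4], [0.8, 0.3])
print(fused > 0.5)  # simple decision threshold -> [ True False]
```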

8.5.2.1.3 Measurement of Crop Growth Status

Three-dimensional cameras have been used to obtain depth or position information. The Kinect camera has a normal webcam and a depth sensor which together provide RGB-D images. The depth sensor consists of an infrared laser projector combined with a monochrome CMOS sensor and can detect a range of 0.8–4.0 m. Sa et al. (2016) used a Kinect camera to capture RGB and NIR images. A Faster Region-based CNN (Faster R-CNN) model was established to detect sweet peppers, improving the F1 score, which accounts for both precision and recall, from 0.807 to 0.838. Kang and Chen (2020) used a RealSense D-435 camera to collect RGB and depth images for apple detection in the orchard. In their experiments, DaSNet-v2 with ResNet-101 achieved a recall of 0.868 and a precision of 0.88 in fruit detection and an accuracy of 0.873 in fruit instance segmentation. In addition, it reached an accuracy of 0.794 in branch segmentation.
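Off-the-shelf detection backbones make such fruit detectors straightforward to prototype. The sketch below loads the torchvision Faster R-CNN with a ResNet-50 FPN backbone pretrained on COCO; in practice, the model would be fine-tuned on annotated fruit images rather than used as-is.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# COCO-pretrained detector as a starting point for fine-tuning
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

image = torch.rand(3, 480, 640)      # dummy RGB frame scaled to [0, 1]
with torch.no_grad():
    pred = model([image])[0]
keep = pred["scores"] > 0.8          # keep confident detections only
print(pred["boxes"][keep].shape, pred["labels"][keep])
```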

Although the measurement of plants is traditionally based on RGB images, plant appearance is more accurately represented in 3D space, especially its geometry and topology. Therefore, 3D imaging instruments are increasingly used in crop phenotyping. LiDAR and laser sensors have been used to measure plant height and biomass because they adapt well to varying illumination and provide considerable data. LiDAR has been adopted to measure the height and biomass of rice, oilseed rape, winter rye, winter wheat, and grassland (Tilly et al. 2015).

In order to estimate nutrient content, an AD-130 GE bispectral camera (JAI Ltd., Denmark) was also used to capture RGB and NIR multispectral images. Fifteen image parameters were extracted, including the average gray values of the images, VIs (NDVI, NDGI, RVI, DVI), and image texture parameters (energy, moment of inertia, correlation, entropy, etc.). An SVM model was then trained on these parameters to support corn nutritional diagnosis and fertilization management decisions.
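The feature-based SVM workflow can be outlined as follows; the fifteen feature columns and the three nutrient classes are placeholders, and random data stand in for real image measurements.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# placeholder rows: [mean gray, NDVI, NDGI, RVI, DVI, texture terms, ...]
X = rng.random((60, 15))
y = rng.choice(["deficient", "adequate", "surplus"], size=60)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(X[:3]))
```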

In order to evaluate the nitrogen content in oilseed rape (Brassica napus L.), Zhu et al. (2019) collected spectral images in the 900–1700 nm wavebands using a hyperspectral camera, the ImSpectorN17E (Spectral Imaging Ltd., Oulu, Finland). A fast nitrogen content grade classification method for the oilseed rape canopy was established by employing a deep learning algorithm named stacked auto-encoders (SAEs). In this study, the SAE algorithm was introduced for dimensionality reduction and feature extraction from the hyperspectral images, and multiple classification models were then applied to test and validate the features extracted under different camera angles and feature units. Results showed that the best accuracy was achieved with data captured at a 25° camera angle.

The Hyper SIS (Zolix Instruments Co., Ltd., Beijing, China) is a hyperspectral camera that measures in the 369–988 nm range with a spectral resolution of 1.2 nm. It was used for nitrogen detection in longan plants (Yue et al. 2018). Initial features were extracted using principal component analysis (PCA) to identify a number of potential characteristic wavelengths (483, 518, 625, 631, 642, and 675 nm). Texture features based on the gray-level co-occurrence matrix (GLCM) were then extracted from the images at those wavelengths. Combined with state-of-the-art deep learning technology, a distribution model of chlorophyll content for longan leaves based on convolutional neural networks (CNNs) and deep neural networks (DNNs) was proposed. As a result, the R² values of the calibration and validation sets were 0.84 and 0.82, respectively.
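GLCM texture features of the kind used here can be computed with scikit-image. The sketch below extracts four common GLCM statistics from a dummy 8-bit band image standing in for one of the selected wavelengths.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# dummy 8-bit image for a single band (e.g., the 675 nm channel)
band = (np.random.rand(64, 64) * 255).astype(np.uint8)

glcm = graycomatrix(band, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
print(features)
```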

For the detection of rice panicle blast disease, Huang et al. (2017) measured images in the 400–1000 nm band using a Gaia Field-F-V10 (Spectral Imaging Ltd., Finland) imaging spectrometer. A deep convolutional neural network model, GoogLeNet, was used to learn the representation of the hyperspectral image data. The proposed method achieved a classification accuracy of 92.0%. The same hyperspectral camera has also been used for nutrient monitoring of water and chlorophyll content (Liu et al. 2018a, b; Zheng et al. 2018).

8.5.2.2 Application of UAV-Based Imaging Instruments

With the development of remote sensing technology, UAVs can acquire farmland images quickly and conveniently. UAV-based acquisition is therefore gradually becoming an important means, and a research hotspot, for farmland information collection (Yang et al. 2014, 2017).

Different types of spectroscopic and image sensors for UAV have been developed, such as digital color sensors and multispectral/hyperspectral imaging sensors, further extending UAV-based remote sensing to various applications (Lu et al. 2019).

8.5.2.2.1 Crop Classification

Yang et al. (2014) designed a multispectral imaging system for agricultural remote sensing based on two identical consumer-grade digital color cameras. Each camera is equipped with a full-frame CMOS sensor with 5616 × 3744 pixels. One camera captures normal color images, while the other is modified to obtain NIR images. Images are stored as 14-bit RAW and 8-bit JPEG files on CompactFlash cards. The system has been applied in practice to estimate crop canopy cover, detect cotton root rot, and map henbit and giant reed infestations.

In order to classify different crops, Ferreira et al. (2019) used a DJI Phantom 3 Professional drone (DJI Technology Co. Ltd., China) to collect RGB images of 4000 × 3000 pixels to train a model, achieving 97% accuracy in discriminating grass from broadleaf plants. With a similar goal, Wang et al. (2019) mounted a multispectral camera on a composite-wing UAV to collect images of cotton, corn, and squash. A Micro MCA12 Snap (Tetracam, CA, USA) obtained images in 12 bands at 470, 515, 550, 610, 656, 710, 760, 800, 830, 860, 900, and 950 nm, with 1280 × 1024 pixels in each band. A CNN was designed to extract features and classify the crops. Compared with an SVM based on a radial basis kernel function and a backpropagation neural network, the optimized CNN had the best effect, with the highest classification accuracy of 97.75%.

8.5.2.2.2 Crop Detection

For nutrient estimation, Qiao et al. (2019) estimated the chlorophyll content of maize. RGB images with a resolution of 7360 × 4912 pixels were collected using an ILCE-7R camera (Sony Corporation, Japan) mounted on a DJI M600 UAV platform. Parameters related to color and texture features were extracted from the images after canopy segmentation to reduce background influences. The established model had a determination coefficient of 0.76. A distribution map of chlorophyll content in the field maize canopy was drawn using a pseudo-color technique. It provided a tool to visually distinguish the field roads from the canopy area and showed the differences in chlorophyll distribution across the plot.

Potgieter et al. (2017) assessed the seasonal leaf area dynamics of sorghum breeding lines using multispectral imaging from a UAV. A RedEdge™ narrowband multispectral camera (MicaSense Inc., USA), capturing five bands at specific wavelength peaks, was fitted to the UAV platform. The bands captured were blue (B: 475 nm center wavelength, 20 nm bandwidth), green (G: 560 nm, 20 nm), red (R: 668 nm, 10 nm), red edge (RE: 717 nm, 10 nm), and near-infrared (NIR: 840 nm, 40 nm). The horizontal field of view was 47.2° with a 5.5 mm focal length, producing an image resolution of 1280 × 960 pixels. Good correlations were found between each VI (NDVI and EVI) and each growth parameter, such as plant number per plot, canopy cover, and LAI, both during the vegetative growth phase (pre-anthesis) and at maximum canopy cover shortly after anthesis. The NDRE, which is used to estimate leaf chlorophyll content, was also the most useful index for characterizing the leaf area dynamics and senescence patterns of contrasting genotypes.
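The NDRE follows the same normalized-difference form as NDVI, substituting the red-edge band for red. A short sketch using the camera's 840 nm NIR and 717 nm red-edge bands (the reflectance values are hypothetical):

```python
def ndre(nir, red_edge):
    """Normalized Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

print(ndre(0.50, 0.30))  # 0.25 for a hypothetical canopy pixel
```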

Cen et al. (2019) discussed the use of a lightweight UAV with dual image-frame snapshot cameras to estimate aboveground biomass (AGB) and panicle biomass (PB) of rice at different growth stages with different nitrogen (N) treatments. An RGB camera (NEX-7 camera, Sony Corporation, Japan) with a spatial resolution of 6000 × 4000 pixels and a snapshot multispectral camera (CMV2 K CMOS, IMEC, Chatsworth, Leuven, Belgium) with a spatial resolution of 409 × 216 pixels coupled with a three-axis gimbal were mounted on the UAV. The multispectral camera contains 25 wavelengths in the spectral region of 600–1000 nm (679, 693, 719, 732, 745, 758, 771, 784, 796, 808, 827, 839, 84, 860, 871, 880, 889, 898, 915, 922, 931, 937, 944, 951, and 956 nm). It was found that the canopy height extracted from the crop surface model exhibited a high correlation with the ground-measured canopy height, and several VIs were highly correlated with AGB.

These applications show that imaging instruments are widely used on board UAVs for collecting spectral and spatial information, allowing the generation of maps that indicate aspects of plant status. Owing to the availability of NIR wavelengths, multispectral images have also become an indispensable tool for evaluating physiological and biochemical parameters of plants, such as LAI, vegetation fraction, nitrogen (N) and chlorophyll status, net photosynthesis, and biomass.

8.5.3 Variable-Rate Fertilizer Management Based on Crop Sensors

8.5.3.1 Variable-Rate Fertilizer Mapping Based on Imaging Instruments

In order to further extend the functions of the crop growth detector, a WSN-based detection system was proposed to measure crop spectral characteristics on the go and in real time, as shown in Fig. 8.9. The controller was an industrial personal computer (IPC) with an attached ZigBee wireless communication module (JN5139 module). As the coordinator of the whole wireless network, it established the network, waited for sensor nodes to join, and received, displayed, and stored all the data from the different sensor nodes.

Fig. 8.9
A model structure of the vehicle-mounted crop detection system, having an industrial personal computer (IPC) with an attached ZigBee wireless communication module.

Structure of the vehicle-mounted crop detection system

The measuring unit consisted of several optical sensors, and each optical sensor was used as a sensor node in this WSN. Each sensor node consisted of an optical part and a circuit part. The optical part contained eight optical channels at four wavebands. Since the detection system used sunlight as a light source, besides the reflected light from crop canopy, the sunlight intensity should also be measured as a reference. Therefore, two solutions were put forward:

  1. A full-function sensor node had to contain eight optical channels: four facing upward for the sunlight and four facing downward for the reflected light.

  2. As shown in Fig. 8.9, one sensor node was selected to measure the sunlight as the type I sensor, while the other sensor nodes measured the reflected light as type II sensors.

As discussed above, an independent sensor node (type I) was selected to measure the sunlight, and the whole network then shared the sunlight data. While maintaining measurement precision, this design greatly reduced the cost of the system. The sensors and the controller can set up a communication network in many ways, and the networking modes of the handheld and vehicle-mounted systems can be converted into each other. Transmission distances of up to hundreds of meters enable real-time, continuous measurement of crops in the field and increase the flexibility of detector installation.
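The shared-reference design reduces per-band reflectance calculation to a division by the type I node's sunlight reading. The sketch below assumes four wavebands per node and a hypothetical band order; the raw intensity values are illustrative.

```python
import numpy as np

# hypothetical raw intensities for four wavebands (red, green, red edge, NIR)
sunlight = np.array([820.0, 790.0, 760.0, 700.0])   # type I (upward) node
reflected = np.array([90.0, 95.0, 320.0, 310.0])    # one type II node

reflectance = reflected / sunlight   # shared sunlight reference per band
ndvi = (reflectance[3] - reflectance[0]) / (reflectance[3] + reflectance[0])
print(reflectance.round(3), round(ndvi, 3))
```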

The new system increased the number of optical channels and, once installed on an on-board mechanical structure, measured crop spectral characteristics on the go and in real time (Zhong et al. 2014). In a field test in Shaanxi Province, China, the distribution of the chlorophyll content of wheat detected by the new system is shown in Fig. 8.10a (Sun et al. 2015). In this way, the system provides automatic mapping of comprehensive growth status in the field. Combined with a fertilizer decision strategy, such as a yield prediction method, a fertilizer recommendation map can also be produced as an output, as shown in Fig. 8.10b.

Fig. 8.10
A set of two images depicts the user interface. 1, automatic and manual mapping, importing data, generating the map, and selecting an index. 2, data input, mapping of nitrogen fertilizer, parameter setting, and calculation options.

Variable-rate fertilizer mapping based on imaging instruments. (a) Distribution of chlorophyll content of wheat (Sun et al. 2015). (b) Fertilizer recommendation mapping

8.5.3.2 Variable-Rate Fertilizer Control Based on Crop Sensing

Variable-rate fertilization technology improves operational efficiency and the utilization rate of fertilizer, and it supports the sustainable development of modern agriculture by promoting high-yield, high-quality production while ensuring environmental protection. The crop sensors discussed in this chapter show great potential for controlling the fertilizer rate in the field. Therefore, many variable-rate fertilizer applicators and sprayers have been developed based on these sensors.

Commercial products, such as GreenSeeker products (Trimble Navigation Limited, Sunnyvale, CA, USA), Crop Circle devices (Holland Scientific Inc., Lincoln, Nebraska, USA), and Yara N-Sensors (Yara International ASA, Germany), offer solutions for variable-rate fertilization. However, the crop estimation and fertilizer decision models are fixed in such systems, which may limit their application to specific requirements such as different crops or regions.

In order to provide a more flexible system for precision fertilization, a multi-fusion sensor, developed by combining spectroscopy and imaging technology, was applied in a fertilizer sprayer by China Agricultural University (Sun et al. 2018). The sensor was designed to measure spectral reflectance in the red and NIR ranges, such as at 610, 680, 730, 760, 810, and 860 nm, each with a 20 nm full width at half maximum. More than ten kinds of VIs can be calculated from these data for crop monitoring, meaning that the sensor can provide more flexible and modifiable models for different crop estimation requirements. An RGB image was captured to estimate the canopy coverage and thereby help determine the measurement location in the field. The transmission method was changed from Wi-Fi to CAN-bus, which has the advantages of long transmission distance, high speed, reliable transmission, and low cost. The sensing system is shown in Fig. 8.11a. A GPS module records the detection location, one of the sensors is used to calibrate for changes in sunlight, and the crop sensors transmit data to the IPC via CAN-bus.

Fig. 8.11
A set of two images. 1, the sensing system depicts online crop sensors, sunlight calibration, GPS, IPC, and data transmission by CAN-bus. 2, fertilizer sprayer, crop sensors, and control software.

Variable-rate fertilizer control based on crop sensing. (a) Sensing system used in a fertilizer sprayer. (b) Variable-rate fertilizer sprayer

Generally, as shown in Fig. 8.11b, during the fertilization process the NDVI values of the crop canopy are acquired in real time by the crop sensors. These values are transmitted to the vehicle-mounted IPC terminal through the CAN-bus cable. A variable-rate fertilization expert decision system preset on the IPC then runs the decision model to generate the optimal fertilizer rate in real time.
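A decision model of this kind maps a canopy NDVI reading to a nitrogen rate. The linear rule and all constants below are illustrative placeholders, not the expert decision system described here.

```python
def fertilizer_rate(ndvi, base_rate=120.0, ndvi_target=0.80):
    """Map canopy NDVI to a nitrogen rate (kg/ha): apply more where
    NDVI falls below target, less above (all constants hypothetical)."""
    adjustment = (ndvi_target - ndvi) / ndvi_target
    return max(0.0, base_rate * (1.0 + adjustment))

for ndvi in (0.55, 0.70, 0.85):
    print(ndvi, round(fertilizer_rate(ndvi), 1))
```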

In this chapter, sensing principles and applied sensors based on spectroscopy and imagery have been reviewed. Some newly developed sensors have been introduced and demonstrated to show the frontier research in this area. Numerous researchers in the cited literature have documented the practical applications of these sensors in many scenarios, including handheld detection, vehicle-mounted diagnosis, and remote sensing by UAVs or satellites, to build reliable prediction models of complex and uncertain phenomena in agriculture. With the integration of variable-rate technology, more and more precision management measures can be taken based on crop sensing methods. Recently, many new sensors and machine learning methods have been applied in crop monitoring, pointing to new trends in crop sensing. Smart crop sensors with artificial intelligence (AI) processors or embedded deep learning models should emerge soon, improving sensing accuracy and broadening future applications.