Abstract
Cotton (Gossypium hirsutum L.) is an important cash crop and a primary raw material for the clothing, fine paper, animal feed, and oil industries. Cotton production is affected by the combined effects of crop varieties, environment, and management. Precision agriculture technology has shown great potential to improve cotton production given sufficient high-resolution spatiotemporal data on soil, environment, and cotton development from seedling to harvest. Advances in unmanned aerial vehicles (UAVs), computer vision, and remote and proximal sensing technologies make it possible to scan large fields efficiently and quantify crop development. Big data analytics enabled by artificial intelligence (AI) has significantly increased the capacity for processing and analyzing complex data to quantify the interactions of environment and management on crop growth and yield. This chapter summarizes UAV applications in cotton production, focusing on field scouting and decision making, such as stand count, growth monitoring, and yield prediction, under different soil and weather conditions and irrigation management. The potentials and challenges of using UAV technologies in cotton production are also discussed.
3.1 Introduction
3.1.1 Precision Agriculture Technology in Agricultural Production
Cotton (Gossypium hirsutum L.) is an important cash crop that provides approximately 35% of the total fibers for the textile industry, including clothing and fine paper [1, 2]. Cotton seeds are also a critical source for the feed and oil industries because of their rich oil (18–24%) and protein (20–40%) content [1, 2]. Cotton plays an important role in daily human life and the livestock industry, and is grown in 17 states of the United States of America, being a major crop in 14 of them [3]. There is a great need to increase cotton production as the global population is projected to exceed nine billion by 2050. However, agricultural production faces adverse conditions, such as decreasing arable land due to urbanization, declining soil quality (e.g., soil erosion, salinization, and nutrient depletion) due to long-term cultivation, and insufficient freshwater for crop irrigation [4]. It is therefore necessary to improve cotton production using emerging technologies while reducing the inputs of natural resources and the impacts on the environment.
Cotton production is determined by the combined effects of genotype, environment, and management (G × E × M) [5]. Understanding the G × E × M interaction requires acquiring high-resolution data on crops, water, and other environmental factors using emerging remote and proximal sensing technologies. Studies have shown that timely crop monitoring and accurate yield estimation are important in making optimal decisions for field management, increasing cotton production, and reducing negative environmental impacts. For example, information on cotton plants and the environment obtained at different growing stages can be used to quantify the impacts of environmental conditions and nutrient deficiencies [6] on cotton development and yield. Precision agriculture (PA) technology has been used as a promising field management strategy to improve crop production and management efficiency based on site-specific information. The key factors for the success of PA include fast and accurate data acquisition of crops, soil, and environments in a large field. It also needs timely data processing and analysis pipelines to translate sensor data into actionable information for field management decisions [7,8,9].
Sensor systems are the key components for the implementation of PA for agricultural production. Various sensors have been used to quantify crop development and environmental variations [10, 11], which provide necessary information to develop decision-making tools for field management. Some examples include in-field weather stations that continuously record environmental conditions, e.g., air temperature, humidity, solar radiation, and precipitation. Proximal sensors (e.g., soil apparent electrical conductivity (ECa) sensor and soil moisture sensor) are used to estimate soil texture, organic matter content [12], and soil moisture content at different depths. Infrared thermometers (IRTs) are used to measure crop canopy temperature to determine cotton water stress and schedule irrigation [13]. In addition, remote sensing systems based on satellite, airplane, unmanned aerial vehicles (UAVs), and ground vehicles are equipped with different sensors (e.g., optical sensors) for crop monitoring. The above site-specific information has been used to improve the efficiency of field management and crop production.
3.1.2 UAV-Based Remote Sensing (RS) for Crop Monitoring
Unmanned aerial vehicle (UAV)-based RS is currently widely used for crop monitoring due to the advantages of flexibility in sensor selection and data collection time when compared to satellite-based and ground-based sensing platforms. The UAV-based RS systems are usually equipped with multiple cameras and a global navigation satellite system (GNSS) to collect georeferenced imagery data of crops and soil in a high-throughput manner [14]. These systems are widely used to scout crops and collect site-specific information to make accurate decisions of crop management [15, 16]. Typical cameras used in UAV-based RS systems include visible red–green–blue (RGB) cameras, multispectral cameras, hyperspectral cameras, and infrared (IR) thermal cameras [14, 17]. Imagery data are processed using advanced machine learning (ML) to quantify crop characteristics in architecture, physiology, and chemical compositions [18], which are associated with plant health conditions and responses to the variation of environment and management.
Visible RGB cameras are the most ready-to-use sensors, being less expensive and of higher resolution compared to multispectral and thermal cameras. RGB cameras consist of three optical sensors with spectral bands of red (typically 550–650 nm), green (typically 470–600 nm), and blue (typically 420–530 nm) to produce digital images [19]. They have been used to acquire color information of different plant organs (e.g., leaf, stem, flower, and cotton boll) that can be used to characterize plants and quantify their health conditions and responses to biotic and abiotic stresses. RGB images are also used to build point cloud data of plants to extract three-dimensional (3D) information, such as plant height and 3D architecture. As shown in Table 3.1, RGB images are used to calculate different image features for quantifying cotton development and yield. For example, plant height (PH), canopy cover (CC), and greenness described as a* and the triangular greenness index (TGI) are used for the assessment of cotton stand count [14, 20]. Moreover, the cotton fiber index (CFI) derived from RGB images has been used as a useful index for cotton yield estimation [14, 21].
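As an illustration of how such RGB-derived features are computed, the widely used digital-camera form of TGI can be sketched per pixel as follows (the band weights follow the common simplified approximation; the array layout and example values are illustrative, not from the cited studies):

```python
import numpy as np

def triangular_greenness_index(rgb):
    """Per-pixel TGI for an RGB array of shape (H, W, 3).

    Uses the simplified digital-camera form TGI = G - 0.39*R - 0.61*B,
    derived from band centers near 670, 550, and 480 nm. Higher
    values indicate greener (leafier) pixels.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return g - 0.39 * r - 0.61 * b
```

A green seedling pixel yields a noticeably higher TGI than bare soil, which is what makes the index useful for stand-count segmentation.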
Despite their advantages, RGB cameras are limited to acquiring information on plants within the visible spectral range of 400–700 nm. Research has found that some spectral bands in the near-infrared range (800–2,500 nm) are more sensitive to plant variations caused by biotic and abiotic stresses. Multispectral and hyperspectral cameras have been used in many studies to acquire high-spatial-resolution spectral information from plants. Multispectral cameras usually consist of optical sensors with fewer than ten independent spectral bands, while hyperspectral cameras can have up to 300 spectral bands. The image features extracted from these spectral cameras include the spectral information of each band and their combinations, which have shown great potential to quantify plant health conditions and estimate yield. Some examples (summarized in Table 3.1) show that different vegetation indices (VIs) based on the combination of multiple spectral bands are able to monitor plant health status and estimate cotton yield [14, 33, 40].
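As a concrete example of such a VI, NDVI combines the near-infrared and red bands; a minimal per-pixel sketch (band arrays are assumed to be reflectance-calibrated):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed element-wise.

    Healthy vegetation reflects strongly in NIR and absorbs red,
    so values approach +1; soil and water fall near or below 0.
    """
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)
```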
In addition, thermal cameras have been used to quantify the energy of infrared (IR) radiation emitted by any object with a temperature above absolute zero (−273.15 °C). Infrared thermal cameras capture the long-wave IR radiation (7.5–14 μm) emitted from crops and convert it to electrical signals [41]. Plant temperature measurement has been primarily used to study plant water relations (e.g., stomatal conductance), since a major determinant of leaf temperature is the rate of evaporation or transpiration from the leaf [41]. Canopy temperature is highly correlated with canopy water stress and is used to capture plant responses to biotic and abiotic stresses [38]. Canopy temperature and the calculated crop water stress index (CWSI, Table 3.1) have been found to be related to leaf water content [42,43,44] and show great potential for yield prediction [14].
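One common empirical form of CWSI normalizes the measured canopy temperature between a well-watered (wet) baseline and a non-transpiring (dry) baseline; a sketch of that normalization (the baselines themselves must be measured or modeled for the specific crop and weather conditions):

```python
def crop_water_stress_index(t_canopy, t_wet, t_dry):
    """Empirical CWSI = (Tc - Twet) / (Tdry - Twet).

    0 means a fully transpiring (unstressed) canopy at the wet
    baseline; 1 means a non-transpiring canopy at the dry baseline.
    All temperatures must share the same unit (e.g., deg C).
    """
    if t_dry <= t_wet:
        raise ValueError("dry baseline must exceed wet baseline")
    return (t_canopy - t_wet) / (t_dry - t_wet)
```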
3.1.3 UAV Imagery Data Processing Pipeline
Generally, the collected UAV imagery data are in the format of interval snapshots (still images) or videos, and a large number of images may be collected for a large field. Figure 3.1 illustrates a typical data processing pipeline for UAV images, including image pre-processing, image processing, and analysis. The pre-processing steps consist of generating geo-referenced orthomosaic images and digital surface models (DSMs) [16] from the hundreds or thousands of collected sequential images or videos. These processes are usually conducted using customized or commercial UAV image stitching software such as Agisoft PhotoScan (Agisoft LLC, St. Petersburg, Russia) and Pix4D (Pix4D S.A., Prilly, Switzerland). Geo-referenced orthomosaic images and DSMs are further processed using advanced image processing techniques to remove background, segment plants, and extract image features. The image processing can be performed using commercial or open-source software such as Matlab (The MathWorks, Inc., Natick, MA, USA) and OpenCV (https://opencv.org/). The final step, image analysis, uses advanced ML models to translate the original images or extracted image features into useful information for crop management. Example applications include seedling assessment and stand count, water stress identification, and yield estimation. Widely used commercial and open-source software for this step include Matlab, RStudio (RStudio, Boston, MA, USA), and PyTorch (https://pytorch.org/).
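The background-removal step in this pipeline can be sketched with the Excess Green index, a common choice for separating plants from soil in RGB imagery (the fixed threshold below is a simplification; Otsu's method on the index histogram is a typical data-driven alternative):

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.05):
    """Boolean vegetation mask from an RGB array of shape (H, W, 3).

    Computes ExG = 2g - r - b on chromaticity-normalized channels;
    pixels above `threshold` are treated as plant, the rest as
    soil/residue background.
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    total = rgb.sum(axis=-1, keepdims=True) + 1e-9
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return (2.0 * g - r - b) > threshold
```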
3.2 UAV Systems in Cotton Production
3.2.1 Field Management for Cotton Production
Cotton production includes several key growth stages, as shown in Fig. 3.2, including seed emergence, seedling, squaring, blooming (flowering), open boll, and harvest, each requiring different management strategies. The life span of cotton is about 150–180 days in the United States [3]. Germination and emergence usually take 5–15 days depending on weather conditions, soil temperature, and moisture [45,46,47]. Major factors that influence cotton growth and yield include planting rate, planting depth, row spacing, and seed placement [48,49,50]. Early-stage field management during crop emergence and seedling establishment includes emergence and stand count assessment [50, 51] and weed control [3]. Pests, such as spider mites, aphids, whiteflies, and lygus bugs, are the primary targets to be managed after the first square [50, 52]. In addition, irrigation management is an important practice during the square-to-bloom growth stages to improve yield [3, 13, 53,54,55]. Lastly, harvesting at the right time can reduce crop yield loss.
Based on the general field management for cotton production mentioned previously, the following sections discuss the applications of UAV-based RS systems in making decisions on these field management practices. The sections are arranged by growth stage: (1) cotton emergence evaluation, (2) full-season cotton growth and health monitoring, and (3) yield estimation. The contents cover the progress and advances of UAV applications in cotton production as well as the potentials and challenges of adopting UAVs in large-scale cotton production. Figure 3.3 summarizes the general approaches (sensors and image features used) and associated applications of UAV-based RS systems in cotton production.
3.2.2 Cotton Emergence Assessment
Crop emergence is an important agronomic factor for field management in early stages. Cotton emergence is usually assessed based on plant population, stand count, uniformity, seedling size, etc. [50, 51]. Accurate and timely assessment of cotton stand count and seedling size helps farmers make important management decisions, such as replanting, to reduce yield loss due to missing plants [52]. Meanwhile, accurate cotton emergence information can be used to quantify the impact of soil and environments on crop emergence [56, 57], which may help farmers make optimal decisions on seed placement and planting. Conventionally, cotton emergence is assessed through visual observation (manual counts) at a small number of sampling sites [58], which is time-consuming, labor-intensive, and not suited to covering a large production field. Hence, UAV-based imaging technology has been used to assess cotton emergence parameters such as stand count, canopy size, seedling number, and uniformity based on plant spacing, as indicated in Table 3.2.
The primary challenge of assessing cotton emergence using UAV imaging systems is the small seedling size at the early stage; for instance, seedling diameters were 2.4 and 3.5 cm at 11 and 17 days after planting (DAP), respectively [61]. Hence, sufficient image resolution, or ground sample distance (GSD), is needed, and high-resolution RGB cameras have been widely used compared to other cameras (Table 3.1). At a low flight height of 10 to 20 m, images taken with RGB cameras can achieve a GSD of 0.25 to 0.9 cm pixel−1. However, using RGB cameras raises another challenge in image segmentation: removing the background and detecting the small seedlings. The large portion of soil background in the images causes color distortion of seedlings due to the strong reflectance of soil [62], and the color contrast between crops and soil is weak [17]. Previous research indicated that RGB images were potentially affected by sunlight conditions and suggested that multispectral images with near-infrared (NIR) spectral bands could be more efficient for crop seedling segmentation [63]. For example, Feng et al. [22] assessed cotton emergence using a UAV-based hyperspectral imager to quantify cotton plant density and uniformity. However, the accuracy for plant density only reached 0.61, lower than that of studies using RGB cameras. The advantage of hyperspectral imagers is their higher spectral resolution, which can be useful for other stress studies during the growing season, but they are limited by their higher cost and by the large amount of data requiring proper processing and analysis [64] compared to simple RGB cameras.
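The flight-height/GSD trade-off mentioned above follows a simple geometric relation; a sketch (the camera parameters in the example are illustrative values for a 1-inch-sensor camera, not taken from the cited studies):

```python
def ground_sample_distance(height_m, sensor_width_mm,
                           focal_length_mm, image_width_px):
    """GSD in cm/pixel for a nadir-looking camera.

    GSD = (flight height * sensor width) / (focal length * image width),
    with the factor of 100 converting meters to centimeters.
    """
    return (height_m * 100.0 * sensor_width_mm) / (
        focal_length_mm * image_width_px)

# Illustrative example: 15 m flight height, 13.2 mm sensor width,
# 8.8 mm focal length, 5472-pixel-wide images.
gsd = ground_sample_distance(15, 13.2, 8.8, 5472)
```

Doubling the flight height doubles the GSD, which is why seedling-scale work favors the 10–20 m range.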
Deep learning (DL) models are widely used to directly locate and detect each cotton seedling [60, 61] and extract information such as average stand count and canopy size [22] from UAV images. The DL models used include ResNet18, YOLOv3, MobileNet, and CenterNet, achieving R2 values up to 0.98 [22, 60, 61]. One example of a DL model used by Feng, Zhou, et al. [14] is illustrated in Fig. 3.4, where the model extracts “hidden” information (subtle differences) using multiple convolution and pooling layers to distinguish seedlings from background.
The commonly used image processing pipeline (Fig. 3.1) requires stitching the collected images using commercial software. The image stitching process may take a long time when thousands of images are collected with high-resolution cameras at low flight heights in field-scale studies. This issue may restrict its usage in PA applications that require timely management practices based on accurate data. To improve efficiency, Feng, Zhou [22] proposed a novel pipeline, illustrated in Fig. 3.5, that processes and analyzes each UAV image directly, avoiding the image-stitching procedure. The pipeline can process one image (20 megapixels) in about 2 s to extract emergence information, providing an alternative method to assess cotton emergence in near real-time. It is possible to implement this image processing pipeline on an edge computing system integrated with a UAV for real-time, on-site data processing and decision making. In conclusion, DL and real-time image processing will enhance UAV-based imaging systems for cotton emergence assessment on commercial farms in the future.
3.2.3 Cotton Growth Monitoring Using UAV-Based RS
Crop growth and production are complex and determined by many factors, including crop genotypes (varieties), environments (e.g., weather, soil, microclimate, and location), and agronomic management strategies [65]. During the cotton growing season, several field management practices, including irrigation and chemicals (fertilization, pesticides, and herbicides) applications, need to be optimized to achieve optimal production. High-resolution site-specific crop information is needed to determine crop stresses for variable rate applications.
Irrigation is one of the most important management practices in cotton production since cotton growth and yield are sensitive to water deficit [66, 67]. An optimal irrigation schedule can be made according to the cotton water demand to replace the water loss due to evapotranspiration (ET), which is determined by cotton varieties, weather conditions, soil texture, and irrigation treatment [68, 69]. Infrared thermal cameras mounted on UAVs have been widely used to detect cotton water stress by first estimating the canopy temperature and then computing stress indices such as CWSI (Table 3.1) [44, 70,71,72]. Stomatal conductance has been used as ground truth data to describe crop water stress response. Studies have shown moderate to high correlations between stomatal conductance and CWSI computed from UAV-based thermal imagery, with r = −0.48 [71] and R2 = 0.66 to 0.91 [44, 70]. One challenge in using thermal imagery is background (i.e., soil and residue) removal, since the pixel values of thermal images represent temperature rather than color information. Hence, other image segmentation methods have been used, such as edge detection algorithms (Canny, Prewitt, and Roberts methods) [44]. Some thermal cameras have attached visible cameras; a mask can be created by segmenting the visible images and then co-registered with the thermal images to remove the background [73]. These procedures ensure that only crop canopy temperature is used in the crop water stress indices, improving estimation accuracy.
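Once a vegetation mask has been derived from the co-registered visible image, restricting the temperature statistics to canopy pixels is straightforward; a minimal sketch (assumes the thermal array and the mask share the same pixel grid after co-registration):

```python
import numpy as np

def mean_canopy_temperature(thermal, veg_mask):
    """Mean temperature over vegetation pixels only.

    `thermal` is a 2D array of per-pixel temperatures (deg C);
    `veg_mask` is a boolean array of the same shape marking canopy.
    Soil and residue pixels are excluded so they do not bias CWSI.
    """
    canopy = thermal[veg_mask]
    if canopy.size == 0:
        raise ValueError("mask contains no vegetation pixels")
    return float(canopy.mean())
```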
Besides thermal cameras, some studies have also demonstrated the use of VIs computed from RGB and multispectral images to determine cotton water stress and plant water content [44, 70, 74]. For instance, moderate to high correlations (significant absolute r = 0.3 to 0.9; R2 = 0.6 to 0.9) were found between different VIs from multispectral cameras and plant water content in different parts (leaves, petioles, buds and bolls, stalks) [74]. When relating different VIs to stomatal conductance, mixed results were found, with low to high R2 (for example, NDVI ranged from 0.01 to 0.89), and the R2 values were mostly lower than those from thermal images [44, 70]. However, RGB and multispectral cameras are frequently used for other purposes, such as weed and pest detection and mapping for UAV-based variable-rate spraying during the growing season, as discussed below.
UAV-based RS systems are a promising way to support weed, pest, nutrient, and disease control in commercial cotton fields [3, 50, 52, 75, 76] and harvest aids [77, 78]. The key to efficient UAV-based variable-rate spraying is to identify the right regions of the cotton field and the right volume and timing of spraying [78,79,80,81]. Optimal crop management decisions are usually made according to cotton plant growth status, which can be quantified by VIs [82, 83], canopy coverage [80], boll opening rate [77,78,79, 84], and the pressure of weeds [17, 85, 86], diseases [87,88,89,90,91], and pests [81] in cotton fields. Advanced image processing methods and ML models (such as support vector machine, k-means classification, maximum likelihood, and random forest) remain the key to weed and disease identification and management decision making [81, 85, 87, 88]. Considering the complexity of the field background in the images, DL methods are also promising image processing tools for identifying cotton leaf diseases and stresses [91, 92].
Overall, UAV-based imagery from RGB, multispectral, and thermal cameras has demonstrated great potential for in-season cotton field management. Some applications include irrigation scheduling through plant water content and stress detection, variable-rate chemical applications through mapping weeds and pests, and UAV variable-rate spraying. Most of these studies used only a single factor when developing the estimation models, such as crop response to irrigation [93, 94] or fertilizer [95, 96]. Few studies have integrated the interaction effects of environment and management due to the lack of long-term data and efficient tools for developing reliable models [97]. With advances in UAV-based RS, high-resolution imagery data can quickly quantify characteristics of crops and soil, which brings challenges in efficiently processing and analyzing big data of different varieties, resolutions, and data structures. Hence, future studies will focus on integrating different UAV-based cameras with more advanced ML and DL modeling to explore the relationships between soil properties, weather conditions, and cotton growth variation [98].
3.2.4 Cotton Yield Estimation
Accurate cotton yield estimation can help farmers make better decisions on management, such as harvest, transportation, and storage [99]. Cotton yield estimation can also provide the information needed to understand the interaction effects of environment and management on crop development and yield, which is important in developing precision management strategies for cotton production [14]. UAV-based imagery has been used to predict cotton yield using different image features, including plant height, canopy cover, canopy temperature, VIs (NDVI and GNDVI), and open cotton bolls (Table 3.1) [14, 21, 100,101,102,103]. The UAV imagery data are typically collected at critical stages, such as flowering and/or boll opening [14]. Research showed that a combination of image features could estimate yield with R2 > 0.80 [14].
Several studies used high-resolution RGB images to detect and count open cotton bolls [101, 102]. Yeom et al. (2018) used image processing techniques (e.g., Otsu automatic thresholding) to extract cotton bolls from UAV images collected at the boll opening stage, achieving an accuracy of R2 close to 0.6. Xu et al. (2020) used two fully convolutional DL networks to extract cotton bolls, obtaining a better accuracy of R2 = 0.8.
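Otsu's automatic thresholding, the technique used by Yeom et al., can be computed from the grayscale histogram alone; a self-contained sketch (the published pipelines include additional filtering and geometric steps not shown here):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's threshold for an 8-bit grayscale image.

    Picks the level that maximizes between-class variance; since
    open cotton bolls are much brighter than canopy and soil,
    pixels above the threshold are candidate boll pixels.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                    # class-0 probability
    mu = np.cumsum(prob * np.arange(256))      # cumulative mean
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan                 # skip degenerate splits
    sigma_b2 = (mu[-1] * omega - mu) ** 2 / denom
    return int(np.nanargmax(sigma_b2))
```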
Current studies in yield estimation based on UAV imagery usually used data from a single year, i.e., using data from the same year for both model training and validation [104,105,106,107]. Future studies should focus on developing models that predict crop yield in coming years using historical data, which may be more valuable to farmers in making proper decisions in advance to maximize their profit. In recent years, ML and DL techniques have been used to predict crop yield in future years using historical data of environment, management, and crop production [108,109,110,111]. Integrating soil, weather, and crop data of different spatial and temporal resolutions with advanced data analytic methods has the potential to improve performance in quantifying field-scale crop growth and yield in a future year. One such example is a study by Feng et al. (2021) that used an improved recurrent neural network (RNN) [112] model known as Gated Recurrent Units (GRU) [113] to analyze sequential data, including soil, weather, and time-series imagery, to predict cotton yield. As shown in Fig. 3.6, the GRU architecture comprises a reset gate and an update gate that control how much information is forgotten and memorized through the sequence.
The architecture of the yield prediction model based on the GRU is shown in Fig. 3.7. It includes four parts: soil features (S_CNN) and weather features (W_CNN) pre-processed using CNN filters, GRU layers for NDVI prediction (‘GRU’ in Fig. 3.7), and fully connected layers for yield prediction (‘FCL_2’ in Fig. 3.7). The inputs of the GRU network included eleven soil features processed by the S_CNN and weather data from May 1 to October 29, 2019, processed by the W_CNN. The initial hidden-input vector (‘Init’ in Fig. 3.7) of the GRU was set to zeros, and the corresponding output was taken as the NDVI in July (the GRU output of its first loop). The hidden outputs of the July GRU were passed to the next loop of the GRU unit (i.e., the GRU in August). The processed weather data from July 31 to September 3 were also passed to the August GRU, whose output was taken as the NDVI in August. The NDVI in July replaced the ‘Init’ as the input to the August GRU. Similar procedures then defined the GRU processing units for the September NDVI and for yield. The GRU unit was thus a three-month loop processing unit in which each loop received the weather data of the current month and the NDVI images of the previous month, predicting the NDVI spatial distribution of the current month from both.
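The reset/update gate mechanics described above follow the standard GRU formulation; a minimal numpy sketch of a single time step (weight names, shapes, and the gate convention are illustrative, not the actual model of Feng et al.):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, p):
    """One GRU time step.

    The update gate z blends the old state (weight 1 - z) with the
    candidate state (weight z), controlling what is memorized; the
    reset gate r controls how much of the old state feeds the
    candidate, i.e., what is forgotten. `p` maps the names
    Wz, Uz, Wr, Ur, Wh, Uh to weight matrices.
    """
    z = sigmoid(x @ p["Wz"] + h_prev @ p["Uz"])        # update gate
    r = sigmoid(x @ p["Wr"] + h_prev @ p["Ur"])        # reset gate
    h_cand = np.tanh(x @ p["Wh"] + (r * h_prev) @ p["Uh"])
    return (1.0 - z) * h_prev + z * h_cand
```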
The study indicates that the DL-based yield prediction model is able to integrate soil, weather, imagery, and yield data of different resolutions and data structures. The model showed improved performance compared to models without the full integration of related environmental factors, explaining 68–84% of the yield measured with the yield monitor (ground truth data), with prediction errors of MAE = 247 kg ha−1 (8.9%) to 345 kg ha−1 (12.3%) across different years. In summary, the study indicates the potential of predicting the cotton yield of a future year based on soil texture, weather conditions, and UAV imagery. Future research could investigate more advanced methods for integrating multi-source, multi-year data to estimate field-scale yield in following years.
3.3 Summary
This chapter introduced UAV applications in cotton production, from stand count and growth monitoring to yield prediction. UAV-based RS systems, equipped with RGB, multispectral, hyperspectral, and thermal cameras, provide a low-cost and efficient tool for monitoring cotton growth at different stages. The collected information can then be used to improve crop management, such as seedling replanting, weed and disease control, and irrigation. Future research should focus on integrating multi-source data (i.e., environmental, ground, and UAV data) to develop near real-time tools using advanced ML and DL modeling for more effective and accurate cotton emergence, growth, and yield estimation in commercial applications.
References
Johnson RM et al (2002) Variability in cotton fiber yield, fiber quality, and soil properties in a southeastern coastal plain. Agron J 94(6):1305–1316
Zhang J et al (2014) Genetics, breeding, and marker-assisted selection for Verticillium wilt resistance in cotton. Crop Sci 54(4):1289–1303
National Cotton Council of America (2021) Cotton: from field to fabric. Accessed 27 June 2021. https://www.cotton.org/pubs/cottoncounts/fieldtofabric/upload/Cotton-From-Field-to-Fabric-129k-PDF.pdf
Pennock D, McKenzie N, Montanarella L (2015) Status of the world's soil resources. Technical Summary FAO, Rome, Italy
Hatfield JL, Walthall CL (2015) Meeting global food needs: realizing the potential via genetics × environment × management interactions. Agron J 107(4):1215–1226
Holman FH et al (2016) High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Remote Sens 8(12):1031
Zhang C, Kovacs JM (2012) The application of small unmanned aerial systems for precision agriculture: a review. Precision Agric 13(6):693–712
Lindblom J et al (2017) Promoting sustainable intensification in precision agriculture: review of decision support systems development and strategies. Precision Agric 18(3):309–331
Maes WH, Steppe K (2019) Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci 24(2):152–164
Hunt ER et al (2013) A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Int J Appl Earth Obs Geoinf 21:103–112
Bendig J et al (2015) Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int J Appl Earth Obs Geoinf 39:79–87
Sudduth KA et al (2003) Comparison of electromagnetic induction and direct sensing of soil electrical conductivity. Agron J 95(3):472–482
Vories E et al (2020) Comparison of precision and conventional irrigation management of cotton and impact of soil texture. Precis Agric 1–18
Feng A et al (2020) Yield estimation in cotton using UAV-based multi-sensor imagery. Biosys Eng 193:101–114
Turner D et al (2014) Spatial co-registration of ultra-high resolution visible, multispectral and thermal images acquired with a micro-UAV over Antarctic moss beds. Remote Sens 6(5):4003–4024
Yang G et al (2017) Unmanned aerial vehicle remote sensing for field-based crop phenotyping: current status and perspectives. Front Plant Sci 8:1111
Feng A et al (2020) Evaluation of cotton emergence using UAV-based narrow-band spectral imagery with customized image alignment and stitching algorithms. Remote Sens 12(11):1764
Xie C, Yang C (2020) A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput Elect Agricult 178:105731
Kolláth Z et al (2020) Introducing the dark sky unit for multi-spectral measurement of the night sky quality with commercial digital cameras. J Quant Spect Rad Transfer, 253
Chen Y, Mei X, Liu J (2015) Cotton growth monitoring and yield estimation based on assimilation of remote sensing data and crop growth model. In Geoinformatics, 2015 23rd International Conference on. IEEE. pp 1–4
Huang Y et al (2016) Cotton yield estimation using very high-resolution digital images acquired with a low-cost small unmanned aerial vehicle. Trans ASABE 59(6):1563–1574
Feng A et al (2020) Evaluation of cotton emergence using UAV-based imagery and deep learning. Comput Elect Agricul 177:105711
Hunt ER et al (2011) NIR-green-blue high-resolution digital images for assessment of winter cover crop biomass. GIScience & Remote Sens 48(1):86–98
Liu S et al (2017) A method to estimate plant density and plant spacing heterogeneity: application to wheat crops. Plant Methods 13(1):38
Steduto P et al (2012) Crop yield response to water, vol 1028. FAO Rome
Walton JT, Nowak DJ, Greenfield EJ (2008) Assessing urban forest canopy cover using airborne or satellite imagery. Arboricul Urban Forestry 38(6):334–340
Chang A et al (2017) Crop height monitoring with digital imagery from unmanned aerial system (UAS). Comput Electron Agric 141:232–237
Malambo L et al (2018) Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery. Int J Appl Earth Obs Geoinf 64:31–42
Sharma K, Singhal A (2016) New algorithm for flower yield estimation and yield comparison using Matlab. Imp J Interdisc Res 2(8)
Friedman JM, Hunt ER, Mutters RG (2016) Assessment of leaf color chart observations for estimating maize chlorophyll content by analysis of digital photographs. Agron J 108(2):822–829
Reyes JF, Correa C, Zúñiga J (2017) Reliability of different color spaces to estimate nitrogen SPAD values in maize. Comput Electron Agric 143:14–22
Schwarz MW, Cowan WB, Beatty JC (1987) An experimental comparison of RGB, YIQ, LAB, HSV, and opponent color models. ACM Trans Graph (TOG) 6(2):123–158
Dalezios N et al (2001) Cotton yield estimation based on NOAA/AVHRR produced NDVI. Phys Chem Earth Part B 26(3):247–251
Moges S et al (2005) Evaluation of green, red, and near infrared bands for predicting winter wheat biomass, nitrogen uptake, and final grain yield. J Plant Nutr 27(8):1431–1441
Ren J et al (2008) Regional yield estimation for winter wheat with MODIS-NDVI data in Shandong, China. Int J Appl Earth Obs Geoinf 10(4):403–413
Barnhart I et al (2019) Evaluating sorghum senescence patterns using small unmanned aerial vehicles and multispectral imaging. Kansas Field Res 2019:166
Thompson CN et al (2019) Using normalized difference red edge index to assess maturity in cotton. Crop Sci 59(5):2167–2177
Ludovisi R et al (2017) UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought. Front Plant Sci 8:1681
Rischbeck P et al (2016) Data fusion of spectral, thermal and canopy height parameters for improved yield prediction of drought stressed spring barley. Eur J Agron 78:44–59
Alganci U et al (2014) Estimating maize and cotton yield in southeastern Turkey with integrated use of satellite images, meteorological data and digital photographs. Field Crop Res 157:8–19
Jones HG (2004) Application of thermal imaging and infrared sensing in plant physiology and ecophysiology. In: Advances in Botanical Research. Academic Press, pp 107–163
Alchanatis V et al (2010) Evaluation of different approaches for estimating and mapping crop water status in cotton with thermal imaging. Precision Agric 11(1):27–41
Bajwa SG, Vories ED (2007) Spatial analysis of cotton (Gossypium hirsutum L.) canopy responses to irrigation in a moderately humid area. Irrig Sci 25(4):429–441
Bian J et al (2019) Simplified evaluation of cotton water stress using high resolution unmanned aerial vehicle thermal imagery. Remote Sens 11(3):267
Oosterhuis DM (1990) Growth and development of a cotton plant. Nitrogen nutrition of cotton: practical issues, pp 1–24
Ritchie GL et al (2007) Cotton growth and development. Cooperative Extension Service and the University of Georgia College of Agricultural and Environmental Sciences: Athens, GA, USA
Main CL (2012) W287 cotton growth and development. The University of Tennessee, Institute of Agriculture
Reinbott DL, Stevens G (2021) Cotton tillage and planting guidelines. https://extension.missouri.edu/publications/g4270. Accessed 27 June 2021
Celik A, Ozturk I, Way T (2007) Effects of various planters on emergence and seed distribution uniformity of sunflower. Appl Eng Agric 23(1):57–61
Sansone C et al (2002) Texas cotton production: emphasizing integrated pest management. Texas Cooperative Extension Service, Texas A&M University System, Texas, USA
Supak J (1990) Making replant decisions. In: 1990 Beltwide Cotton Production Conference. National Cotton Council of America, pp 45–48
Goodell PB et al (2015) UC IPM pest management guidelines cotton. Oakland, CA
Vories E et al (2007) Estimating the impact of delaying irrigation for Midsouth cotton on clay soil. Trans ASABE 50(3):929–937
Vories E, Jones A (2016) Influence of irrigation patterns on the effectiveness of furrow irrigation of cotton. In: World Environmental and Water Resources Congress 2016
Vories ED et al (2015) Impact of soil variability on irrigated and rainfed cotton. J Cotton Sci 19(1):1–14
Forcella F et al (2000) Modeling seedling emergence. Field Crop Res 67(2):123–139
Ghassemi-Golezani K, Dalil B (2014) Effects of seed vigor on growth and grain yield of maize. Plant Breed Seed Sci 70(1):81–90
Wiles LJ, Schweizer EE (1999) The cost of counting and identifying weed seeds and seedlings. Weed Sci 47(6):667–673
Chen R et al (2018) Monitoring cotton (Gossypium hirsutum L.) germination using ultrahigh-resolution UAS images. Precis Agric 19(1):161–177
Oh S et al (2020) Plant counting of cotton from UAS imagery using deep learning-based object detection framework. Remote Sens 12(18):2981
Lin Z, Guo W (2021) Cotton stand counting from unmanned aerial system imagery using mobilenet and centernet deep learning models. Remote Sens 13(14):2822
Huete AR (1988) A soil-adjusted vegetation index (SAVI). Remote Sens Environ 25(3):295–309
Zhao B et al (2018) Rapeseed seedling stand counting and seeding performance evaluation at two early growth stages based on unmanned aerial vehicle imagery. Front Plant Sci 9
Adão T et al (2017) Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens 9(11):1110
Cobb JN et al (2013) Next-generation phenotyping: requirements and strategies for enhancing our understanding of genotype–phenotype relationships and its relevance to crop improvement. Theor Appl Genet 126(4):867–887
Perry C et al (2012) Cotton irrigation management for humid regions. Cotton Incorporated, Cary, NC
Loka D (2012) Effect of water-deficit stress on cotton during reproductive development
Allen RG et al (1998) Crop evapotranspiration: guidelines for computing crop water requirements. FAO Irrigation and Drainage Paper 56. FAO, Rome 300(9):D05109
Khan A et al (2018) Coping with drought: stress and adaptive mechanisms, and management through cultural and molecular alternatives in cotton as vital constituents for plant stress resilience and fitness. Biol Res 51(1):47
Ballester C et al (2019) Monitoring the effects of water stress in cotton using the green red vegetation index and red edge ratio. Remote Sens 11(7):873
Sullivan D et al (2007) Evaluating the sensitivity of an unmanned thermal infrared aerial system to detect water stress in a cotton canopy. Trans ASABE 50(6):1963–1969
Chang A et al (2020) Measurement of cotton canopy temperature using radiometric thermal sensor mounted on the unmanned aerial vehicle (UAV). J Sens 2020
Zhou Z et al (2021) Assessment for crop water stress with infrared thermal imagery in precision agriculture: a review and future prospects for deep learning applications. Comput Electron Agric 182:106019
Chen S et al (2020) Retrieval of cotton plant water content by UAV-based vegetation supply water index (VSWI). Int J Remote Sens 41(11):4389–4407
Nahiyoon SA et al (2020) Biocidal radiuses of cycloxaprid, imidacloprid and lambda-cyhalothrin droplets controlling against cotton aphid (Aphis gossypii) using an unmanned aerial vehicle. Pest Manag Sci 76(9):3020–3029
Kaivosoja J et al (2021) Reference measurements in developing UAV systems for detecting pests, weeds, and diseases. Remote Sens 13(7):1238
Meng Y et al (2019) Harvest aids efficacy applied by unmanned aerial vehicles on cotton crop. Ind Crops Prod 140:111645
Yi L et al (2019) Exploring the potential of UAV imagery for variable rate spraying in cotton defoliation application. Int J Precis Agric Aviat 2(1)
Xin F et al (2018) Effects of dosage and spraying volume on cotton defoliants efficacy: a case study based on application of unmanned aerial vehicles. Agronomy 8(6):85
Lou Z et al (2018) Effect of unmanned aerial vehicle flight height on droplet distribution, drift and control of cotton aphids and spider mites. Agronomy 8(9):187
Yadav PK et al (2019) Assessment of different image enhancement and classification techniques in detection of volunteer cotton using UAV remote sensing. In: Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV. International Society for Optics and Photonics
Puig E et al (2015) New developments in UAV remote sensing for pest management and the implications for cotton
Huang H et al (2018) A two-stage classification approach for the detection of spider mite-infested cotton using UAV multispectral imagery. Remote Sens Lett 9(10):933–941
Xu R et al (2018) Aerial images and convolutional neural network for cotton bloom detection. Front Plant Sci 8:2235
De Castro AI et al (2018) An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens 10(2):285
Sapkota B et al (2020) Mapping and estimating weeds in cotton using unmanned aerial systems-borne imagery. AgriEngineering 2(2):350–366
Wang T et al (2020) Automatic classification of cotton root rot disease based on UAV remote sensing. Remote Sens 12(8):1310
Xavier TW et al (2019) Identification of Ramularia leaf blight cotton disease infection levels by multispectral, multiscale UAV imagery. Drones 3(2):33
Wang T et al (2020) A plant-by-plant method to identify and treat cotton root rot based on UAV remote sensing. Remote Sens 12(15):2453
Thomasson J et al (2021) High-precision fungicide application for cotton disease based on UAV remote sensing. Precision agriculture’21. Wageningen Academic Publishers, pp 849–858
Deng X et al (2020) Detection and location of dead trees with pine wilt disease based on deep learning and UAV remote sensing. AgriEngineering 2(2):294–307
Gao Z et al (2020) Deep learning application in plant stress imaging: a review. AgriEngineering 2(3):430–446
Onder S et al (2005) Different irrigation methods and water stress effects on potato yield and yield components. Agric Water Manag 73(1):73–86
Bell JM et al (2018) Deficit irrigation effects on yield and yield components of grain sorghum. Agric Water Manag 203:289–296
Cuong TX et al (2017) Effects of silicon-based fertilizer on growth, yield and nutrient uptake of rice in tropical zone of Vietnam. Rice Sci 24(5):283–290
Schut AG et al (2018) Assessing yield and fertilizer response in heterogeneous smallholder fields with UAVs and satellites. Field Crop Res 221:98–107
Beres BL et al (2020) Towards a better understanding of Genotype× Environment× Management interactions–a global Wheat Initiative agronomic research strategy. Front Plant Sci 11:828
Feng A (2021) Quantifying the effect of environments on crop emergence, development and yield using sensing and deep learning techniques. University of Missouri, Columbia
Komm B, Moyer M (2015) Vineyard yield estimation
Feng A et al (2019) Cotton yield estimation from UAV-based plant height. Trans ASABE 62(2):393–404
Yeom J et al (2018) Automated open cotton boll detection for yield estimation using unmanned aircraft vehicle (UAV) data. Remote Sens 10(12):1895
Xu W et al (2020) Establishing a model to predict the single boll weight of cotton in northern Xinjiang by using high resolution UAV remote sensing data. Comput Electron Agric 179:105762
Xu W et al (2021) Cotton yield estimation model based on machine learning using time series UAV remote sensing data. Int J Appl Earth Obs Geoinf 104:102511
Maimaitijiang M et al (2020) Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens Environ 237:111599
Yang Q et al (2019) Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crop Res 235:142–153
Zhang M et al (2020) Estimation of maize yield and effects of variable-rate nitrogen application using UAV-based RGB imagery. Biosys Eng 189:24–35
Ashapure A et al (2020) Developing a machine learning based cotton yield estimation framework using multi-temporal UAS data. ISPRS J Photogramm Remote Sens 169:180–194
Schwalbert RA et al (2020) Satellite-based soybean yield forecast: integrating machine learning and weather data for improving crop yield prediction in southern Brazil. Agric For Meteorol 284:107886
Chu Z, Yu J (2020) An end-to-end model for rice yield prediction using deep learning fusion. Comput Electron Agric 174:105471
Khaki S, Wang L (2019) Crop yield prediction using deep neural networks. Front Plant Sci 10
Khaki S, Wang L, Archontoulis SV (2020) A cnn-rnn framework for crop yield prediction. Front Plant Sci 10:1750
Lipton ZC, Berkowitz J, Elkan C (2015) A critical review of recurrent neural networks for sequence learning. arXiv preprint arXiv:1506.00019
Lee SH et al (2020) Attention-based recurrent neural network for plant disease classification. Front Plant Sci 11
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Feng, A., Vong, C.N., Zhou, J. (2022). Unmanned Aerial Vehicle (UAV) Applications in Cotton Production. In: Zhang, Z., Liu, H., Yang, C., Ampatzidis, Y., Zhou, J., Jiang, Y. (eds) Unmanned Aerial Systems in Precision Agriculture. Smart Agriculture, vol 2. Springer, Singapore. https://doi.org/10.1007/978-981-19-2027-1_3
Print ISBN: 978-981-19-2026-4
Online ISBN: 978-981-19-2027-1