Abstract
Precision agriculture (PA) is the application of geospatial techniques and sensors (e.g., geographic information systems, remote sensing, GPS) to identify variations in the field and to deal with them using alternative strategies. In particular, high-resolution satellite imagery is now more commonly used to study these variations for crop and soil conditions. However, the limited availability and often prohibitive cost of such imagery suggest the need for an alternative product for this particular application in PA. Specifically, images taken by low altitude remote sensing platforms, or small unmanned aerial systems (UAS), are shown to be a potential alternative given their low cost of operation in environmental monitoring, high spatial and temporal resolution, and their high flexibility in image acquisition programming. Not surprisingly, there have been several recent studies on the application of UAS imagery for PA. The results of these studies indicate that, to provide a reliable end product to farmers, advances in platform design, production, standardization of image georeferencing and mosaicing, and information extraction workflow are required. Moreover, it is suggested that such endeavors should involve the farmer, particularly in the processes of field design, image acquisition, image interpretation and analysis.
Introduction
Although an essential element of our livelihoods, agricultural production can adversely impact the surrounding environment, especially with regard to water quality. One unique characteristic of this industry is that it provides biological products which are extremely sensitive to both environmental conditions and management practices (Swain et al. 2007). It is thus critical that farmers know, in a timely fashion, where variations exist in their fields so as to adjust their practices accordingly. Consequently, any techniques that facilitate the stabilization or the increase of agricultural production while mediating the impacts of this activity on the environment will benefit society. For example, sensors capable of detecting field variability, such as variable rate (application) technology (VRT), and grain yield monitors have been used in combination with high position accuracy GPS to deal with these issues (Stafford 2000; Price 2004; Robertson et al. 2007). Such systems come under the umbrella terms precision agriculture (PA), precision farming, site-specific crop management or even site-specific farming. These terms suggest that agricultural management can be practiced with a degree of precision. Consequently, PA provides an alternative and realistic means to reduce and optimize the use of potentially harmful compounds and thus can promote a healthier environment for humans. Although beneficial, there are some key issues in PA which need to be addressed, including the lack of appropriate decision support systems and insufficient recognition of temporal variation and environmental auditing (McBratney et al. 2005).
With the development of new sensing technologies and the ever increasing applications of PA, agricultural equipment of increasing number and complexity is being developed. Currently, VRT is the most advanced area of PA. However, the operational success of VRT requires accurate maps of crop growth, weeds, insect infestations, crop nutrient deficiencies, and other crop and soil conditions (Moran et al. 1997). PA practitioners use these data to match nutrient supply (and other operations) to the spatial variability of crop plants (Cook and Bramley 1998; Robertson et al. 2007). As such, short revisit times are a necessary component allowing a prompt response to unfavorable crop or field conditions. With the current trend towards increasing farm size, it is also becoming more costly to acquire larger datasets to assess field variability (Quilter 1997). Satellite images and aerial photos collected during the growing season have been used to monitor crop growth, crop stress, and to predict crop yield (Stafford 2000; Warren and Metternicht 2005), but so far their use has been limited by poor revisit times and/or coarse spatial resolutions (Moran et al. 1997; Stafford 2000). Moreover, the use of manned airborne platforms is limited by high operational complexity, costs and lengthy delivery of products (Moran et al. 1997; Zhang et al. 2006; Berni et al. 2009a; Rango et al. 2009). With recent technological advances in aerospace engineering, the Low Altitude Remote Sensing (LARS) system, a relatively new concept of acquiring Earth surface images at a low altitude using unmanned aerial systems (UAS) (Swain et al. 2007), is being promoted as an alternative platform for this purpose. The ultra-high spatial resolution (e.g. centimeters), relatively low operational costs, and the near real-time image acquisition (Hunt et al. 2005; Lelong et al. 2008; Nebiker et al. 2008; Rango et al. 2009; Hardin and Hardin 2010; Xiang and Tian 2011) indicate that these platforms are ideal tools for mapping and monitoring in PA. The purpose of this review is to provide insight on these particular devices by defining the application of remote sensing in PA and by exploring the current uses and limitations of UAS in environmental monitoring and PA, including platform and camera restrictions, UAS data processing issues, farmer participation, and aviation regulations. In addition, potential advances and alternative uses of UAS are discussed, which may help facilitate their use and attract farmers’ interest in their application within PA.
Remote sensing and PA
The general stages of PA practice are data collection, field variability mapping, decision making, and finally management practice. It is suggested that remote sensing could be involved in the first three of these stages (Stafford 2000; Warren and Metternicht 2005). In particular, up-to-date images/maps are critical during the decision making process, and field variability can thus be mapped using remotely sensed imagery.
Environmental remote sensing essentially makes use of radiant energy to extract information on ground features over large swath areas within a short period of time. Remote sensing, along with other geospatial techniques, has been applied in agriculture since the 1950s (Colwell 1956). However, the earlier use of aerial photographs in the 1930s by the Agricultural Adjustment Administration of the United States for precise measurements of cropland has also been documented (Monmonier 2002). The basic underlying premise of remote sensing applications in PA is that differences in crop growth and soil condition can be detected through variations within the spectral responses (Warren and Metternicht 2005). Specifically, changes in remotely sensed reflectance can be detected before symptoms such as fungal and mildew leaf infections become visible to the human eye (Lorenzen and Jensen 1989; Malthus and Madeira 1993).
Common remote sensing platforms include satellites, airplanes, balloons and helicopters, and a variety of sensors such as optical and near infrared sensors and RADAR (RAdio Detection And Ranging) are installed on these platforms for remote sensing applications. Diagnostic information derived from images collected from these on-board sensors, such as biomass, Leaf Area Index (LAI), disease, water stress and lodging, can thus assist in crop management, yield forecasting, and environmental protection. General agricultural applications using remote sensing include soil properties monitoring and mapping (e.g., De Tar et al. 2008; Gomez et al. 2008), crop species classification (e.g., Rao et al. 2007; Rao et al. 2008), crop pest management (e.g., Lan et al. 2009), plant water stress detection (e.g., Lelong et al. 1998; Erickson et al. 2004), leaf chemical content analysis (e.g., Wu et al. 2007; Wu et al. 2008) and weed control monitoring (e.g., Lamb and Brown 2001; Scotford and Miller 2005; Gutierrez et al. 2008). Many crop species have been monitored using remote sensing techniques including canola, corn, cotton, sorghum, and wheat (e.g., Lelong et al. 1998; Yang et al. 2001; Seelan et al. 2003; Warren and Metternicht 2005; Zhao et al. 2007). Aerial photography has been used in yield mapping to identify field variation (Godwin et al. 2003), but most images used in these studies are either airborne hyperspectral (e.g., Godwin et al. 2003; De Tar et al. 2008; Lelong et al. 1998), satellite hyperspectral (e.g., Rao et al. 2008), or satellite multi-spectral (e.g., Gomez et al. 2008; Clevers 1988; Du et al. 2008). In addition, close range hyperspectral techniques have also been used to examine field canopy spectral responses (e.g., Zhao et al. 2007; Gomez et al. 2008; Rao et al. 2008; Gomez-Casero et al. 2010). To date, the dominant remotely collected images for PA have been optical.
However, synthetic aperture radar (SAR) images have recently been shown to be useful in crop-type mapping and condition assessment, soil tillage and crop residue mapping, soil moisture estimation, yield prediction, and zone management (McNairn and Brisco 2004). The limited application of SAR images in agriculture is most likely due to the costs, timing, and interpretation of these data (McNairn and Brisco 2004).
In comparison to satellite imagery, UAS acquired images typically have higher temporal (e.g. daily acquisitions) and higher spatial resolutions (e.g. centimeters), which makes it necessary to examine the application of high resolution images in PA. With regard to high resolution imagery there have been many studies examining crop conditions, particularly those involving LAI. LAI is the fundamental canopy parameter in two basic physiological processes, photosynthesis and evapotranspiration, which are most dependent on solar radiation (Warren and Metternicht 2005). The majority of crop growth and crop yield models require an estimate of LAI of the crop canopies. LAI is a basic parameter that can link multi-spectral remote sensing to crop growth and condition for biological measurements (e.g., Wu et al. 2007; Lopez-Lozano et al. 2009). Other estimated measurements using remote sensing data include surface soil properties (e.g., Sullivan et al. 2005), water stress (Zarco-Tejada et al. 2012), vegetation cover (e.g., Laliberte et al. 2005), nitrogen content (e.g., Beeri et al. 2005; Zhang et al. 2006; Shou et al. 2007; Bausch and Khosla 2010), crop height (e.g., Donoghue et al. 2006), above ground biomass (e.g., Beeri et al. 2005), crop yield (e.g., Enclona et al. 2004), weed extent (e.g., Gomez-Casero et al. 2010), and crop species (e.g., Pena-Barragan et al. 2008; Castillejo-Gonzalez et al. 2009). These data have also been used to monitor other biological parameters such as leaf chlorophyll content (or concentration), and leaf nitrogen content (or concentration) over time (Stafford 2000; Berni et al. 2009a, b). Many of the aforementioned parameters are determined using vegetation indices, which are amongst the most used tools in agricultural remote sensing.
In particular, vegetation indices such as the Normalized Difference Vegetation Index (NDVI), the Soil Adjusted Vegetation Index (SAVI), and the G(reen)NDVI, have been used extensively to study various biological parameters quantitatively (Hunt 2005; Swain et al. 2007; Lelong et al. 2008). However, a common problem encountered with the use of these indices in agricultural studies is saturation at high LAI (Haboudane et al. 2004).
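For illustration, these indices are simple band-arithmetic computations applied per pixel. A minimal sketch in Python with NumPy, using the standard definitions of NDVI, SAVI (with the common soil-brightness factor L = 0.5) and GNDVI (function names are illustrative, not drawn from the cited studies):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index; L is the soil-brightness correction factor."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + L) * (1.0 + L)

def gndvi(nir, green):
    """Green NDVI: substitutes the green band for the red band."""
    nir, green = np.asarray(nir, dtype=float), np.asarray(green, dtype=float)
    return (nir - green) / (nir + green)
```

Because each index is a ratio of band sums and differences, the functions apply equally to single reflectance values or to whole image bands passed as arrays.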
Applications of remote sensing in general have been shown to be beneficial and profitable (Godwin et al. 2003; Seelan et al. 2003; Tenkorang and DeBoer 2007), but current applications in PA are still limited. According to a survey conducted in the USA in 2009, the adoption of satellite imagery and aerial photos in PA increased from 16.1 to 30.3 % between 2004 and 2009 (Whipker and Akridge 2009). However, the number of individuals employing remote sensing within PA technologies remains quite small (Warren and Metternicht 2005; Whipker and Akridge 2009). The limitations identified for applications of remote sensing systems in farm management include: the collection and delivery of images in a timely manner; the lack of high spatial resolution data; image interpretation and data extraction issues; and the integration of these data with agronomic data into expert systems (Jackson 1984; Du et al. 2008). For example, some particular farm management decisions such as weed detection and weed management require very high spatial resolution imagery, often in the order of centimeters (Robert 1996). Unfortunately, the remote sensing data most readily available, the medium resolution satellite imagery (e.g., Landsat TM, ASTER, SPOT 5), are only useful for large scale studies. Moreover, even the newest higher resolution satellite imagery (e.g., WorldView-2 and GeoEye-1), with revisit periods limited to 1–2 days, cannot provide sufficiently frequent data for emergent situations (e.g., nutrient stress monitoring, disease). Weather conditions are also a critical impediment for satellite image acquisitions since the growing season is also typically the rainy season. In addition to the actual sensor limitations, remote sensing applications in agriculture are also limited by interpretation and analysis issues that result when dealing with very large amounts of digital data (Murakami et al. 2007).
Moreover, there has been a perceived lack of usefulness in mapping growing crops and relatively few reliable remote sensing analysts or consulting firms who can deal with these products (Griffin et al. 2004). Finally, it is suggested (Griffin et al. 2004) that one main reason for the low adoption rate of remote sensing images in PA is the unavailability of reliable economic estimates of return from the application of remotely sensed images. It is believed by some (Stafford 2000; Lamb et al. 2008) that the costs, availability, flexibility, and the processing of remotely sensed imagery from satellites have made their applications prohibitive and thus impractical.
Small UAS and environmental/PA studies
As indicated, the costs and availability of high resolution satellite imagery often limit their applications in PA (Wu et al. 2007). Consequently, UAS could be an inexpensive and more practical substitute for satellite and general aviation aircraft for high resolution remotely sensed data. Moreover, UAS are immediately accessible as a tool for remote sensing scientists and farmers (Swain et al. 2010). In recent years, small commercial UAS (<50 kg) (Laliberte and Rango 2011) have become available for environmental and agricultural applications. Since perhaps the first reported case of UAS civilian application (Tomlins 1983), fixed wing aircraft, helicopters and the most recent generation of quadrocopter micro UAS with vertical take-off and landing (VTOL) capability have been shown to be useful instruments in local agricultural applications (Nebiker et al. 2008; Rango and Laliberte 2010).
The small UAS belong to LARS, an ever popular set of tools for environmental monitoring. The flexibility in acquisition times and the much lower costs for image extraction could possibly result in demand for the small UAS industry exceeding that for the traditional manned aircraft industry. Images captured using UAS generally have a spatial resolution of centimeters, and the acquisition of the images is manageable and not as influenced by cloud cover. Therefore, UAS acquired images could be a practical alternative to aerial photos and high resolution satellite images. Although there have not been many case studies of UAS in PA, examples of applications of UAS in environmental studies are becoming increasingly more apparent in the literature. Such studies may provide insight as to how applicable these instruments may be for various PA endeavors.
The use of small UAS has flourished over the last decade with various LARS platforms being used specifically to monitor environmental change. There is currently a plethora of these instruments available at various costs. Some of the more inexpensive LARS platforms used to monitor environmental conditions include blimps (Inoue et al. 2000; Vericat et al. 2008), balloons (Amoroso and Arrowsmith 2000; Seang and Mund 2006), and kites (Aber et al. 2002; Wundram and Loffler 2007; Aber et al. 2009). Although relatively cheap, the limited maneuverability of these platforms makes crop condition monitoring challenging. Consequently, the most popular LARS platform for environmental studies has been the small UAS, which are also known as or include unmanned aerial vehicles (UAV), remotely piloted vehicles (RPV), remotely operated aircraft (ROA), and remotely controlled helicopters (RC-Helicopter) (Eisenbeiss 2004). In recent years, there have been more applications using small unmanned aircraft (e.g., Hunt et al. 2003; Eisenbeiss 2004; Hunt 2005; Johnson et al. 2004; MacArthur et al. 2005; Laliberte et al. 2007; Huang et al. 2008; Laliberte et al. 2010), unmanned helicopters (e.g., Sugiura et al. 2002; Sugiura et al. 2004; Swain et al. 2007; Berni et al. 2009a, b; Swain et al. 2010), powered gliders (Lelong et al. 2008), powered parachutes (Lelong et al. 2008; Aber et al. 2010) and quadrocopters (i.e. quadrotor helicopters) (e.g., Kendoul et al. 2007; Nebiker et al. 2008; Primicerio et al. 2012) (Fig. 1). These platforms include assembled units from off-the-shelf parts (e.g., Hardin and Jackson 2005; Xiang and Tian 2011) or complete commercially available ones (e.g., Eisenbeiss 2004; Laliberte and Rango 2011).
In addition to the variety of platforms available, there are numerous types of remotely sensed sensors that can be used for the actual data collection. Sensors for image capture include film cameras (e.g., Tomlins 1983; Quilter 1997; Aber et al. 2002; Hunt et al. 2003), off-the-shelf commercial grade (non-metric) digital cameras (e.g., Amoroso and Arrowsmith 2000; Inoue et al. 2000; Quilter and Anderson 2000; Eisenbeiss 2004; Chandler et al. 2005; Seang and Mund 2006; Lewis 2007; Lelong et al. 2008; Nebiker et al. 2008) and even modified digital cameras with a near infrared band (e.g., Hunt et al. 2008). Specific UAV cameras have also been developed and include the ADC multi-spectral camera (e.g., Swain et al. 2007; Xiang and Tian 2011; Primicerio et al. 2012) and the MCA mapping camera (e.g., Berni et al. 2009a, b; Gomez-Candon et al. 2011).
To obtain images for further georeferencing and/or mosaicing, or to obtain images for predefined points, GPS/INS data onboard the UAS and a ground control station with a flight planning system are needed (Eisenbeiss 2004; Laliberte and Rango 2011; Xiang and Tian 2011; Primicerio et al. 2012). Images captured can thus be transmitted (i.e. down-linked) to the ground station or stored in the unit’s sensor memory until the vehicle has landed. There are also navigation and flight control components for the newer generation of UAS (Eisenbeiss 2004; Laliberte and Rango 2011; Xiang and Tian 2011). The navigation component is used to control the flight path of the UAS and to monitor and/or correct the flight status (position and orientation) of the platform. The flight control component is used to maintain the stability of the platform in order to make sure that the position of the platform is optimal for image acquisitions (Hardin and Jensen 2011; Xiang and Tian 2011). The ground station provides a user interface that incorporates flight planning, flight control and/or image acquisition (Laliberte and Rango 2011; Xiang and Tian 2011) (Fig. 2).
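To make the flight planning step concrete, the core of such a system is the generation of a waypoint list that covers the field at a line spacing determined by the camera footprint and the desired image sidelap. The following is a minimal, hypothetical sketch (the function, its parameters and the flat-earth conversion are illustrative simplifications, not taken from any of the systems cited above):

```python
import math

def grid_waypoints(origin_lat, origin_lon, width_m, height_m, swath_m, overlap=0.3):
    """Generate a lawnmower-pattern waypoint list covering a rectangular field.

    origin_lat/origin_lon: south-west corner of the field (decimal degrees).
    width_m/height_m: field extent east-west and north-south, in metres.
    swath_m: ground footprint width of one image; overlap: desired sidelap fraction.
    Uses a flat-earth approximation, adequate at field scale.
    """
    spacing = swath_m * (1.0 - overlap)         # distance between flight lines
    m_per_deg_lat = 111_320.0                   # approximate metres per degree latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(origin_lat))
    n_lines = int(math.ceil(width_m / spacing)) + 1
    waypoints = []
    for i in range(n_lines):
        x = min(i * spacing, width_m)
        # alternate flight direction on each line (lawnmower pattern)
        ys = (0.0, height_m) if i % 2 == 0 else (height_m, 0.0)
        for y in ys:
            waypoints.append((origin_lat + y / m_per_deg_lat,
                              origin_lon + x / m_per_deg_lon))
    return waypoints
```

A real planner would additionally account for along-track (forward) overlap, wind, turn radius and altitude, but the line-spacing logic above captures why sidelap requirements directly drive the number of flight lines and hence flight time.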
To date, UAS acquired images have been successfully employed for estimating the degree and extent of shrub utilization (Quilter and Anderson 2001), for mapping grass species (Hardin and Jackson 2005), for mapping forest fires (Hinkley and Zajkowski 2011), for measuring shrub biomass (Quilter and Anderson 2001; Hardin and Jackson 2005), for aiding in vineyard management (Primicerio et al. 2012) and for mapping rangeland vegetation (Laliberte and Rango 2009). With regard to agriculture, they have been used for detecting small weed patches in rangelands (Hardin et al. 2007), for documenting water stress in crops (Berni et al. 2009a), for monitoring crop biomass (Hunt 2005; Swain et al. 2010), for mapping vineyard vigour (Primicerio et al. 2012) and for examining the results of various nitrogen treatments on crops (Hunt 2005; Swain et al. 2007). UAS have also been used to assess irrigation systems at the field scale (Fig. 3). The crop types examined using UAS-collected data include rice (Swain et al. 2007; Swain et al. 2010), wheat (Hunt et al. 2010), corn (Hunt 2005), turf grass (Xiang and Tian 2011), and even coffee (Johnson et al. 2004). Furthermore, UAS have also been used to sample pollen (Aylor et al. 2006), spores (Maldonado-Ramirez et al. 2005), and agricultural disease agents (Schmale et al. 2008). Unfortunately, unlike satellite remote sensing, there have been a minimal number of quantitative studies on crop growth based on UAS acquired imagery (Hunt et al. 2005; Hunt 2005; Lelong et al. 2008; Berni et al. 2009a, b; Hunt et al. 2010). These studies examined relationships of vegetation indices derived from UAS-based sensors with corn nitrogen status and soybean biomass (Hunt 2005), various rice nitrogen treatments (Swain et al. 2007) and with wheat LAI (Lelong et al. 2008).
Limitations of UAS for PA applications
It has been suggested that successful applications of UAS supported image capture could shorten the time frame needed for agricultural practice adjustment and that the results of this remote sensing monitoring could exceed those from traditional control treatments (Beeri and Peled 2009). Although there is already a variety of UAS applications, problems still exist in platform reliability, sensor capability, image processing and final product dissemination.
Platform and cameras
There are well known technical problems for small UAS such as engine power, short flight duration, difficulties in maintaining flight altitude, aircraft stability and maneuverability in winds and turbulence (Laliberte et al. 2007; Hardin and Hardin 2010). Reliability is yet another concern for UAS applications (Hardin and Hardin 2010; Hardin and Jensen 2011). The principal concerns in UAS reliability are platform failures due to inadequate building materials and engine breakdown (Hunt et al. 2003; Jones et al. 2006; Hardin and Hardin 2010). For a detailed technical explanation of such limitations for small UAS platforms the reader is referred to Hardin and Jensen (2011).
Another key issue which restricts sensor selection is UAS payload weight. Payload weight limitations for UAS are generally 20–30 % of the total weight of the system (Nebiker et al. 2008), which greatly influences the sensor system that can be attached. Consequently, several consumer (non-metric) cameras have been tested for crop monitoring due to their low weights and low costs (e.g., Lelong et al. 2008). However, there are many known problems with these cameras including limited optical quality, zoom lenses, fully automatic focusing (Nebiker et al. 2008) and, most importantly for vegetation surveys, the lack of a near infrared (NIR) band. Practical multi-spectral cameras specific for small UAS that have the NIR band, such as the mini MCA and ADC lite (Tetracam, Inc., Chatsworth, CA, United States of America), are currently quite expensive.
Depending on the platform and sensor system selected, the cost of a combined system can be extremely prohibitive for UAS PA applications. The cost of a UAS equipped with a GPS can range up to US$100,000 (Rango et al. 2009) and that of a UAS multi-spectral camera upwards of US$15,000. Moreover, high repair costs and pilot aids can further compound the costs associated with such platforms (Hardin and Hardin 2010). Compared with an acceptable cost of $8,000 (Lewis 2007) or $100 per flight (Jones et al. 2006), the current costs for such systems can be intimidating to many environmental scientists (Hardin and Hardin 2010), not to mention PA practitioners. However, it is anticipated that a sudden drop in UAS prices may occur in the near future given increasing competition between UAS producers and the increasing civilian applications of UAS (Eugster and Nebiker 2007).
UAS image processing
UAS acquired images typically possess higher radiometric homogeneity than either aircraft or satellite imagery due to the low altitude of acquisition (Lelong et al. 2008). However, UAS also present unique image quality problems. For example, the light weight of many UAS platforms means a less stable camera position, resulting in different spatial resolutions and/or different viewing angles from one image to another along the same flight plan (Lelong et al. 2008). The low flying altitude of these craft can also result in severe geometric distortion (Lelong et al. 2008; Xiang and Tian 2011). Moreover, the low altitude also results in a larger number of UAV images for each field. There are also technical problems with imagery captured by UAS-borne cameras, including blur from camera forward-image motion. To compensate, oversampling is commonly employed, but this results in increased data volume (Aber et al. 2010).
With the large number of images captured, image mosaicing is a necessary preprocessing step. Although there have been successful cases using manual geometric correction (e.g., Hardin et al. 2007; Vericat et al. 2008), this might not be feasible for applications in PA where larger areas need to be monitored. With hundreds of images often captured and the need for a fast response by growers, the turnaround time (i.e. time to process and deliver the final products to users) becomes a main concern for these devices. Consequently, automatic processing typically constitutes the bulk of the UAS data processing. An automatic or semi-automatic photogrammetric approach (block file) can better address changes in roll, pitch, and yaw common with UAS (Laliberte and Rango 2011; Xiang and Tian 2011). However, geometric correction and ortho-rectification are also required before the images can be merged because of the small swath area and the instability of the platform. Methods that have been developed to address these issues include: manual georeferencing using ground collected GCPs, photo match, and automatic georeferencing using navigation data along with a camera lens distortion model (Xiang and Tian 2011).
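One common building block of such automatic registration pipelines is estimating the offset between overlapping frames. A minimal sketch using phase correlation is shown below; it assumes adjacent frames differ mainly by a translation, which is a strong simplification of the full photogrammetric block adjustment described above, but illustrates how frame-to-frame alignment can be automated:

```python
import numpy as np

def phase_correlation(ref, img):
    """Estimate the (row, col) shift of img relative to ref via phase correlation.

    Assumes a purely translational offset between overlapping frames; rotation,
    scale and perspective changes would require a full photogrammetric solution.
    """
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(img)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12          # keep only the phase difference
    corr = np.fft.ifft2(cross).real         # peak location encodes the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak coordinates from [0, N) to signed shifts in (-N/2, N/2]
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

In practice, feature-based matching with RANSAC-fitted homographies (as used in photogrammetric block approaches) is more robust to the roll, pitch and yaw variations noted above; phase correlation is shown here only because it is compact and self-contained.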
Furthermore, it is suggested that, for practical purposes, the quality of the images collected from a UAS platform should first be examined so that the most appropriate preprocessing routine can be selected (Berni et al. 2009a). There are many other issues which may need to be addressed including multi-angle effects, bidirectional reflectance distribution function (BRDF) effects, vignetting effects, and impacts from clouds and/or shadows (Lelong et al. 2008; Aber et al. 2010). Although Lelong et al. (2008) proposed methods to deal with vignetting and BRDF effects, more experiments are required in order to best address these issues and to facilitate the next stage of information extraction. However, UAS based remote sensing can be a good way of testing BRDF models for other remote sensing platforms (Aber et al. 2009).
Despite these issues, most UAS studies to date have applied very little image preprocessing or simply used the raw Brightness Value (BV) data directly. This is in part due to the fact that many of these investigations used small consumer grade digital cameras to capture data. Furthermore, only a few studies have converted BV to reflectance prior to the calculation of the vegetation indices (e.g., Hunt 2005; Berni et al. 2009b). Consequently, the difficulties of calibration and even geometric correction may significantly hinder the use of aerial images (Moran et al. 1997). As Hardin and Jensen (2011) pointed out, current image processing procedures were developed for specific applications, and a generalized procedure is lacking.
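One widely used way to perform this BV-to-reflectance conversion is the empirical line method: calibration panels of known reflectance are placed in the scene, a linear model is fitted between their measured BVs and their known reflectances, and the model is applied to the whole band. A minimal sketch (the function name and panel values are illustrative, not taken from the cited studies):

```python
import numpy as np

def empirical_line(bv, panel_bv, panel_refl):
    """Convert raw brightness values (BV) to surface reflectance for one band.

    panel_bv: measured BVs of in-scene calibration panels.
    panel_refl: the panels' known (lab-measured) reflectances.
    Fits reflectance = gain * BV + offset by least squares, then applies it.
    """
    gain, offset = np.polyfit(panel_bv, panel_refl, 1)   # linear fit
    return gain * np.asarray(bv, dtype=float) + offset
```

With only two panels (one dark, one bright) the fit is exact; additional panels make the calibration more robust to sensor non-linearity and panel measurement error. The conversion must be repeated per band and per flight, since illumination changes between acquisitions.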
In summary, problems of UAS image application are generally identical to those for the applications of traditional aerial and satellite images as reviewed by Moran et al. (1997). These problems include: instrument calibration, atmospheric correction, vignetting correction, line-shift correction, band-to-band registration, and frame mosaicing. For most applications, image processing procedures must be automated so that final image products could be delivered in a timely fashion. This is of utmost importance for UAS remote sensing (Hardin and Jensen 2011) given the amount of data collected, the extent of distortions and the turnaround times required.
Farmers’ interests in UAS adaptation
Although PA practice is gaining popularity worldwide (Stafford 2000; Robertson et al. 2007), the use of remote sensing techniques in agriculture is still limited. Currently, the United States and Australia are considered the two countries most engaged in applications of PA (Warren and Metternicht 2005). The increasing use of PA by farmers in the United States has been reported in a recent 2009 survey conducted by the Center for Food and Agricultural Business (Whipker and Akridge 2009). According to this study, GPS guidance systems are the most commonly adopted PA technique, with adoption increasing steadily from 61.1 to 78.6 % between 2003 and 2009. However, the application of other techniques, including mapping (GIS and remote sensing) technology, was shown to be much lower. According to Stafford (2000), the main driver of PA is maximizing profit, and farmers are simply not certain of the economic benefits of investing in satellite imagery and other spatial products. Moreover, skills are needed for spatial analysis (i.e., the production of maps based on integrated data sources), which many farmers lack (Griffin et al. 2004; Price 2004; Fisher et al. 2009). Furthermore, the uncertainty about the reliability of yield maps for crop management further reduces their perceived value (Stafford 2000), even though several on-farm trials have shown benefits of at least $20/ha based on zone management practices (Godwin et al. 2003; Seelan et al. 2003; Robertson et al. 2007). Although there are several studies on the profitability of PA adoption (e.g. Lambert and Lowenberg-Deboer 2000; Silva et al. 2007), there is little mention of the economic benefits of UAS remote sensing to PA (Tenkorang and DeBoer 2007). Consequently, it has been suggested by Tenkorang and DeBoer (2007) that detailed cost/benefit analyses with enough replicates should be conducted in this field of PA.
The farmer’s knowledge of the variation in their fields is also an important factor that influences the perception and adoption of PA technologies (Adrian et al. 2005). It was found that those individuals with more knowledge of the spatial variation in soil properties of rented land were more likely to acknowledge and adopt PA technology (Torbett et al. 2008).
In addition to the lack of information to farmers regarding the use of UAS, previous UAS-based studies used experimental crop fields and simply did not involve PA practitioners. These cropland cultivations were specifically designed to ensure ideal crop conditions for remote sensing experiments. As such, it is currently not possible to assess the feasibility of the use of UAS-acquired imagery by farmers under more realistic conditions. Notably, a previous study using a community-based participation model was shown to be very successful with Landsat and high resolution satellite imagery (Seelan et al. 2003). The absence of such participation in UAS research might indicate that the current research has little to no direct impact on farmers’ potential adoption of geospatial techniques in their practices.
Issues arising from aviation regulations
As pointed out by Hardin and Jensen (2011) and Laliberte and Rango (2011), aviation regulations could be one of the most important impediments to the adoption of UAS in environmental and agricultural applications. In some countries these regulations are quite loose; in Germany, for example, no permission is required for UAS with an overall mass of less than 5 kg flown at least 1.5 km away from residential areas and airfields (Aber et al. 2010). However, in the USA a Certificate of Authorization (COA) is required, as well as a large ground team during operation. Similarly, in Canada a Special Flight Operations Certificate (SFOC) is required on an annual basis and a certain level of UAS training is highly recommended. As part of the certificate, insurance is required in the case of failure of the platform, which may cause damage to humans, livestock or buildings. These types of requirements are seen as the largest obstacle for the application of UAS in environmental studies (Hardin and Jensen 2011). In addition to the required certificates, other rules affect the actual operation of the UAS. For example, in Canada and the United States, the UAS must always be in the view of the operator (i.e. line-of-sight) and, in the case of the USA, a pilot license is also required of the operator. Consequently, it is necessary to have a flying team (Rango and Laliberte 2010) during operations, which adds to the cost of the UAS operation. The flying height regulations also limit the spatial resolution and swath area of these images. The maximum altitude for UAS in Canada is 400 feet, or 120 m, even though 640 m is reported to be the minimum height for avoiding major impacts from turbulence (Lewis 2007). It is believed that such aviation regulations need to be relaxed in order for the use of UAS in environmental monitoring to succeed (Rango and Laliberte 2010).
Future application of UAS in PA
It is thus clear that the application of UAS in PA is still in its infancy and that there is considerable room for further development in both the technology and its potential applications. The improved agreement between UAS imagery and field ground truth data, the flexibility of acquisition times and the relatively low cost should encourage others to consider these devices as key tools in future precision agricultural endeavors. For now, the successful application of UAS in monitoring rangeland conditions (Rango and Laliberte 2010; Laliberte and Rango 2011) suggests that using UAS in PA could be the next stage of success in remote sensing applications, consolidating the integration of UAS in PA and thus facilitating the maturation of the technology. As it currently stands, future progress will likely depend on improved UAS designs, lower system prices, enhanced image processing methods and procedures, more permissive regulations for environmental applications and practical information dissemination systems for farmers.
Advancement of UAS, cameras and image processing
Despite the shortcomings summarized by Hardin and Hardin (2010), including instability, image distortion, payload limitations and short flight times, these systems are still gaining considerable popularity among environmental scientists, as indicated by the plethora of previously cited studies. As suggested by Hardin and Jensen (2011), most technological problems could be solved through new technology development and collaboration between environmental scientists and UAS engineers. It is anticipated that over the next few years more durable and more powerful platforms will appear on the market with more advanced camera technology employing NIR, and that a standardized automatic georeferencing procedure will be developed for most UAS.
The current lack of selection in UAS multi-spectral cameras (Hardin and Hardin 2010) might be addressed by increasing payload capacities (Nebiker et al. 2008) and by a wider range of commercially available multi-spectral cameras, as has already occurred in the autonomous aerobiological sampling UAS industry (Schmale et al. 2008). Moreover, further experiments based on field hyperspectral measurements and UAS image acquisition should be conducted to select the spectral bands most applicable to agricultural applications. Cameras could then be customized, reducing the cost of these devices. With the growth of the civilian UAS market, a rapid decrease in platform prices is also anticipated (Jones et al. 2006). The availability of low cost platforms and specialized cameras could encourage farmers to adopt these systems.
With regard to the large data volumes in UAS image processing, a general standardized procedure (i.e. a streamlined georeferencing workflow) is still needed for full integration of UAS in PA (Hardin et al. 2007; Hardin and Jensen 2011). Currently, different research groups are evaluating various software packages to improve image preprocessing and mosaicing (e.g., Laliberte and Rango 2011; Xiang and Tian 2011). With increasing on-board GPS accuracy and exterior orientation data, the workflow of field sampling, processing and classification of UAS-acquired images could be further standardized, as has recently been shown for rangeland management (e.g., Laliberte and Rango 2011). Xiang and Tian (2011) also demonstrated that automatic orthophoto-based georeferencing is feasible for UAS-acquired images in turf grass management. Such automated registration could be achieved in PA by combining platform altitude with on-board GPS co-ordinates. Fortunately, progress has been reported, with mosaic accuracy improving from 20 pixels (Moran et al. 1997) to less than 14 pixels (Xiang and Tian 2011) and even 1–2 pixels (Zhou 2009). However, more experiments on larger cropland areas (i.e. >1 ha) should be conducted to demonstrate the feasibility of applying standardized procedures in PA, particularly where topography is undulating.
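To make the altitude-plus-GPS registration idea above concrete, the sketch below estimates ground sample distance (GSD) from flight altitude and camera geometry, and maps a pixel to ground co-ordinates around the GPS fix of the image centre. It assumes a nadir-looking, north-aligned camera and ignores tilt, terrain and lens distortion; all camera parameters and co-ordinates are illustrative, not values from any study cited here.

```python
# Sketch of simple direct georeferencing from platform altitude and
# on-board GPS. Hypothetical camera parameters; not from a cited study.

def ground_sample_distance(altitude_m, focal_mm, pixel_pitch_um):
    """GSD (m/pixel) = altitude * pixel pitch / focal length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3)

def pixel_to_ground(px, py, width, height, centre_xy, gsd):
    """Nadir, north-up approximation: offset from the image centre
    scaled by the GSD (image rows increase towards the south)."""
    cx, cy = centre_xy
    east = cx + (px - width / 2) * gsd
    north = cy - (py - height / 2) * gsd
    return east, north

# Example: 100 m altitude (below the 120 m Canadian ceiling), 8 mm
# focal length, 3.3 um pixel pitch -> roughly 4 cm per pixel.
gsd = ground_sample_distance(100, 8.0, 3.3)
```

Real workflows refine this crude estimate with exterior orientation data and tie points, which is why the reported mosaic accuracies range from tens of pixels down to 1–2 pixels.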
Methods of information extraction from UAS images
Similar to satellite images and aerial photos, UAS images could be applied in zonal mapping, used to quantify biological variables and employed to identify anomalies in croplands. Zonal mapping is a key component of PA because site-specific management requires accurate information about soil and crop spatial variation. One of the more common methods of zonal mapping for soils involves grid-based (cell or centre) sampling of physical and chemical properties, which can then be used for spatial interpolation (Stafford 2000; Flowers et al. 2005; Song et al. 2009), although in some countries, such as Australia, a more directed soil sampling strategy is employed (McBratney et al. 1997). Long et al. (1995) showed that yield data derived from aerial photographs were more accurate than those from traditional geostatistical interpolation and soil survey maps. Zonal maps can also be based on several years of historical yield data collected with yield monitors (Blackmore 2000; Diker et al. 2004; Flowers et al. 2005), and such maps have been used as input for fertilizer application procedures and other practices (Moran et al. 1997). However, data from yield monitors can contain many sources of error (Lamb et al. 1995). Moreover, soil sampling is expensive and time consuming (Stafford 2000), and zonal maps based on yield require further input on the current year’s conditions (Blackmore et al. 2003). Consequently, high-resolution satellite images have been applied as an alternative for monitoring vegetation and soil conditions. For example, a zonal map was recently produced using IKONOS-derived NDVI and was shown to be significantly cheaper than maps produced from grid-based soil samples (Seelan et al. 2003; Song et al. 2009). Moreover, Yang et al. (2006) found high correlations between classification maps derived from remote sensing and production maps.
Consequently, a zonal map production procedure based on UAS images could provide extremely important information for farmers.
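As a minimal illustration of what such a UAS-based zonal map could look like, the sketch below computes NDVI from co-registered red and NIR reflectances and slices the values into management zones by quantile. The three-zone choice, the band values and the flat-list representation of pixels are all illustrative assumptions, not a procedure from any cited study.

```python
# Sketch: NDVI-based management zones by quantile slicing.
# Band reflectances and the number of zones are hypothetical.
from statistics import quantiles

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), per pixel."""
    return [(n - r) / (n + r) for r, n in zip(red, nir)]

def zone_map(values, n_zones=3):
    """Assign each pixel a zone label (0 = lowest vigour) by counting
    how many quantile cut points its NDVI exceeds."""
    cuts = quantiles(values, n=n_zones)  # n_zones - 1 internal cuts
    return [sum(v > c for c in cuts) for v in values]

# Toy 6-pixel field: bare patches have low NIR/high red reflectance.
red = [0.10, 0.12, 0.30, 0.08, 0.25, 0.11]
nir = [0.50, 0.45, 0.35, 0.55, 0.30, 0.40]
zones = zone_map(ndvi(red, nir), n_zones=3)
```

In practice the zone boundaries would be smoothed spatially before being handed to variable rate equipment, but the quantile step captures the core idea of translating an NDVI image into discrete management units.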
An automatic target identification procedure could also be established to identify crop field anomalies (e.g., weeds, diseases, drought stress) in UAS images. Moreover, the extraction of various biological variables (e.g., LAI, canopy chlorophyll concentration and yield) from UAS imagery should be tested. The ability to accurately estimate plant chlorophyll concentration can provide growers with valuable information for estimating crop yield potential and making decisions regarding N management (Swain et al. 2010). Capturing UAS images throughout the growing season for a particular crop could also help determine the critical dates of image acquisition for many biological variables.
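One simple form such an automatic anomaly identification could take is flagging pixels whose NDVI deviates strongly from the field mean. The sketch below does this with a z-score test; the 2-sigma threshold is an illustrative choice, not a calibrated rule from the literature, and a production system would also use spatial context and multi-temporal imagery.

```python
# Sketch: flag candidate anomalies (e.g. weed patches, disease or
# drought stress) as pixels whose NDVI is far from the field mean.
# The 2-sigma threshold is a hypothetical, uncalibrated choice.
from statistics import mean, pstdev

def flag_anomalies(ndvi_values, k=2.0):
    """Return True for pixels more than k standard deviations from
    the field-wide mean NDVI."""
    mu = mean(ndvi_values)
    sigma = pstdev(ndvi_values)
    return [abs(v - mu) > k * sigma for v in ndvi_values]

# A mostly healthy field (NDVI ~0.6) with one stressed patch (0.1).
flags = flag_anomalies([0.6] * 20 + [0.1])
```

Hedged as it is, this global-statistics approach already shows why whole-season acquisition matters: the same deviation reads very differently at emergence than at canopy closure.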
Attracting farmers’ interest and conducting cost-effective analyses
In order to extend the application of UAS in PA, remote sensing scientists should work more closely with PA practitioners, showcasing the capabilities of UAS and their potential profitability through cost/benefit analysis. Many farmers are eager to apply modern techniques in their practices in order to maximize profit (Seelan et al. 2003) but, as previously stated, the costs, the delivery and the information extraction associated with remote sensing data have hindered their use in agricultural practices (Stafford 2000). However, Seelan et al. (2003) showed that a learning community that brings together research scientists, farmers and data providers can reduce the time needed for image dispersal. In their community study, Seelan et al. (2003) used Landsat, high resolution IKONOS imagery and aerial photos; the high-resolution images were used for visual interpretation, classification and validation, and the study was conducted at a large spatial scale. With an available UAS, the cost effectiveness of a similar community-integrated remote sensing program could be significantly improved. For example, Beeri and Peled (2009) showed that applications using UAS-captured imagery could shorten the time frame needed for agricultural practice adjustment, and that the results of the remote sensing monitoring were better than those from traditional control treatments. Demonstrating such applications is essential for convincing farmers of the benefits of using geospatial technology on their farms, and examples of economic and environmental benefits are also critical if a larger number of end users are to be convinced of their importance. In any event, it has been shown that early adopters and innovators often motivate others through their successes (Seelan et al. 2003).
In such cases, the value of the technology is demonstrated when the information derived enables decisions that result in higher profitability and more responsible stewardship. To achieve this, cost-benefit analyses are necessary to convince farmers to adopt UAS or to request services from consulting companies that provide UAS images.
If farmers are to be integrated in the image interpretation and processing stages, it may be necessary to set up a web-based GIS for image sharing, information input and processing. Farmers might lack the necessary skills to extract reliable information from UAS images, while remote sensing scientists may be unfamiliar with the field and crop conditions. Therefore, an approach that integrates the skills of both remote sensing scientists and farmers could be used to determine the most appropriate means of exploiting a new PA technology while promoting sustainable development. The remote sensing scientists could collect images using sensors on board UAS and work closely with the farmers to monitor crop growth. Feedback from farmers would be essential for identifying the optimal use of this emerging technology. Most importantly, this co-operation may significantly reduce the time needed for information extraction from the imagery and, consequently, for the farmers to adjust their agricultural practices.
Conclusions
Over the past decade there have been an increasing number of examples of UAS application in environmental monitoring and PA. Applications of UAS remote sensing in PA include, but are not limited to, yield mapping, chemical content measurement, vigor mapping, vegetation stress monitoring and assessment of the impacts of fertilizing on crop growth. However, there are still significant shortcomings in UAS remote sensing, including high initial costs, platform reliability, sensor capability and the lack of standardized procedures for processing large volumes of data. Moreover, strict aviation regulations and a simple lack of interest from farmers may hinder UAS adoption. Fortunately, it is expected that with the advancement of UAS platform technology, new camera designs, lower costs, improved image processing techniques and a greater number of experimental studies of UAS-based remote sensing for agricultural applications, there will be a greater appreciation of the benefits of these systems in PA and environmental monitoring. Moreover, it is suggested that if UAS aviation regulations were relaxed and research scientists became more engaged with the farming community, greater adoption of UAS in PA may occur.
References
Aber, J. S., Aaviksoo, K., Karofeld, E., & Aber, S. W. (2002). Patterns in Estonian bogs as depicted in color kite aerial photographs. Suo, 53, 1–15.
Aber, J. S., Aber, S. W., Buster, L., Jensen, W. E., & Sleezer, R. O. (2009). Challenge of infrared kite aerial photography: A digital update. Kansas Academy of Science Transactions, 112, 31–39.
Aber, J. S., Marzolff, I., & Ries, J. B. (2010). Small-format aerial photography. Boston: Elsevier, 266 pp.
Adrian, A. M., Norwood, S. H., & Mask, P. L. (2005). Producers’ perceptions and attitudes toward precision agriculture technologies. Computer and Electronics in Agriculture, 48, 256–271.
Amoroso, L., & Arrowsmith, R. (2000). Balloon photography of brush fire scars east of Carefree, AZ. Retrieved March 12, 2012 from http://activetectonics.asu.edu/Fires_and_Floods/10_24_00_photos/.
Aylor, D. E., Boehm, M. T., & Shields, E. J. (2006). Quantifying aerial concentrations of maize pollen in the atmospheric surface layer using remotely-piloted airplanes and Lagrangian stochastic modeling. Journal of Applied Meteorology and Climatology, 45, 1003–1015.
Bausch, W. C., & Khosla, R. (2010). QuickBird satellite versus ground-based multi-spectral data for estimating nitrogen status of irrigated maize. Precision Agriculture, 11, 274–290.
Beeri, O., & Peled, A. (2009). Geographical model for precise agriculture monitoring with real-time remote sensing. ISPRS Journal of Photogrammetry and Remote Sensing, 64, 47–54.
Beeri, O., Phillips, R., Carson, P., & Liebig, M. (2005). Alternate satellite models for estimation of sugar beet residue nitrogen credit. Agriculture, Ecosystems & Environment, 107, 21–35.
Berni, J. A. J., Zarco-Tejada, P. J., Suarez, L., & Fereres, E. (2009a). Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Transactions on Geoscience and Remote Sensing, 47, 722–738.
Berni, J. A. J., Zarco-Tejada, P. J., Suarez, L., Gonzalez-Dugo, V., & Fereres, E. (2009b). Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. Retrieved March 12, 2012 from http://www.ipi.uni-hannover.de/fileadmin/institut/pdf/isprs-Hannover2009/Jimenez_Berni-155.pdf.
Blackmore, S. (2000). The interpretation of trends from multiple yield maps. Computers and Electronics in Agriculture, 26, 37–51.
Blackmore, S., Godwin, R. J., & Fountas, S. (2003). The analysis of spatial and temporal trends in yield map data over six years. Biosystems Engineering, 84, 455–466.
Castillejo-Gonzalez, I. L., Lopez-Granados, F., Garcia-Ferrer, A., Pena-Barragan, J. M., Jurado-Exposito, M., Orden, M. S., et al. (2009). Object- and pixel-based analysis for mapping crops and their agro-environmental associated measures using QuickBird imagery. Computers and Electronics in Agriculture, 68, 207–215.
Chandler, J., Fryer, J. G., & Jack, A. (2005). Metric capabilities of low-cost digital cameras for close range surface measurement. The Photogrammetric Record, 20, 12–26.
Clevers, J. G. P. W. (1988). The derivation of a simplified reflectance model for the estimation of leaf area index. Remote Sensing of Environment, 35, 53–70.
Colewell, R. N. (1956). Determining the prevalence of certain cereal crop diseases by means of aerial photography. Hilgardia, 26, 223–286.
Cook, S. E., & Bramley, R. G. V. (1998). Precision agriculture: Opportunities, benefits and pitfalls of site specific crop management in Australia. Australian Journal of Experimental Agriculture, 38, 753–763.
De Tar, W. R., Chesson, J. H., Penner, J. V., & Ojala, J. C. (2008). Detection of soil properties with airborne hyperspectral measurements of bare fields. Transactions of the ASABE, 51, 463–470.
Diker, K., Heermann, D. F., & Bordahl, M. K. (2004). Frequency analysis of yield for delineating yield response zones. Precision Agriculture, 5, 435–444.
Donoghue, D., Watt, P., Cox, N., & Wilson, J. (2006). Remote sensing of species mixtures in conifer plantations using LiDAR height and intensity data. International Workshop 3D remote sensing in Forestry. Retrieved March 12, 2012 from http://www.rali.boku.ac.at/fileadmin/_/H857-VFL/workshops/3drsforestry/presentations/6a.5-donoghue.pdf.
Du, Q., Chang, N. B., Yang, C. H., & Srilakshmi, K. R. (2008). Combination of multispectral remote sensing, variable rate technology and environmental modeling for citrus pest management. Journal of Environmental Management, 86, 14–26.
Eisenbeiss, H. (2004). A mini unmanned aerial vehicle (UAV): system over and image acquisition. In: A. Gruen, Sh. Murai, T. Fuse, F. Remondino (Eds.). Proceedings of International Workshop on Processing and Visualization Using High-Resolution Imagery, XXXVI(5/W1), Pitsanulok, Thailand. CDROM. Retrieved March 12, 2012 from http://www.isprs.org/proceedings/XXXVI/5-W1/papers/11.pdf.
Enclona, E. A., Thenkabail, P. S., Celis, D., & Diekmann, J. (2004). Within-field wheat yield prediction from IKONOS data: A new matrix approach. International Journal of Remote Sensing, 25, 377–388.
Erickson, B. J., Johannsen, C. J., Vorst, J. J., & Biehl, L. L. (2004). Using remote sensing to assess stand loss and defoliation in maize. Photogrammetric Engineering and Remote Sensing, 70, 717–722.
Eugster, H., & Nebiker, S. (2007). Geo-registration of video sequences captured from Mini UAVs: Approaches and accuracy assessment. The 5th International Symposium on Mobile Mapping Technology, Padua, Italy. Retrieved March 12, 2012 from http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cts=1331769791050&ved=0CCYQFjAA&url=http%3A%2F%2Fwww.3dgi.ch%2Fpublications%2Feh%2F2007_MMT07_Padua_final.pdf&ei=rzFhT9LrN4aJtwe9w9W-BQ&usg=AFQjCNHlP4X-S3DkZib-OdlEap7T4JBtg.
Fisher, P. D., Abuzar, M., Rab, M. A., Best, F., & Chandra, S. (2009). Advances in precision agriculture in south-eastern Australia. I. A regression methodology to simulate spatial variation in cereal yields using farmers’ historical paddock yields and normalised difference vegetation index. Crop & Pasture Science, 60, 844–858.
Flowers, M., Weisz, R., & White, J. G. (2005). Yield-based management zones and grid sampling strategies: Describing soil test and nutrient variability. Agronomy Journal, 97, 968–982.
Godwin, R. J., Richards, T. E., Wood, G. A., Welsh, J. P., & Knight, S. M. (2003). An economic analysis of the potential for precision farming in UK cereal production. Biosystems Engineering, 84, 533–545.
Gomez, C., Rossel, R. A. V., & McBratney, A. B. (2008). Soil organic carbon prediction by hyperspectral remote sensing and field vis-NIR spectroscopy: An Australian case study. Geoderma, 146, 403–411.
Gomez-Candon, D., Lopez-Granados, F., Caballero-Novella, J. J., Gomez-Casero, M. T., Jurado-Exposito, M., & Garcia-Torres, L. (2011). Geo-referencing remote images for precision agriculture using artificial terrestrial targets. Precision Agriculture, 12, 876–891.
Gomez-Casero, M. T., Castillejo-Gonzalez, I. L., Garcia-Ferrer, A., Pena-Barragan, J. M., Jurado-Exposito, M., Garcia-Torres, L., et al. (2010). Spectral discrimination of wild oat and canary grass in wheat fields for less herbicide application. Agronomy for Sustainable Development, 30, 689–699.
Griffin, T. W., Lowenberg-Deboer, J., Lambert, D. M., Peone, J., Payne, T., & Daberkow, S. G. (2004). Adoption, profitability, and making better use of precision farming data. Staff paper No. 04–06 West Lafayette, IN, USA: Department of Agricultural Economics, Purdue University.
Gutierrez, P. A., Lopez-Granados, F., Jurado-Exposito, J. M. P. M., & Hervas-Martinez, C. (2008). Logistic regression product-unit neural networks for mapping Ridolfia segetum infestations in sunflower crop using multitemporal remote sensed data. Computers and Electronics in Agriculture, 64, 293–306.
Haboudane, D., Miller, J. R., Pattey, E., Zarco-Tejada, P. J., & Strachan, I. B. (2004). Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sensing of Environment, 90, 337–352.
Hardin, P. J., & Hardin, T. J. (2010). Small-scale remotely piloted vehicles in environmental research. Geography Compass, 4, 1297–1311.
Hardin, P., & Jackson, M. (2005). An unmanned aerial vehicle for rangeland photography. Rangeland Ecology & Management, 58, 439–442.
Hardin, P. J., Jackson, M. W., Anderson, V. J., & Johnson, R. (2007). Detecting squarrose knapweed (Centaurea virgata Lam. Ssp. Squarrosa Gugl.) using a remotely piloted vehicle: A Utah case study. GIScience & Remote Sensing, 44, 203–219.
Hardin, P. J., & Jensen, R. R. (2011). Small-scale unmanned aerial vehicles in environmental remote sensing: Challenges and opportunities. GIScience & Remote Sensing, 48, 99–111.
Hinkley, E. A., & Zajkowski, T. (2011). USDA forest service-NASA: Unmanned aerial systems demonstrations-pushing the leading edge in fire mapping. Geocarto International, 26, 103–111.
Huang, Y., Lan, Y., Hoffmann, W. C., & Fritz, B. K. (2008). Development of an unmanned aerial vehicle-based remote sensing system for site-specific management in precision agriculture. In Proceedings of the 9th International Symposium on Precision Agriculture. Denver, CO. CDROM.
Hunt, E. R., Cavigelli, M., Daughtry, C. S. T., McMurtrey, J. E., & Walthall, C. L. (2005). Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precision Agriculture, 6, 359–378.
Hunt, E. R., Daughtry, C. S., Walthall, C. L., McMurtrey, J. E., & Dulaney, W. P. (2003). Agricultural remote sensing using radio-controlled aircraft. In: T. VanToai, D. Major, M. McDonald, J. Schepers & L. Tarpley (Eds.). Digital image and spectral techniques: Applications to precision agriculture and crop physiology. ASA Special Publications Number 66. Madison, WI, USA: American Society of Agronomy, pp. 197–205.
Hunt, E. R., Hively, W. D., Daughtry, C. S., McCarty, G. W., Fujikawa, S. J., Ng, T. L., Tranchitella, M., Linden, D. S., & Yoel, D. W. (2008). Remote sensing of crop leaf area index using unmanned airborne vehicles. In ASPRS Pecora 17 Conference Proceeding, Bethesda, MD: American Society for Photogrammetry and Remote Sensing. CDROM. Retrieved March 12, 2012 from http://www.asprs.org/a/publications/proceedings/pecora17/0018.pdf.
Hunt, E. R., Hively, W. D., Fujikawa, S. J., Linden, D. S., Daughtry, C. S. T., & McCarty, G. W. (2010). Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sensing, 2, 290–305.
Inoue, Y., Morinaga, S., & Tomita, A. (2000). A blimp-based remote sensing system for low-altitude monitoring of plant variables: A preliminary experiment for agricultural and ecological applications. International Journal of Remote Sensing, 21, 379–385.
Jackson, R. D. (1984). Remote sensing of vegetation characteristics for farm management. Proceedings of the Society of Photo-Optical Instrumentation Engineers, 475, 81–96.
Johnson, L. F., Herwitz, S. R., Lobitz, B. M., & Dunagan, S. E. (2004). Feasibility of monitoring coffee field ripeness with airborne multispectral imagery. Applied Engineering in Agriculture, 20, 845–849.
Jones, G. P., Pearlstine, L. G., & Percival, H. F. (2006). An assessment of small unmanned aerial vehicles for wildlife research. Wildlife Society Bulletin, 34, 750–758.
Kendoul, F., Lara, D., Fantoni-Coichot, I., & Lozano, R. (2007). Real-time nonlinear embedded control for an autonomous quadrotor helicopter. Journal of Guidance Control and Dynamics, 30, 1049–1061.
Laliberte, A. S., Herrick, J. E., & Rango, A. (2010). Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring. Photogrammetric Engineering and Remote Sensing, 76, 661–672.
Laliberte, A. S., & Rango, A. (2009). Texture and scale in object-based analysis of sub-decimeter resolution unmanned aerial vehicle (UAV) imagery. IEEE Transactions on Geoscience and Remote Sensing, Special Issue on UAV Sensing Systems in Earth Observation, 47, 761–770.
Laliberte, A. S., & Rango, A. (2011). Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands. GIScience & Remote Sensing, 48, 4–23.
Laliberte, A. S., Rango, A., & Fredrickson, E. L. (2005). Multi-scale, object-oriented analysis of QuickBird imagery for determining percent cover in arid land vegetation. In: 20th Biennial Workshop on Aerial Photography, Videography, and High Resolution Digital Imagery for Resource Assessment. Weslaco, TX. CDROM. Retrieved March 12, 2012 from https://jornada.nmsu.edu/bibliography/05-055Proc.pdf.
Laliberte, A. S., Rango, A., & Herrick, J. (2007). Unmanned aerial vehicles for rangeland mapping and monitoring: a comparison of two systems. In Proceeding of ASPRS 2007 Annual Conference. Tampa, FL. CDROM. Retrieved March 12, 2012 from http://www.asprs.org/a/publications/proceedings/tampa2007/0039.pdf.
Lamb, J. A., Anderson, J. L., Malzer, G. L., Vetch, J. A., Dowdy, R. H., Onken, D. S., et al. (1995). Perils of monitoring grain yield on-the-go. In P. C. Robert, R. H. Rust, & W. E. Larson (Eds.), Site-specific management for agricultural systems (pp. 87–90). Madison: American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
Lamb, D. W., & Brown, R. B. (2001). Remote-sensing and mapping of weeds in crops. Journal of Agricultural Engineering Research, 78, 117–125.
Lamb, D. W., Frazier, P., & Adams, P. (2008). Improving pathways to adoption: Putting the right P’s in precision agriculture. Computers and Electronics in Agriculture, 61, 4–9.
Lambert, D., & Lowenberg-Deboer, J. (2000). Precision agriculture profitability review (p. 154). Purdue, USA: Site Specific Management Center.
Lan, Y., Huang, Y., Martin, D. E., & Hoffmann, W. C. (2009). Development of an airborne remote sensing system for crop pest management: System integration and verification. Transactions of the ASABE, 25, 607–615.
Lelong, C. C. D., Burger, P., Jubelin, G., Roux, B., Labbe, S., & Baret, F. (2008). Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots. Sensors, 8, 3557–3585.
Lelong, C. C. D., Pinet, P. C., & Poilvé, H. (1998). Hyperspectral imaging and stress mapping in agriculture: A case study on wheat in Beauce (France). Remote Sensing of Environment, 66, 179–191.
Lewis, G. (2007). Evaluating the use of a low-cost unmanned aerial vehicle platform in acquiring digital imagery for emergency response. In J. Li, S. Zlatanova, & A. Fabbri (Eds.), Geomatics solutions for disaster management (pp. 117–133). Berlin: Springer.
Long, D. S., Carlson, G. R., & DeGloria, S. D. (1995). Quality of field management maps. In P. C. Robert (Ed.), Proceedings of Site-Specific Management for Agriculture Systems (pp. 251–271). Madison: American Society of Agronomy.
Lopez-Lozano, R., Baret, F., de Cortazar-Atauri, I. G., Bertrand, N., & Casterad, M. A. (2009). Optimal geometric configuration and algorithms for LAI indirect estimates under row canopies: The case of vineyards. Agricultural and Forest Meteorology, 149, 1307–1316.
Lorenzen, B., & Jensen, A. (1989). Changes in leaf spectral properties induced in barley by cereal powdery mildew. Remote Sensing of Environment, 27, 201–209.
MacArthur, E. Z., MacArthur, D., & Crane, C. (2005). Use of cooperative unmanned air and ground vehicles for detection and disposal of mines. Proceedings of SPIE-The International Society for Optical Engineering, 5999, 94–101.
Maldonado-Ramirez, S. L., Schmale, D. G., Shields, E. J., & Bergstrom, G. C. (2005). The relative abundance of viable spores of Gibberella zeae in the planetary boundary layer suggests the role of long-distance transport in regional epidemics of Fusarium head blight. Agricultural and Forest Meteorology, 132, 20–27.
Malthus, T. J., & Maderia, A. C. (1993). High resolution spectroradiometry: Spectral reflectance of field bean leaves infected by Botrytis fabae. Remote Sensing of Environment, 45, 107–116.
McBratney, A., Whelan, B., & Ancev, T. (2005). Future directions of precision agriculture. Precision Agriculture, 6, 7–23.
McBratney, A. B., Whelan, B. M., & Shatar, T. (1997). Variability and uncertainty in spatial, temporal and spatio-temporal crop yield and related data. In: Precision agriculture: Spatial and temporal variability of environmental quality. Chichester: Wiley, pp. 141–160.
McNairn, H., & Brisco, B. (2004). The application of C-band polarimetric SAR for agriculture: A review. Canadian Journal of Remote Sensing, 30, 525–542.
Monmonier, M. (2002). Aerial photography at the Agricultural Adjustment Administration: Acreage controls, conservation. Photogrammetric Engineering & Remote Sensing, 68, 1257–1261.
Moran, M. S., Inoue, Y., & Barnes, E. M. (1997). Opportunities and limitation for image-based remote sensing in precision crop Management. Remote Sensing of Environment, 61, 319–346.
Murakami, E., Saraiva, A. M., Ribeiro, L. C. M., Cugnasca, C. E., Hirakawa, A. R., & Correa, P. L. P. (2007). An infrastructure for the development of distributed service-oriented information systems for precision agriculture. Computers and Electronics in Agriculture, 58, 37–48.
Pena-Barragan, J. M., Lopez-Granados, F., Garcia-Torres, L., Jurado-Exposito, M., de la Orden, M. S., & Garcia-Ferrer, A. (2008). Discriminating cropping systems and agro-environmental measures by remote sensing. Agronomy for Sustainable Development, 28, 355–362.
Price, P. (2004). Spreading the PA message. Ground Cover, Issue 51 Grains Research and Development Corporation: Canberra, Australia Capital Territory, Australia.
Primicerio, J., Gennaro, S. F. D., Fiorillo, E., Genesio, L., Lugato, E., Matese, A., et al. (2012). A flexible unmanned aerial vehicle for precision agriculture. Precision Agriculture (Online first). doi:10.1007/s11119-012-9257-6.
Quilter, M. C. (1997). Vegetation monitoring using low altitude, large scale imagery from radio controlled drones. PhD dissertation, Department of Botany and Range Science, Brigham Young University, Provo, UT, USA.
Quilter, M. C., & Anderson, V. J. (2000). Low altitude/large scale aerial photographs: A tool for range and resource managers. Rangelands, 22, 13–17.
Quilter, M. C., & Anderson, V. J. (2001). A proposed method for determining shrub utilization using (LA/LS) imagery. Journal of Range Management, 54, 378–381.
Rango, A., & Laliberte, A. S. (2010). Impact of flight regulations on effective use of unmanned aerial vehicles for natural resources applications. Journal of Applied Remote Sensing, 4, 043539.
Rango, A., Laliberte, A. S., Herrick, J. E., Winters, C., Havstad, K., Steele, C., et al. (2009). Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and management. Journal of Applied Remote Sensing, 3, 033542.
Rao, N. R., Garg, P. K., & Ghosh, S. K. (2007). Development of an agricultural crops spectral library and classification of crops at cultivar level using hyperspectral data. Precision Agriculture, 8, 173–185.
Rao, N. R., Garg, P. K., Ghosh, S. K., & Dadhwal, V. K. (2008). Estimation of leaf total chlorophyll and nitrogen concentrations using hyperspectral satellite imagery. Journal of Agricultural Science, 146, 65–75.
Robert, P.C. (1996). Use of remote sensing imagery for precision farming. In: Proceedings of 26th International Symposium on Remote Sensing of Environment and 18th symposium of the Canadian Remote Sensing Society, Ontario, Canada, pp. 596–599.
Robertson, M., Carberry, P., & Brennan, L. (2007). The economic benefits of precision agriculture: Case studies from Australian grain farms. Retrieved March 12, 2012 from http://www.grdc.com.au/uploads/documents/Economics%20of%20Precision%20agriculture%20Report%20to%20GRDC%20final.pdf.
Nebiker, S., Annen, A., Scherrer, M., & Oesch, D. (2008). A light-weight multispectral sensor for micro UAV: Opportunities for very high resolution airborne remote sensing. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVII, Part B1, pp. 1193–1200.
Schmale, D. G., Dingus, B. R., & Reinholtz, C. (2008). Development and application of an autonomous aerial vehicle for precise aerobiological sampling above agricultural fields. Journal of Field Robotics, 25, 133–147.
Scotford, I. M., & Miller, P. C. H. (2005). Applications of spectral reflectance techniques in Northern European cereal production: A review. Biosystems Engineering, 90, 235–250.
Seang, T. P., & Mund, J. (2006). Balloon based geo-referenced digital photo technique: a low cost high-resolution option for developing countries. In Proceedings of XXIII FIG Congress. Munich, Germany. CDROM. Retrieved March 12, 2012 from http://www.fig.net/pub/fig2006/papers/ts73/ts73_02_mund_peng_0425.pdf.
Seelan, S. K., Laguette, S., Casady, G. M., & Seielstad, G. A. (2003). Remote sensing applications for precision agriculture: A learning community approach. Remote Sensing of Environment, 88, 157–169.
Shou, L., Jia, L. L., Cui, Z. L., Chen, X. P., & Zhang, F. S. (2007). Using high-resolution satellite imaging to evaluate nitrogen status of winter wheat. Journal of Plant Nutrition, 30, 1669–1680.
Silva, C. B., Vale, S. M. L. R., Pinto, F. A. C., Muller, C. A. S., & Moura, A. D. (2007). The economic feasibility of precision agriculture in Mato Grosso do Sul State, Brazil: A case study. Precision Agriculture, 8, 255–265.
Song, X., Wang, J., Huang, W., Liu, L., Yan, G., & Pu, R. (2009). The delineation of agricultural management zones with high resolution remotely sensed data. Precision Agriculture, 10, 471–487.
Stafford, J. V. (2000). Implementing precision agriculture in the 21st century. Journal of Agricultural Engineering Research, 76, 267–275.
Sugiura, R., Ishii, K., & Noguchi, N. (2004). Remote sensing technology for field information using an unmanned helicopter. In Proceedings of Automation Technology for Off-road Equipment. Paper No. 701P1004. ASABE, St Joseph, MI, USA.
Sugiura, R., Noguchi, N., Ishii, K., & Terao, H. (2002). The development of remote sensing system using unmanned helicopter. In Proceedings of Automation Technology for Off-road Equipment, 120–128. Paper No. 701P0502. ASABE, St Joseph, MI, USA.
Sullivan, D. G., Shaw, J. N., & Rickman, D. (2005). IKONOS imagery to estimate surface soil property variability in two Alabama physiographies. Soil Science Society of America Journal, 69, 1789–1798.
Swain, K. C., Jayasuriya, H. P. W., & Salokhe, V. M. (2007). Suitability of low-altitude remote sensing images for estimating nitrogen treatment variations in rice cropping for precision agriculture adoption. Journal of Applied Remote Sensing, 1, 013547.
Swain, K. C., Thomson, S. J., & Jayasuriya, H. P. W. (2010). Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Transactions of the ASABE, 53, 21–27.
Tenkorang, F., & DeBoer, L. (2007). On-farm profitability of remote sensing in agriculture. Journal of Terrestrial Observation, 1, 50–59.
Tomlins, G. F. (1983). Some considerations in the design of low-cost remotely-piloted aircraft for civil remote sensing applications. The Canadian Surveyor, 37, 157–167.
Torbett, J. C., Roberts, R. K., Larson, J. A., & English, B. C. (2008). Perceived improvements in nitrogen fertilizer efficiency from cotton precision farming. Computers and Electronics in Agriculture, 64, 140–148.
Vericat, D., Brasington, J., Wheaton, J., & Cowie, M. (2008). Accuracy assessment of aerial photographs acquired using lighter-than-air blimps: Low-cost tools for mapping river corridors. River Research and Applications, 25, 985–1000.
Warren, G., & Metternicht, G. (2005). Agricultural applications of high-resolution digital multispectral imagery: Evaluating within-field spatial variability of canola (Brassica napus) in Western Australia. Photogrammetric Engineering and Remote Sensing, 71, 595–602.
Whipker, L. D., & Akridge, J. T. (2009). Precision agricultural services dealership survey results. Retrieved March 12, 2012 from http://www.agecon.purdue.edu/cab/research_articles/articles/2009_crop_life_precision_report.pdf.
Wu, C., Niu, Z., Tang, Q., & Huang, W. (2008). Estimating chlorophyll content from hyperspectral vegetation indices: Modeling and validation. Agricultural and Forest Meteorology, 148, 1230–1241.
Wu, J. D., Wang, D., & Bauer, M. E. (2007a). Assessing broadband vegetation indices and QuickBird data in estimating leaf area index of corn and potato canopies. Field Crops Research, 102, 33–42.
Wu, J. D., Wang, D., & Rosen, C. J. (2007b). Comparison of petiole nitrate concentrations, SPAD chlorophyll readings, and QuickBird satellite imagery in detecting nitrogen status of potato canopies. Field Crops Research, 101, 96–103.
Wundram, D., & Löffler, J. (2007). Kite aerial photography in high mountain ecosystem research. Grazer Schriften der Geographie und Raumforschung, 43, 15–22.
Xiang, H., & Tian, L. (2011). Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform. Biosystems Engineering, 108, 104–113.
Yang, C., Bradford, J. M., & Wiegand, C. L. (2001). Airborne multispectral imagery for mapping variable growing conditions and yields of cotton, grain sorghum, and corn. Transactions of the ASAE, 44, 1983–1994.
Yang, C. H., Everitt, J. H., & Bradford, J. M. (2006). Comparison of QuickBird satellite imagery and airborne imagery for mapping grain sorghum yield patterns. Precision Agriculture, 7, 33–44.
Zarco-Tejada, P. J., Gonzalez-Dugo, V., & Berni, J. A. J. (2012). Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sensing of Environment, 117, 322–337.
Zhang, J. H., Wang, K., Bailey, J. S., & Wang, R. C. (2006). Predicting nitrogen status of rice using multispectral data at canopy scale. Pedosphere, 16, 108–117.
Zhao, D. H., Huang, L. M., Li, J. L., & Qi, J. G. (2007). A comparative analysis of broadband and narrowband derived vegetation indices in predicting LAI and CCD of a cotton canopy. ISPRS Journal of Photogrammetry and Remote Sensing, 62, 25–33.
Zhou, G. (2009). Near real-time orthorectification and mosaic of small UAV video flow for time-critical event response. IEEE Transactions on Geoscience and Remote Sensing, 47, 739–747.
Acknowledgments
This research was supported by a Grant (project #920161) provided to John M. Kovacs from the Northern Ontario Heritage Fund Corporation of Canada.
Zhang, C., Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: a review. Precision Agric 13, 693–712 (2012). https://doi.org/10.1007/s11119-012-9274-5