
1 Introduction

Newsom (1974) divided the history of cotton insect management into four periods: pre-1892—the pre-boll weevil era; 1892–1917—the early boll weevil era; 1917–1945—the calcium arsenate era; and 1945 forward—the synthetic organic insecticide era. Perkins (1980) later sub-divided the synthetic organic insecticide era, recognizing 1945–1955 as the era of euphoria and the crisis of residues; 1954–1972 as the era of confusion, environmental crisis and the beginning of new directions; and 1968 forward as the era of changing paradigms (IPM). Since 1996, it has become clear that American agriculture has transitioned to another era, the era of genetically modified crops.

The history of pest management since the late 1800s is a repeated cycle: pest intensification, development of innovative and effective technology, enthusiasm for and over-use of the powerful new technology, followed by the development of problems with the technology. The problems that arose were often associated with the failure of growers to integrate new tactics into multi-tactic IPM systems. The historic trend has been for producers to rely heavily on a single control tactic. Often, this has created environmental problems and placed powerful selection pressure on pest populations. Over-use of single tactics has led to premature evolution of resistance and failure of the pest management technology. Pest resistance, resurgence of secondary pests, and loss of natural enemies have resulted in environmental and human health impacts, and economic losses.

This chapter discusses the history of pest management in the southern United States of America (U.S.A.). It focuses on our failure to integrate pest management tactics in the past and the need to do so in the future to meet the challenges of feeding and clothing a rapidly growing world population. It also discusses the evolution of IPM programs in the era of genetically modified crops, the increasing use of preventative tactics implemented on an area-wide basis, and the impact of these changes on the number of agricultural professionals available. Finally, the chapter discusses the future consequences and perils of failing to counter the trend of diminishing numbers of crop production professionals supporting farmers in the era of genetically modified crops.

2 Before the Boll Weevil—Pre-1892

Cotton production in North America began about 1600 (Handy 1896). Donnell (1872) reported that the country was supplied with cloth from cotton grown in Maryland, Delaware and New Jersey during the American War of Independence. In 1796, President George Washington signed the patent for Eli Whitney's cotton gin (Donnell 1872; Linder 1954), making the production of upland cotton commercially feasible (Anonymous 1930).

Production of cotton in the American South grew rapidly during the period 1840–1860 (Trelogan 1969). By 1849, cotton was the most important agricultural export, and income from cotton sales paid for two-thirds of all U.S. imports (Anonymous 1850; Phillips 1850; Haney et al. 1996). By 1850, 85 % of the world's cotton was produced in the American South. In 1860, America produced 2 million bales of cotton. Eighty percent of cotton spun in United Kingdom (U.K.) mills came from the southern U.S. The American Civil War severely disrupted cotton production and marketing. During the war, United Kingdom mills received only two percent of their cotton fiber from southern states. The American Civil War ended in 1865, and by 1876 the cotton industry in the South had recovered sufficiently to supply 62 % of the cotton used by mills in the U.K. (Anonymous 1877; Haney 2001). Westward population movement after the Civil War, aided by development of railroads, greatly expanded cotton production—especially in Texas. By the end of the 19th century, any threat to the cotton industry was a clear threat to the U.S. economy. Cotton was central to the economies of southern states, which were struggling to recover from the devastation of the war (Haney 2001).

3 Initial Boll Weevil Years—1892–1917

It was into this milieu that the boll weevil (Anthonomus grandis) arrived, crossing the Rio Grande into South Texas about 1892 (Newell 1904). Yield losses in cotton fields near Brownsville and San Diego, Texas exceeded 90 % by 1894 (Townsend 1895). Moving at an average of 80–100 km per year, the weevil had infested all of the U.S. cotton belt east of the Texas High Plains by 1922 (Coad et al. 1922). Cotton yield losses during these years varied between 20 and 80 % (Worsham 1914; Lewis 1920; Isley and Baerg 1924; Thomas 1929; Coad 1930; Wagner 1999). In Georgia, Soule (1921, p. 16) spoke for all the southern U.S. cotton producing communities: “The boll weevil has disturbed our economic situation more than any other single factor since the conclusion of the Civil War; it is a pest of as great a magnitude as any which afflicted the Egyptians in the olden days.”

Historically, the immigration of the boll weevil into and through the South had the most significant impact of any invasive pest in the history of the southern USA. It resulted in the establishment of entomology as a discipline and of entomology departments in southern universities. The establishment by Dr. Seaman Knapp of a boll weevil management method demonstration, one of the earliest actions of the Cooperative Extension Service, substantiates the importance of boll weevil management in the founding of Extension (Frisbie 1993) (Fig. 5.1).

Fig. 5.1 Migration of the boll weevil across the southern USA. (Coad et al. 1922)

Initially, farmers were defenseless and the boll weevil caused extensive damage. Public sector entomologists quickly responded. Early biological observations formed the basis for cultural control tactics which limited boll weevil losses. Observations and initial cultural management suggestions were made by C.H.T. Townsend, L.O. Howard, E.A. Schwarz and C.L. Marlatt. This led to the development of a suite of cultural management practices, many of which were developed by F.W. Mally, W.D. Hunter, W.E. Hinds and S.A. Knapp. Mally recognized the value of earliness (Mally 1901) and stalk destruction (Walker and Niles 1971; Walker 1984; Klassen and Ridgway 2001). Hunter (1904) found that the application of fertilizer could aid in the production of an early crop, thereby avoiding severe late season boll weevil damage.

Modification of row width was recommended—first wider rows to allow greater light penetration and desiccation of boll weevil immatures in squares (pre-bloom flower buds) on the ground; and later, narrow rows to promote earliness (Mally 1901; Cook 1924; Hinds 1928; Ware 1929, 1930). Government entomologists promoted a program of cultural tactics, termed the Government Method, which was incompletely adopted because stalk destruction—a key component of the strategy—was very difficult to accomplish in the era before mechanization (Helms 1977; Wagner 1980; Walker 1984; Haney 2001; Stavinoha and Woodward 2001). Newell and Paulsen (1908) proposed defoliation of the cotton crop to slow late season boll weevil losses. The development of the V-shaped stalk cutter (Anonymous 1911) aided growers in accomplishing stalk destruction. In 1922, the development of tractors equipped with power-take-off and stalk shredders greatly improved farmers' ability to destroy stalks in a timely manner (Williams 1987).

In the years before effective insecticides were available, the primary focus was on cotton varieties that could escape devastating late season boll weevil populations through early fruit production and maturation. Early spring planting of varieties selected for rapid maturation was recommended (Cook 1906, 1911; Bennett 1908). Mally's concept of a short season approach to cotton production continued to be an area of emphasis in Texas for many years (Niles 1970; Namken and Heilman 1973). Cotton breeders also selected varieties for other boll weevil-resistant traits such as thickened boll walls (Harned 1910), red leaves and stems (Isley 1928) and strap-like, frego bracts which permitted light to pass through the bracts and reach the squares and bolls, inhibiting weevil damage (Jones et al. 1964; Lincoln and Waddle 1965).

4 Calcium Arsenate Era—1917–1945

From the first appearance of the boll weevil, various concoctions were used in attempts to poison it. Lime, ashes, sulfur, Paris green, London purple, lead arsenate and many other materials were tried (Parencia et al. 1983; Haney et al. 1996; Haney 2001). Paris green was effective against the cotton leafworm, Alabama argillacea, but not the boll weevil. Sulfur was effective on sucking insect pests such as tarnished plant bug, Lygus lineolaris, and cotton fleahopper, Pseudatomoscelis seriatus, but once again, ineffective against the weevil (Parencia et al. 1983). In the early 1920s, calcium arsenate was found to be effective against the boll weevil. Also in the 1920s, airplanes were found to be a very efficient means of applying insecticides, and by 1931, aerial application of insecticides was widely accepted (Post 1924; Hinds 1926; Parencia 1978).

After the discovery of practical application methods for calcium arsenate, entomologists largely abandoned development and implementation of ecologically-based management methods and concentrated on research and extension programs involving chemical control methods (Smith et al. 1976). The Georgia State Bureau of Entomology recommended calcium arsenate treatments every 4–6 days (9–10 applications per season) to control boll weevil (Warren and Williams 1922). Farmers adopted chemical control and they too largely abandoned ecologically-based tactics to manage boll weevil and other cotton pests. Insecticide-dependent production systems quickly became the principal means of protecting cotton. Isley, Baerg and Sanderson promoted use of insecticides only as necessary to supplement cultural and other management methods (Isley and Baerg 1924; Baerg et al. 1938), but dependence primarily on calcium arsenate continued for decades (Parencia 1978). Injurious populations of cotton aphid, Aphis gossypii, and bollworm, Helicoverpa zea, were associated with repeated use of calcium arsenate (Bishop 1929; Fletcher 1929; Sherman 1930; Baerg et al. 1938; Gaines 1942; Ewing and Ivy 1943). Nicotine or hydrated lime sulfur was sometimes mixed with calcium arsenate to provide control of mixed populations of cotton aphids, cotton fleahoppers, and boll weevils (Parencia et al. 1983).

In the 1920s and 1930s, Dr. Dwight Isley's work in Arkansas stood out as one of the earliest examples of what would later be called integrated pest management (IPM). An advocate of the Government Method, Isley worked to encourage farmers to integrate cultural and biological control with judicious insecticide use. He used small, early-planted trap plantings of cotton to attract boll weevils, which were then controlled with insecticides without disrupting natural control on whole fields. He advocated scouting and the use of economic thresholds to determine when to treat for weevils and other cotton pests. And, he showed that early-season spot-treatment of heavily infested areas of cotton fields was effective in reducing damage from boll weevils. Integrating the cultural controls espoused by the Government Method with natural biological control and insecticides, Isley was ahead of his time and laid the early foundations for IPM systems in the United States (Isley 1933; Johnson and Martin 2001; Klassen and Ridgway 2001).

5 Synthetic Organic Insecticide Era—1945–1996

5.1 Euphoria and the Crisis of Residues—1945–1955

The discovery and development of synthetic organic insecticides in the 1940s and 1950s revolutionized pest control in the southern U.S.A. The synthetic organochlorine insecticides quickly replaced calcium arsenate on cotton (Parencia et al. 1983). BHC, aldrin, dieldrin, chlordane and heptachlor were effective against boll weevils, but not against bollworms; when mixed with DDT, both weevils and worms were controlled. Toxaphene and endrin were effective against both pests. Soon, insecticides from the organophosphate class of chemistry became available and growers quickly began using them along with organochlorines to control insect pests. Methyl parathion, azinphosmethyl, demeton and EPN were some of the organophosphate insecticides used in cotton. Carbamate insecticides such as carbaryl were developed and used as well (Parencia et al. 1983).

Emulsifiable concentrate (EC) insecticide formulations were developed in 1948 (Parencia et al. 1983). EC formulations allowed farmers and aerial applicators to conveniently mix insecticides with water and apply them to crops in low-pressure, low-volume sprays. Foliar sprays were a significant improvement over the more drift-prone dust formulations from both efficacy and environmental contamination standpoints. In the late 1940s and 1950s, the standard approach to controlling pests became spraying weekly from squaring to near harvest (Whitcomb 1970; Newsom 1970). This approach was accepted by most entomologists of the day (Rainwater 1952; Gaines 1952, 1957; Curl and White 1952; Ewing 1952; Smith et al. 1976). The number of applications for cotton pests ranged from one or fewer per year in northern, dryland production areas to 18 or more in warmer, high-rainfall, and irrigated regions (Smith et al. 1964; Haney et al. 1996; Barker 2001; Boyd 2001). After World War II, cotton became the most heavily insecticide-treated crop in the U.S.A. (ARS 1976; Bottrell 1983). By the 1950s and 1960s, one third of the insecticides used in American agriculture were used on cotton (Brazzel et al. 1961; Knipling 1971; Perkins 1980). The majority of this insecticide use occurred in the southern U.S.A.

Overuse of synthetic organic insecticides followed the pattern seen after development and widespread adoption of calcium arsenate. The availability of highly effective insecticides generated exaggerated optimism among cotton growers in their new-found power to control pests with synthetic organic insecticides (Barducci 1972; Adkisson et al. 1982; Gould 2010; Tabashnik and Gould 2012). Grower optimism quickly led to over-use of the very effective, but largely single tactic, synthetic organic insecticide-based approach to pest control (Smith and Allen 1954; Stern 1969; Adkisson 1969, 1971, 1972; Smith and van den Bosch 1967; van den Bosch et al. 1971; Smith 1969, 1970, 1971; Doutt and Smith 1971; Newsom 1970). Unfortunately, chemical control methods were not often integrated with cultural and biological control methods, but instead supplanted them (Smith et al. 1976). Most major cotton growing areas—which were plagued with severe insect pest problems—came under a heavy blanket of insecticide (Smith et al. 1976). Optimism, over-use, and failure to integrate insecticides with ecologically-based pest management strategies were patterns which would be repeated again and again as each new technology became available. Each time, reliance on single-tactic pest management practices—even when several modes of action have been used—has been unsustainable (Stern et al. 1959; Metcalf and Luckman 1982; Persley 1996; Kogan 1998; Benedict and Ring 2004).

5.2 Confusion, Crisis of the Environment and Beginning of New Directions—1954–1972

Multiple concerns soon began to develop as a consequence of over-reliance on insecticides in the late 1940s and early 1950s. The confidence of southern U.S.A. cotton growers in insecticides as the solution to their cotton insect pest problems was shaken by the discovery of high levels of resistance to organochlorine insecticides in the boll weevil in Louisiana in 1954 (Roussel and Clower 1955) and in Texas the following year (Walker et al. 1956). Grower confidence was further weakened by the development of resistance to organochlorine insecticides (DDT) in the tobacco budworm and bollworm in the early 1960s (Brazzel 1963; Graves et al. 1963), and further loss of confidence in the system occurred with the development of organophosphate (methyl parathion) resistance in the tobacco budworm six years later in Texas (Nemec and Adkisson 1969; Ridgway and Lloyd 1983). Resistance was occurring in other pests on other crops as well. Banks grass mite, Oligonychus pratensis, on grain sorghum and corn became resistant to multiple miticides in the late 1960s and early 1970s on the Texas High Plains (Ward et al. 1972). By 1983, Parencia and co-workers noted that 25 species of insects and spider mites that attack cotton had developed resistance to organochlorine insecticides. At least one resistant pest species was found in each cotton producing state.

The increasing cost of the single-tactic approach to pest control was exacerbated by insecticide resistance, pest resurgence and secondary pest outbreaks (Smith and Allen 1954; Stern 1969; Adkisson 1972; Bottrell 1983). Control costs were an increasingly important concern and were directly related to grower over-reliance on insecticides. The National Cotton Council of America estimated the cost of insecticides and application on cotton at $260 million annually for 1970–1972 (Eichers et al. 1978). An estimated 64.1 million pounds (29.1 million kg) of insecticide were applied to control cotton insects in the United States each year during this period (Parencia et al. 1983).

The environmental costs of the heavily insecticide-driven pest management system on cotton and other southern crops were brought into national focus with the publication of Silent Spring by Rachel Carson in 1962. Her book marked the beginning of the environmental movement in the United States and led to a President’s Science Advisory Committee study and special report in 1963. The report found fault with a number of crop production chemicals (Smith et al. 1976). Environmental concerns led to the formation of the U.S. Environmental Protection Agency (EPA) in 1970. EPA banned the use of DDT—the insecticide at the center of the controversy—in 1972 (Parencia et al. 1983).

It was into these tumultuous times—highlighted by public and agricultural concerns about crop protection, production costs and the environment—that the southern corn leaf blight epidemic broke in 1970. Male-sterile hybrid seed corn production techniques had rendered 85 % of the U.S. corn crop vulnerable to the Bipolaris maydis fungus. Southern corn leaf blight destroyed 15 % of U.S. corn production in 1970 (Tatum 1971; NAS 1972; Ullstrup 1972). The southern corn leaf blight epidemic further increased concerns about modern agricultural methods and the stability of the food supply.

As pest and pesticide related problems continued, experts reviewed and debated the best course of action (Perkins 1983). The approach now known as integrated pest management (IPM) was judged the most likely to succeed (NRC 1981; Bottrell 1983). Over time, the IPM approach, which relied upon a broad suite of techniques—intelligent use of cultural practices, cultivars with resistance to pests, biological control, crop monitoring (scouting), and judicious use of pesticides only when pests reached economic thresholds—was embraced by public sector crop protection specialists. Soon—with the demonstrated success of the IPM approach—farmers and consultants adopted it as well.

Modern ecologically-based IPM arose from the observations and strategies of Townsend, Howard, Schwarz, Marlatt, Mally, Hunter, Hinds, Knapp, Coad and others. Their work helped cotton growers in the years just before and shortly after the turn of the 20th century to avoid some of the destruction caused by the boll weevil. Conceptually, their work laid the foundation for the development of IPM in the U.S. Other scientists who worked on the boll weevil—Isley, Baerg, Sanderson and others—further developed IPM concepts as they began integrating ecologically-based strategies with insecticide-based strategies. Crop scouting, treatment thresholds, trap crops, spot treatments and conservation of natural enemies were products of their research and extension work.

The concept of integrated control was first articulated by Smith and Allen (1954) (Smith et al. 1976). Stern et al. (1959) are widely credited with having first provided the theoretical basis and applied methodology for holistic IPM (Castle and Naranjo 2009). They developed the theoretical basis for control decisions and popularized the concepts of the economic injury level and the economic threshold. The integrated control concept was later broadened to include all pest management methods (Smith and Reynolds 1965). Still later it was extended to include management of all classes of pests—plant pathogens, insects, nematodes and weeds (Smith et al. 1976). Modern multidisciplinary IPM is developed and implemented by teams of scientists who operate with a holistic view of pest management problems and tactics. IPM teams operate most effectively when they work and think in ways that consider the agro-ecosystem and the pests within it from a broad ecological perspective. From this perspective, they are able to develop multidisciplinary management programs featuring ecologically-based solutions that address primary pest concerns and keep other pests in check without damaging production systems, causing unnecessary environmental damage, or limiting agricultural production (Smith et al. 1976).

5.3 Changing Paradigms—1968–1996

The period of nearly three decades—1968–1996—was a time of concentrated public and private sector investment in IPM. During this period, grower adoption of IPM provided benefits both on the farm and to society in the South and throughout the U.S.A.

Scouting services were embryonic in the 1940s and 1950s. They consisted primarily of checking to see if the insecticides that had been applied had worked. Little attention was given to thresholds, beneficial insects, resistant varieties, cultural practices, etc. As resistant insect pests evolved and insecticide costs increased, producers became more aware of the need for professionals to help them make decisions (Head 1983). State Cooperative Extension Services' initial efforts to initiate IPM programs in cotton began in 1967 (Young 1983; Canerday 1983). Extension agents—deployed at the one- to five-county level—developed integrated pest management programs which emphasized beneficial insects, use of cultural practices, diapause and overwintering boll weevil control, individual field scouting, and use of selective insecticides only when economic thresholds were met or exceeded. Federal funding for research and extension IPM programs began in 1972 with the Huffaker Project, 1972–1978 (Huffaker and Smith 1980). It made organizing and implementing pilot IPM programs across the cotton belt possible. Further funding in 1975 allowed for program expansion. EPA, NSF and USDA funded the 17-university Consortium for IPM (1979–1985), which further developed state IPM research and extension efforts (Frisbie 1985a). Farmers rapidly adopted IPM programs and accepted guidance from extension agents (Young 1983). Scouted cotton acreage increased rapidly across the cotton belt (Lambert 1983; Canerday 1983).

Research teams quickly developed and improved IPM tactics and systems. Economic thresholds, monitoring methods, pest resistant crops, pest suppression systems, crop and pest modeling and forecasting, and improved biological control techniques were achievements of the research efforts (Frisbie 1985a).

Extension teams provided grower funded scouting programs which informed producers about pest populations and natural enemy levels in individual fields, and informed farming communities about pest and natural enemy trends. They worked with pest management technologies (pesticides, pest resistant crop technologies, cultural controls, biological controls, etc.) and demonstrated the best use of technologies for local crop production systems. Their work emphasized integration of ecologically-based and pesticide-based technologies with goals of reducing the environmental and human health risks associated with crop production, and improving farm profits.

As a result, insecticide use began to decline (Lambert 1983; Adkisson et al. 1985) and the number of scouted cotton fields increased (Corbet 1981; Pimentel et al. 1992; Parvin et al. 1994). Participating farmers realized higher yields, lower risks and greater profits of approximately $333 per hectare ($135 per acre) compared with non-participants in Mississippi (Parvin et al. 1994).

The investment of Federal funds in state extension IPM programs had a positive impact on the crop consulting industry. In 1972 there were an estimated 61 crop consultants practicing in the entire cotton belt. By 1982 the number of practicing consultants had increased dramatically to an estimated 571. Mississippi, Texas, California and Louisiana had greater numbers of consultants than other cotton belt states, and sixty-three former extension employees were working as consultants (Head 1983). Acreage scouted by consultants had grown from an estimated 401,500 acres (~162,481 ha) in 1972 to 2.2 million acres (~0.9 million ha) in 1982 (Lambert 1983). Producer support for private consultants increased from $430,000 in 1972 to ~$7 million in 1980 (Blair 1983). By 1983 a substantial portion of U.S. cotton land was regularly monitored by private, college-trained pest management consultants who offered a wide range of services including soil fertility analysis and recommendations, crop variety selection, pest advice, pesticide application and alternate control methods (Bottrell 1983). By 1983, an estimated 2.75 million hectares (6.8 million acres) of cotton were in either a private or university sponsored IPM program. The grower cost for IPM was estimated at $14.3 million per year, while the economic benefit was estimated to be greater than $133 million per year—$9.30 for every dollar invested (Smith 1983).

Fuchs et al. (1997) conducted an extensive survey of Texas producers on their adoption and use of IPM. The survey team received 1,552 responses. Sixty-four percent of growers met the survey's pre-determined definition of IPM users. Farmers managing 68 % of the land used survey-defined IPM practices to suppress pests. Eighty-eight percent of farmers used economic thresholds, and 84 % of acres were scouted. Fifty-one percent of acres were treated, and 69 % of growers considered the impact of treatment on natural enemy populations before they applied an insecticide. Thirty-seven percent of all insecticide applications targeted boll weevil and 36 % targeted bollworms (a total of 73 % of treatments targeting either boll weevils or bollworms). Cotton fleahopper was the target for 10 % of the insecticide treatments, while aphids were the target for 8 % and thrips for 9 % of the treatments.

During its first 50 years, IPM in cotton was predominantly a field-based approach, with monitoring and decision-making conducted on a field-by-field basis (Brewer and Goodell 2012). Notably, area-wide IPM also had its early roots in the struggle to manage the boll weevil in the early years of the twentieth century. Mally and other early scientists were proponents of area-wide stalk destruction for boll weevil population management as part of the suite of management tactics early farmers called the Government Method. Their concept of area-wide pest management for cotton—area-wide stalk destruction—is still mandated by state law in many southern states and is practiced to this day. Selection of cotton varieties for earliness and use of early fruiting varieties on an area-wide basis, along with cultural practices to promote earliness—tactics from Mally's ecologically-based management suggestions—were major components of the IPM cotton production systems in use 60–100 years later (Adkisson et al. 1982; Walker and Niles 1971; Frisbie 1985b). The area-wide management philosophy was further developed in Texas by Ewing and Parencia (1949, 1950). They developed community-wide, early-season programs to control overwintering boll weevils with the least possible disruption of natural enemies. In Arkansas, the area-wide IPM concept was the central paradigm for community bollworm management programs. Initially conceived and operated by the Arkansas Cooperative Extension Service, these programs began in 1976, and by 1983 they were operational on over 32,000 hectares (80,000 acres), involving over 200 cotton producers (Cochran et al. 1985). By 1985, thirty percent of the cotton in Arkansas had IPM activities conducted through one of seven community IPM programs (Frisbie 1985b). Economic surveys indicated participating growers enjoyed benefits of 67 kg/ha (60 lbs/acre) higher lint yield and 4.2 fewer insecticide applications (Frisbie 1985b).

Area-wide boll weevil diapause control programs were conducted in many southern states in the 1960s and 1970s (Allen 2008). The largest of these was conducted on the Texas High Plains from 1963 to 1997. For 34 years the program prevented boll weevil infestation of the 1.3 million hectares (3.2 million acres) of cotton on the Texas High Plains (Frisbie 1985b). Economic evaluation of the program indicated it prevented the loss of 75–125 million bales of cotton and prevented the use of 3.6–9 million kilograms (8–20 million pounds) of insecticide per year. By preventing the establishment of the boll weevil on the Texas High Plains, $12–$20 million per year in increased production costs was avoided (Lacewell et al. 1974). In the end, however, the program failed due to mild winters and the establishment of 1.7 million hectares (4.2 million acres) of USDA Conservation Reserve Program grasses which served as boll weevil overwintering sites (Leser et al. 1997; Stavinoha and Woodward 2001).

In corn, crop rotation—conducted on a field-by-field basis, but adopted by producers on an area-wide basis—was effective for many years in reducing populations of western and Mexican corn rootworms. In spite of the non-chemical nature of the tactic, its widespread use exerted significant selection pressure on corn rootworms. This selection resulted in the development of western and Mexican corn rootworm biotypes which laid their eggs in non-host crops in the fall, enabling the emerging larvae to infest corn as fields were rotated back into corn production the following spring (Chandler et al. 2008).

The successes of area-wide approaches to the management of insect pests led Dr. Edward Knipling to develop the Total Population Management (TPM) concept. Following successful application of the area-wide TPM concept in eradication of the screwworm (Cochliomyia hominivorax) from the southern U.S.A. (Klassen and Ridgway 2001), Knipling believed the concept could be used to eradicate the boll weevil (Knipling 1966, 1967, 1968). He thought that the boll weevil was a good candidate for eradication because it had one host plant throughout most of its range in the U.S.A. His success with screwworm emboldened the cotton grower leadership to accept and embrace the idea that the boll weevil could be eradicated.

Knipling, Robert Coker and J.F. McLaurin led discussions with the National Cotton Council, which passed a resolution in 1958 to develop the technology to eradicate the boll weevil from U.S. cotton fields (Knipling 1956; Coker 1958). This began an intensive effort to fund USDA Agricultural Research Service (ARS) efforts to develop the biological and technical tools that would be needed for boll weevil eradication. In 1960, the U.S. Congress appropriated $1.1 million for construction of the USDA ARS Boll Weevil Research Laboratory on the campus of Mississippi State University. Table 5.1 provides information about the key efforts and advancements which enabled private-public partnerships to successfully conduct boll weevil eradication in the southern U.S. (Davich 1976; McKibben et al. 2001; Allen 2008).

Table 5.1 Critical technologies for boll weevil eradication and the periods of development

Following the Pilot Boll Weevil Eradication Trial (1971–1973) in Louisiana, Mississippi and Alabama and the Boll Weevil Eradication Trial (1978–1980) in northeastern North Carolina and southeastern Virginia, the national boll weevil eradication program began in the USA in 1983. The program began in southern North Carolina and South Carolina in the east and, two years later, in California and Arizona in the west (Ridgway and Mussman 2001; Dickerson et al. 2001; Harris and Smith 2001; Clark 2001; Neal and Antilla 2001; Roof 2001; Allen 2008). It has resulted in eradication of the boll weevil from all U.S. cotton except approximately 60,000 hectares (150,000 acres) near the Rio Grande in South Texas. Boll weevil eradication programs in Mexico have eliminated the pest from the majority of cotton producing lands, including the primary production areas in northwestern Mexico. In Texas, the net cumulative economic benefit of boll weevil eradication from 1998 to 2010 has been $1.9 billion (McCorkle 2011).

In the southwestern U.S., a program to eradicate the pink bollworm has also been successful. Through this area-wide effort, pink bollworm has been eradicated from all cotton producing areas in Texas, New Mexico, Arizona, California and northwestern Mexico in which it was previously a significant pest (Personal communication, L. E. Smith 2012; Liesner et al. 2011). Pink bollworm programs have used a number of tactics including Bt cotton, pheromone mating disruption, insecticides and sterile insect releases (Smith et al. 2012).

Together, area-wide boll weevil and pink bollworm eradication programs have transformed IPM in southern states and have produced highly positive economic impacts for cotton growers and local economies in the region. In addition, they have greatly reduced the need for insecticides, providing significant environmental and human health benefits.

Highly effective pyrethroid insecticides became available to U.S. cotton producers in 1977 (Bierman 1983). Once again, growers developed great confidence in the new technology and their use of pyrethroid insecticides soared. On average, 8 applications per year were made on cotton in higher use areas. From one to five Heliothis/Helicoverpa generations were treated annually during the late 1970s and 1980s (Bacheler 1985).

By 1985—eight years after the pyrethroids became available—field-evolved resistance was reported in tobacco budworm (Heliothis virescens) populations in West Texas (Allen et al. 1987). Resistance was confirmed in the laboratory in both South and West Texas tobacco budworm populations (Plapp et al. 1987). During the next 10 years, pyrethroid resistance in tobacco budworm spread gradually through the South (Graves et al. 1989, 1991, 1992; Elzen 1995; Elzen et al. 1992, 1997; Hasty et al. 1997).

Insecticide resistant cotton aphids were another cause for grower concern during the late 1980s and early 1990s. During this period, widespread resistance to multiple classes of insecticides developed in the cotton aphid across much of the South (Allen et al. 1990; Hardee and O'Brien 1990; Kerns and Galor 1991; Reed and Grant 1991; Bagwell et al. 1991; Johnson and Studebaker 1991; Harris and Furr 1993; Layton et al. 1996a). A few years later, tarnished plant bug populations continued the trend; they, too, developed resistance to multiple classes of insecticides (Snodgrass and Elzen 1995; Snodgrass and Scott 1996; Luttrell et al. 1998; Russell et al. 1998).

Resistance management plans were developed by research and extension entomologists with the goal of sustaining the efficacy of pyrethroids and other insecticide chemistries against tobacco budworm, cotton aphid and tarnished plant bug. They were modeled after plans developed in Australia to preserve pyrethroid efficacy against Heliothis armigera (Sawicki and Denholm 1987; Sawicki 1989). The plans emphasized earliness, use of field scouting and economic thresholds to determine the need for field treatment, use of alternative insecticide classes, and tank mixes of insecticides from different classes (Fuchs 1994; Bagwell 1996). Extension Service promotion of resistance management plans and widespread grower and consultant adherence to them sustained the effectiveness of pyrethroids for tobacco budworm and of insecticides for cotton aphid and tarnished plant bug from the late 1980s until the mid-1990s, when Bt transgenic cotton varieties and novel insecticides for cotton aphid and tarnished plant bug became available (Colburn 1994; Allen 1995; Graves et al. 1995; Bagwell et al. 1991; Bagwell 1996; Furr and Harris 1996; Layton 1994).

The suppression of natural enemies through repeated use of insecticides resulted in pest resurgence and increasing secondary pest problems. This, along with increasing insecticide resistance in primary pests, led to an escalation in insecticide use in the late 1980s and 1990s. Grower treatments were made during the same years that broad-spectrum malathion treatments were being applied by boll weevil eradication programs. Natural enemy populations were reduced and pest management systems became unstable, resulting in severe pest outbreaks. Beet armyworm, Spodoptera exigua, outbreaks occurred in Alabama, Georgia, Mississippi, Florida, South Carolina and Texas during the period 1988–1998 (Sprenkel and Austin 1996; Mascarenhas et al. 1998). Because insecticides were mostly ineffective, high control costs and significant crop loss resulted in outbreak areas (Summy et al. 1996; Sparks et al. 1996). Serious tobacco budworm outbreaks occurred as well, with similar outcomes. Cotton growers in Alabama, Mississippi and Tennessee experienced highly damaging tobacco budworm outbreaks in 1993, 1994 and 1995 (Layton et al. 1996b; Williams and Layton 1996). And, in 1991, farmers on the Texas High Plains experienced insecticide-induced outbreaks of resistant cotton aphids (Leser et al. 1992).

6 The Era of Genetically Modified Crops—1996 to Present

6.1 Insect Resistant, Bt Transgenic Crops

In 1996—one year after serious beet armyworm outbreaks in Texas and tobacco budworm outbreaks in Mississippi, Alabama and Tennessee—Bt transgenic cotton and corn first became available in the U.S.A. Not surprisingly, grower adoption of Bt cotton was rapid (Benedict and Ring 2004; Luttrell et al. 2012). Adoption continued to increase for several years, and by 2011, Bt cotton plantings comprised greater than 95 % of land planted to cotton in most production regions of the USA (Luttrell et al. 2012). Over 58 million hectares of Bt crops, primarily cotton and corn, were planted in the U.S.A. in 2010 (James 2010).

In the first year of Bt cotton use, bollworm populations caused crop damage (Carter et al. 1997; Lambert 1997; Pitts et al. 1999). Damage occurred mid-season in Texas and the Mid-South as bollworms fed on blooms and small bolls deep within the crop canopy. Pyrethroid insecticides were applied to control the worms in some fields. In general, however, in spite of minor to moderate crop losses in some areas, the single-toxin Bt transgenic cotton performed well (Benedict and Ring 2004; Naranjo and Ellsworth 2010; Duke 2011). Caterpillar control and resistance management were improved by the introduction of dual-toxin Bt cottons in 2002 (Greenplate et al. 2002; Bacheler and Mott 2003; Catchot and Mullins 2003; Hagerty et al. 2003).

Resistance management was part of the agreement when farmers purchased Bt seed, and it was part of the EPA label for Bt crops. EPA considered Bt proteins to be plant-incorporated protectants (PIPs), resulting in their being regulated, and refuges of non-Bt crops were required. The refuge requirements were based on the ability of a transgenic Bt plant to deliver a high toxin dose. EPA categorized "high dose" as toxin concentrations high enough to kill at least 99.99 % of susceptible insects in the field—survival of less than 0.01 % of larvae on Bt plants compared to larval survival on non-Bt plants (EPA 1998; Tabashnik and Gould 2012). Modeling projected that a high dose, teamed with non-Bt refuge plantings producing adults which had not been selected for Bt resistance, could forestall resistance development, resulting in enhanced sustainability of the technology. In theory, refuges for target pests in which the high dose definition is not met should be larger than those for pests in which the high dose threshold is met (Gould 1998; Carrière and Tabashnik 2001; EPA 2002; Tabashnik et al. 2004, 2008, 2009). Availability of additional non-Bt refuge introduces greater numbers of non-selected adults into the environment, and increased percentages of non-selected moths are needed to slow resistance in pests that do not meet the high dose threshold. Unfortunately, neither bollworm nor western corn rootworm, Diabrotica virgifera, meets the high dose threshold (Tabashnik and Gould 2012). Refuge requirements may not have been set high enough, and these pests have proven problematic in Bt cotton and Bt corn (Ali et al. 2006; Luttrell et al. 2004; Porter et al. 2012; Tabashnik et al. 2008, 2009, 2010, 2012).
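The refuge logic can be illustrated with a minimal, single-locus selection model. The sketch below is hypothetical and not drawn from the cited studies or EPA models: it assumes one resistance allele that is fully recessive, random mating, non-overlapping generations, and a high-dose toxin that kills all susceptible and heterozygous larvae on Bt plants, so that the refuge fraction acts as the survival rate of susceptible genotypes.

```python
# Hypothetical sketch of refuge-based resistance management under the
# high-dose assumption; all parameter values are illustrative only.

def generations_to_resistance(refuge, p0=0.001, threshold=0.5):
    """Generations until the resistance allele frequency exceeds threshold.

    refuge: fraction of the landscape planted to non-Bt crop (assumption).
    p0:     initial resistance allele frequency (assumption).
    """
    p = p0
    for gen in range(1, 100_000):
        q = 1.0 - p
        rr, rs, ss = p * p, 2 * p * q, q * q  # Hardy-Weinberg genotype frequencies
        # High dose: heterozygous (rs) and susceptible (ss) larvae survive
        # only on refuge plants; resistant homozygotes (rr) survive everywhere.
        w_rr, w_rs, w_ss = 1.0, refuge, refuge
        mean_w = rr * w_rr + rs * w_rs + ss * w_ss
        # Resistance allele frequency after one round of selection
        p = (rr * w_rr + 0.5 * rs * w_rs) / mean_w
        if p > threshold:
            return gen
    return None

for refuge in (0.05, 0.20, 0.50):
    print(f"{refuge:.0%} refuge -> ~{generations_to_resistance(refuge)} generations")
```

Under these assumptions, shrinking the refuge from 50 % to 5 % accelerates resistance roughly tenfold; and if the high-dose assumption is relaxed so that heterozygotes survive on Bt plants, resistance evolves much faster at any given refuge size, which is why larger refuge percentages are needed for pests such as bollworm and western corn rootworm.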

In cotton, one or more bollworm sprays on two-gene Bt cotton have produced yield increases in recent years. These treatments are commonly made by growers in the Mid-South and southeast (Greene et al. 2011; Jackson et al. 2012; Luttrell et al. 2012; Lorenz et al. 2012).

In corn, western corn rootworm caused "greater than expected damage" to Cry3Bb1 corn in 2009. By 2011, damage to transgenic Cry3Bb1 hybrids had been reported in Illinois, Iowa, Minnesota, Nebraska and South Dakota. Field-evolved resistance in western corn rootworm is believed to be the reason for the damage (Gassmann et al. 2011). In 2012, the North Central Coordinating Committee NCCC46, consisting of entomologists from land grant institutions and USDA-ARS, wrote a letter pointing out the need for more effective refuge requirements to preserve the effectiveness of Bt toxins against western corn rootworm (Porter et al. 2012).

In spite of the inability of Bt cotton to completely control higher populations of bollworm in some locations, the technology has transformed pest management on cotton in the South. Target pests have been brought under almost complete control. And secondary pest outbreaks—which in the past developed due to the use of insecticides against primary pests and the resulting loss of natural enemies—have been almost completely eliminated (Turnipseed et al. 2001; Shelton et al. 2002; Naranjo and Ellsworth 2003; Head and Dively 2004). Insecticide applications have been reduced by 50–60 % by the combination of Bt cotton, boll weevil eradication and other advances since 1996 (Roush and Shelton 1997; Chilcutt and Johnson 2004; Naranjo 2011). A compilation of the Beltwide Cotton Conference cotton insect loss estimates from southern states (Table 5.2) demonstrates how insect losses have been reduced. State losses to insects for the period 1980–1995 averaged 7.51 %, while losses for the period 1996–2011 averaged 5.17 %; losses after 1996 were thus reduced by 31 %. Similarly, the number of insecticide applications for the period 1986–1996 averaged 5.61 applications per hectare, compared to an average of 2.98 applications per hectare for the period 1996–2011, a 47 % reduction (Hamer 1981; Head 1982–1998; Williams 1999–2012).

Table 5.2 State average losses to insects and insecticide applications on cotton before Bt cotton became available in 1996 compared with the years since 1996, when Bt cotton was widely used. (Sources: Hamer 1981; Head 1982–1998; Williams 1999–2012)
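The percentage reductions quoted above follow directly from the period averages; a quick check, using only the figures given in the text, reproduces them:

```python
# Verify the reductions computed from the Table 5.2 period averages
# quoted in the text (figures from the paragraph above).
loss_before, loss_after = 7.51, 5.17      # average % crop loss to insects
sprays_before, sprays_after = 5.61, 2.98  # average insecticide applications/ha

print(f"loss reduction:  {(loss_before - loss_after) / loss_before:.0%}")       # 31%
print(f"spray reduction: {(sprays_before - sprays_after) / sprays_before:.0%}")  # 47%
```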

An analysis by the National Center for Food and Agricultural Policy concluded that the quantity of insecticide active ingredient applied to cotton in the USA declined from 0.41 kg/ha in 1995 (the year before the commercial introduction of Bt crops) to 0.13 kg/ha in 2000, a 68 % reduction (Carpenter et al. 2004). In China and India even greater reductions in pesticide use have been seen since Bt cotton became available (Duke 2011). Commercialization of Bt crops worldwide has led to a reduction of 140 million kg of insecticide active ingredient on cotton and a 352 million kg reduction of insecticide active ingredient use on all crops between 1996 and 2008 (Brookes and Barfoot 2010; Naranjo 2011). The reduction in pesticide use has provided significant environmental and human health benefits (Yu et al. 2011).

Reduced pesticide use due to elimination and suppression of key cotton pests—through boll weevil eradication and the use of transgenic Bt cotton—has had overwhelmingly positive impacts on cotton production economics and the environment. But negative impacts have occurred as well. In the reduced insecticide environment, the pest status of sucking bugs has increased, resulting in an increase in the insecticide treatments needed to control them. The emergence of stink bugs in the southeast (Greene and Herzog 1999; Roof and Bauer 2002; Steede et al. 2003; Ottens et al. 2005; Greene et al. 2005) and tarnished plant bugs in the Mid-South (Luttrell et al. 1998; Johnson et al. 2001; Layton et al. 2003) has required field monitoring and multiple, timely applications of insecticides. Even with the increased treatment for sucking bugs, the total insecticide load on cotton has been greatly reduced by Bt cotton and boll weevil eradication. In addition to insecticide reduction, the adoption of Bt transgenic crops is estimated to have saved 125 million liters of fuel and avoided emission of 344 million kg of CO2 worldwide (Brookes and Barfoot 2010).

Negative effects of Bt crops on natural enemies have been reported, but comprehensive studies have documented that Bt proteins do not pose direct hazards to natural enemies (Gould 1998; Benedict and Altman 2001; Naranjo 2011). Bt crop systems rely less on insecticides, allowing farmers to take greater advantage of natural enemies (Head et al. 2001; Benedict and Ring 2004; Naranjo 2011). Natural enemy populations are typically higher in untreated Bt cotton than in non-Bt cotton treated with insecticides (Marvier et al. 2007; Wolfenbarger et al. 2008; Naranjo 2009, 2011). Production systems which include pest resistant cultivars and maintain effective natural enemy populations have greater sustainability because of the increased pest mortality from natural enemies. Bt transgenic systems allow predators and parasites to respond naturally—in a density dependent manner—to the presence (or increase) of pest populations (Benedict and Ring 2004). Mortality due to natural enemies reduces selection pressure on Bt insecticidal proteins, slowing the development of pest resistance to Bt cotton (Van Emden 1991; Carrière and Tabashnik 2001).

The impact of Bt crops on non-target organisms has been thoroughly examined. The world literature on the subject was summarized by Yu et al. (2011). They found that Bt crops do not cause apparent, unexpected, detrimental effects on non-target organisms or their ecological functions. Bt proteins do not accumulate in soils (Head and Dively 2004). Since Bt protein toxins are only toxic to one or two insect orders, their action is much more targeted than that of most insecticides (de Maagd et al. 1999). The proteins kill some major crop pests, but they cause little or no harm to most other organisms—including humans (Mendelsohn et al. 2003; National Research Council 2010).

6.2 Nematodes and Thrips

For a number of years, treatments for nematodes and thrips have been preventative and area-wide in many areas of the southern U.S. cotton belt. Aldicarb (Temik 15G) was the product of choice for both pest complexes and performed well for many years. A USDA-NAPIAP study in 1993 concluded that aldicarb was the single most valuable pesticide for U.S. cotton growers (Anonymous 1993). Bayer CropScience announced in the fall of 2010 that the marketing of aldicarb would end in the USA in 2014, and EPA declared that its use on all crops would end no later than August 2018. At the farm level, however, aldicarb was unavailable for the 2011 and 2012 seasons. Without aldicarb, cotton vulnerability and losses to nematodes and thrips were expected to increase, and widespread use of preventative seed-treatment and in-furrow, at-planting treatments to control these pests was expected to continue, involving multiple products for nematode and thrips control (Siders 2011).

Following the loss of aldicarb, cotton growers' use of seed-treatment and in-furrow pesticides continued the preventative and area-wide use pattern seen previously when aldicarb was available. Seed treatments and in-furrow treatments are used to control nematodes (primarily southern root-knot nematode, Meloidogyne incognita, and reniform nematode, Rotylenchulus reniformis) and thrips (predominantly tobacco thrips, Frankliniella fusca, in most of the cotton belt, and western flower thrips, Frankliniella occidentalis, in West Texas). Nationally, some 950,000 bales of cotton per year are lost to nematodes (Davis 2011; Blasingame 1999–2010) and thrips losses have averaged 121,094 bales per year since 2000 (Williams 2001–2012). Annual cotton yield losses to nematodes in the U.S.A. range from two to seven percent of the crop (Haygood et al. 2012). Based on losses at these levels, the $7.2 billion 2011 cotton crop (NASS 2012) in the U.S.A. suffered losses from nematodes of between $144 million and $504 million. And, thrips losses in the U.S.A. have averaged $30.8 million per year (Williams 2001–2012; NASS 2012). Nematode and thrips losses occurred primarily in the southern USA.
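The nematode loss range quoted above is a straightforward product of the stated crop value and the two-to-seven percent loss estimate, as a quick computation confirms:

```python
# Reproduce the nematode loss range from the figures given in the text.
crop_value = 7.2e9      # value of the 2011 U.S. cotton crop, USD (NASS 2012)
low, high = 0.02, 0.07  # estimated annual fractional yield loss to nematodes

print(f"${crop_value * low / 1e6:,.0f} million to "
      f"${crop_value * high / 1e6:,.0f} million")  # $144 million to $504 million
```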

Management to reduce losses from root-knot and reniform nematodes has evolved, post-aldicarb, to an increasingly integrated approach. Improved laboratory methods using PCR to quantitatively determine the number of reniform nematodes in soil samples have the potential to reduce the lab time and the cost of evaluating nematode infestation levels (Showmaker et al. 2012). Multi-temporal remote sensing technologies with information delivery via the internet can provide cotton growers with information on the degree of reniform nematode infestation in fields—without the necessity of taking or processing soil samples (Palacharla et al. 2012). Systems using various methods, including digital elevation modeling, soil electrical conductivity, normalized difference vegetative index, yield maps and geographic information system referenced nematode sampling, have been developed, allowing growers to specifically target the areas of infested fields which can respond to fumigation treatment. The use of fumigants has increased in recent years (Overstreet et al. 2011, 2012; Allen et al. 2012; Haygood et al. 2012; Norton et al. 2012). In addition, cotton growers are using other nematode management techniques such as nematode tolerant varieties (Anderson et al. 2011; Wheeler et al. 2012) and rotation to non-host crops such as peanuts and grain sorghum (Overstreet and Kirkpatrick 2011). Highly nematode resistant cotton lines have been identified and work is underway to develop elite, nematode resistant varieties (Davis 2011; Nichols 2012).

To minimize losses from thrips, growers across the South are increasingly adopting preventative seed-treatment and in-furrow insecticides (Akin et al. 2011, 2012; Griffin et al. 2012; Nino and Kerns 2010; Herbert et al. 2012; Roberts et al. 2012). In most cases, foliar sprays have not increased yields of cotton which has been protected with in-furrow or seed-treatment thrips control insecticides. But, when thrips infestations are high or early season growing conditions are wet and/or cool, foliar thrips treatments can increase cotton yields (Roberts 2012; Akin et al. 2011, 2012). It is not uncommon for producers to make foliar applications for thrips control at specific growth stages, or with herbicide applications, regardless of thrips numbers or damage potential (Akin et al. 2011, 2012). The consensus of extension entomologists in the South is that foliar sprays following seed treatment or in-furrow insecticide applications should only be made on the basis of the presence of thrips above treatment thresholds (Akin et al. 2011, 2012; Roberts et al. 2012). Thrips resistant cotton lines have been identified and breeding for resistant varieties is on-going (Arnold et al. 2010).

Trends for nematode and thrips management have been, and very likely will continue to be, preventative and area-wide.

6.3 Innovations in Managing Mycotoxins and Plant Diseases

6.3.1 Mycotoxins

In recent years plant pathologists and crop protection specialists have made a number of significant advances in the management of long-unsolved plant disease and mycotoxin issues affecting agriculture in the southern U.S.A. The development of technology which has allowed farmers to reduce aflatoxins in corn, peanut, and cotton seed has been critically important to their ability to grow these crops profitably and produce crops which can be safely consumed by humans and animals.

Aflatoxins—a group of mycotoxins which are very important in the southern USA—are extremely toxic compounds produced by some biotypes of Aspergillus flavus and A. parasiticus. These ubiquitous fungi infect many crops (Diener et al. 1987; Cotty 1994). They have been problematic in the hot, humid growing conditions typically present in the South. Hot years with low rainfall often result in aflatoxin contamination of corn and other crops.

Aflatoxins are considered among the world's most serious food safety problems. They were first identified in the 1960s following the death of a large number of turkeys in Britain. Scientists studying that incident found high levels of aflatoxins in imported peanut meal used in the turkey feed (Robens 2008). The presence of aflatoxins in human food causes both acute and chronic effects—aflatoxicoses—ranging from immune system suppression, to growth retardation, and cancer. Human deaths can result from acute poisoning (Gong et al. 2002; Wild and Turner 2002). Occasional outbreaks of aflatoxin poisoning in humans have occurred in Africa. In 1966–1967 and 2004, outbreaks of aflatoxin poisoning in Uganda and Kenya, respectively, caused human illness and death. In both instances, consumption of highly contaminated corn—which was produced during drought years and stored improperly—was identified as the cause. Those affected developed jaundice, after which the mortality rate was high (Probst et al. 2007). High concentrations of aflatoxin in human food have been positively associated with the incidence of liver cancer (Wild and Hall 2000). In the U.S.A., health risks to humans, livestock and wildlife, and the reduced profitability of contaminated crops have strongly motivated farmers to prevent the formation of aflatoxins in the field (Park et al. 1988).

Many countries have implemented regulations which limit the concentration of aflatoxins allowable in food and feeds (Haumann 1995). In the U.S.A., the U.S. Food and Drug Administration (FDA) limits aflatoxins in food or feed in interstate commerce to 20 ppb and in milk to 0.5 ppb (Brown et al. 1991; Gourama and Bullerman 1995). Crops containing over 100–300 ppb cannot be legally fed to animals in the U.S.A. (FDA 2012).

Economic losses to farmers who have produced aflatoxin contaminated corn (or other commodities) are high. Costs of monitoring, research and lost sales are estimated at between $0.5 and $1.5 billion annually (Robens and Cardwell 2003; Bruns and Abbas 2006). During a drought in 1998, losses from the inability to market aflatoxin contaminated corn in Texas, Arkansas, Mississippi and Louisiana were estimated at $85 million (Williams et al. 2003; Abbas et al. 2002, 2006).

Following widespread aflatoxin contamination of crops in the U.S. Corn Belt in 1988, commodity groups pushed for additional funding for research (Cole and Cotty 1990; Robens et al. 1990). This resulted in the formation of the Multi-crop Aflatoxin Working Group, composed of land grant university and USDA scientists, to address the aflatoxin problem in cotton, corn, peanuts and tree nuts (Robens et al. 1990). Stakeholder groups provided multi-crop funds, which totaled $750,000 in 2007. That year, 28 research/extension projects were funded—ten on corn, nine on peanut, four on cotton seed, and four on tree nuts. USDA-ARS funding for aflatoxin research increased 3.75-fold from 1982 to 2006 (Robens 2008) (Table 5.3).

Table 5.3 USDA-ARS funding for aflatoxin research 1982–2006

Cultural practices such as manipulation of planting dates to avoid heat/water stress during kernel filling (Abbas et al. 2007), manipulation of harvest dates (Bock and Cotty 1999), improving irrigation practices (Russell et al. 1976), improving harvest methods (Russell et al. 1981), and improving storage practices (Batson et al. 1997) have been shown to reduce aflatoxin contamination of agricultural products. Furrow-diking of fields was a specific irrigation practice which reduced aflatoxin in southeastern U.S.A. peanut, cotton and corn fields (Nuti et al. 2007). And, prevention of root infection by the peanut root-knot nematode was shown to reduce aflatoxin contamination in peanut (Timper and Holbrook 2007).

Infestation by insect pests can increase the levels of mycotoxins—primarily aflatoxins and fumonisins (from Fusarium spp.). Insects carry fungal spores and cause damage which permits entry of the fungus. One of the benefits of Bt corn has been a reduction of insect damage and lower aflatoxin and fumonisin contamination (Benedict et al. 1998; Munkvold et al. 1999; Dowd 2001; Bakan et al. 2002; Williams et al. 2002; Hammond et al. 2003; Williams et al. 2004; Wiatrack et al. 2005; Bruns and Abbas 2006). Planting of Bt corn has resulted in an estimated $23 million per year benefit from reduced aflatoxin and fumonisin contamination of the crop (Wu 2006).

Competitive displacement of toxigenic fungi is a novel biocontrol strategy to reduce aflatoxin contamination. This IPM strategy was pioneered by USDA-ARS scientists and strongly supported by grower groups and agricultural business partners (Cotty et al. 2007; Robens 2008). Public sector research scientists found that certain lines of A. flavus do not produce aflatoxin. They theorized that these atoxigenic strains could be used to competitively displace and exclude the naturally present aflatoxin-producing strains. After collecting and characterizing more than 10,000 isolates of the fungus (Cotty 1994), a competitive, atoxigenic line was selected—AF36. It could be produced simply on wheat seed, was stable in storage and when applied to fields, and could remain dormant until conditions became conducive for growth of the fungus. It was easy to use and easy to transport (Cotty et al. 2007). When applied to fields contaminated with toxigenic strains of the fungus, relatively small quantities of atoxigenic strains shifted the composition of A. flavus communities without increasing either the quantity of the fungus on the crop or the amount of the crop infected. A single application of 11.1 kg/ha (10 lbs./acre) of colonized wheat seed can produce significant shifts in A. flavus communities and has been demonstrated to change populations from 1–2 % atoxigenic strains to 80 % atoxigenic strains (Cotty et al. 2007). K-49, another atoxigenic A. flavus strain, isolated from corn kernels in Mississippi, has been shown to reduce aflatoxin contamination of crops by 67–94 % (Abbas et al. 2006). Afla-Guard®, a third atoxigenic A. flavus strain, developed and labeled through research funded by the National Peanut Research Laboratory, has shown positive results as well (Dorner 2004).
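The application arithmetic above can be made concrete with a short worked example. The sketch below is illustrative only: the 65 ha field size is a hypothetical value, the conversion factors are standard, and the before/after strain frequencies are simply the figures cited from Cotty et al. (2007).

```python
# Illustrative arithmetic for the atoxigenic-strain application rate
# cited in the text. The field area is a hypothetical example value.

LB_PER_KG = 2.2046
ACRES_PER_HA = 2.4711

rate_lb_per_acre = 10.0
rate_kg_per_ha = rate_lb_per_acre / LB_PER_KG * ACRES_PER_HA
print(f"{rate_kg_per_ha:.1f} kg/ha")  # ~11.2 kg/ha, matching the cited 11.1 kg/ha

field_ha = 65.0  # hypothetical field size
seed_needed_kg = rate_kg_per_ha * field_ha
print(f"{seed_needed_kg:.0f} kg of colonized wheat seed for the field")

# Community shift reported after a single application (Cotty et al. 2007):
atoxigenic_before = 0.01  # 1-2 % atoxigenic strains before treatment
atoxigenic_after = 0.80   # ~80 % atoxigenic strains after treatment
print(f"shift: {atoxigenic_before:.0%} -> {atoxigenic_after:.0%}")
```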

AF36 was first registered for use in the U.S.A. through an Experimental Use Permit in 1996. It received EPA Section 3 Federal Registration for use in Texas and Arizona in 2003 and was labeled for use in California in 2004 (Cotty et al. 2007). It was soon discovered that AF36 treatment could positively affect ratios of atoxigenic to aflatoxin-producing A. flavus strains for multiple years. Evidence suggests that inoculation of fields with multiple atoxigenic strains can lead to more complex and stable fungal communities and provide resistance to re-establishment of strains capable of producing aflatoxins.

Timing the application of atoxigenic strains to coincide with conditions that favor fungal establishment is an important component in suppressing aflatoxin development in crops. Use of atoxigenic strains of A. flavus is an effective tactic for reducing aflatoxin production in peanut, corn and cotton (Degola et al. 2011).

Public sector development and ownership of atoxigenic A. flavus lines has helped ensure that the technology will continue to be available to farmers at a reasonable cost (Cotty et al. 2007). Because of the long-term, area-wide effects of the atoxigenic strain technology (Cotty et al. 2007) and its current use by southern corn farmers throughout affected corn-producing regions, the use of atoxigenic fungi to competitively displace toxigenic strains is another example of a preventative, area-wide IPM tactic.

Corn lines with resistance to A. flavus infection have been identified (Warner et al. 1992; Campbell and White 1995; Scott and Zummo 1998). However, their poor agronomic quality has rendered them of little commercial value (Brown et al. 1999). Two resistant lines have been released by USDA-ARS, in cooperation with the Mississippi Agricultural and Forestry Experiment Station, as sources of resistance to aflatoxin in corn breeding programs. In field tests they have been reliable sources of high levels of resistance (Williams and Windham 2012). Genetic and molecular analysis and mapping suggest multiple mechanisms are involved in the aflatoxin defense systems (Kelley et al. 2012). Work to characterize the proteins which confer resistance (Baker et al. 2009; Brown et al. 2010) and to develop markers that facilitate transfer of the genes coding for them (Brown et al. 2010) is ongoing. Resistance-associated proteins in A. flavus-resistant plants are expected to contribute to the development of aflatoxin-resistant corn lines and aid in the development of other A. flavus-resistant crops (Brown et al. 2010).

Currently, cultural practices and biocontrol (use of competitive atoxigenic fungal strains) are reducing the levels of aflatoxin contamination in previously affected crops. The effort to develop elite, aflatoxin-resistant cultivars is expected to add to the suite of tactics that can be integrated into aflatoxin management programs, and resistant cultivars are eventually expected to further reduce aflatoxin contamination of crops in the southern U.S.A. The currently available tactics, and those under development, are preventative. Growers have adopted the available tactics for aflatoxin prevention on an area-wide basis, and it is expected that they will adopt aflatoxin-resistant cultivars on an area-wide basis as well.

6.3.2 Verticillium Wilt

Verticillium wilt, caused by the fungus Verticillium dahliae, is a destructive disease that damages cotton plantings in irrigated and high-rainfall regions of the South. V. dahliae is a soil-borne pathogen which damages the plant vascular system, resulting in wilting, defoliation, and losses of crop yield and quality (Wang et al. 2008). Some 1.5 million bales of cotton are lost annually to the disease (Bell 1992). While tolerance to the disease is available in certain commercially available cultivars (Cano-Rios and Davis 1981; Wheeler and Woodward 2011), high levels of resistance to the fungus are known in Sea Island and Pima S-7 cultivars (Wang et al. 2008; Bolek et al. 2005). Genetic and molecular techniques are being used to map and isolate the genes conferring resistance so that verticillium wilt can be reduced through either conventional breeding or transgenic techniques (Bolek et al. 2005; Wang et al. 2008). As with aflatoxin resistance, cultivars with high levels of verticillium wilt resistance are eventually expected to greatly reduce damage from this disease. In affected cotton-growing areas, tolerant varieties are currently used preventatively on an area-wide basis, and resistant varieties, when they become available, are expected to be used in the same manner.

6.3.3 Cotton Root Rot

Cotton root rot, caused by the fungus Phymatotrichopsis omnivorum, is another important disease of cotton in the South. This soil-borne pathogen causes plant vascular damage, premature defoliation and loss of yield and quality on certain alkaline soils in Texas, other southwestern states and Mexico. Some 1.5 million acres (607,000 hectares) of cotton in Texas are affected annually and, in spite of farmers’ use of crop rotation and other cultural practices, losses are estimated at $29 million per year in Texas (G. D. Morgan. 2011. Unpublished Report, p. 5).

In 2009, field fungicide screening trials conducted by the Texas A&M AgriLife Extension Service identified an effective fungicide treatment, flutriafol (Isakeit et al. 2012). Field tests were conducted to determine appropriate application methods and use rates. The fungicide effectively controls cotton root rot when applied to the soil near the seed at planting. A Section 18 Emergency Exemption label allowed use of the product on some 275,000 acres (111,000 hectares) of cotton in Texas in 2012. For the first time in 150 years, Texas cotton farmers have an effective, preventative treatment for managing destructive cotton root rot in cotton fields. The development of flutriafol for cotton root rot control by public sector plant protection specialists has resulted in area-wide use of the technology in root rot-prone areas. Public sector plant protection specialists are working to develop the techniques and information needed to allow use of the product on other root rot-prone crops, such as grapes and tree fruits, in the southern U.S.A.

6.4 Herbicide Tolerant Crops

The era of weed management with synthetic herbicides began with the introduction of 2,4-D in the mid-1940s, and herbicide use increased rapidly through the 1960s (Timmons 2005). Herbicide-treated farmland in the U.S.A. increased from 36 million ha (90 million acres) in 1962 to 87 million ha (215 million acres) by the mid-2000s (Timmons 2005; Gianessi and Reigner 2007). The combination of herbicides and tillage made it possible for farmers to control weeds that had not been controlled by tillage alone. Concerns about the possibility of herbicide resistance were first realized when common groundsel became resistant to triazine herbicides in Washington state in 1968 (Ryan 1970; Ross and Lambi 1999; Hager and Sprague 2000).

Prior to the introduction of herbicide-tolerant crops (HTCs), weed control strategies included a pre-plant-incorporated (PPI) or pre-emergence (PRE) herbicide, or both, to prevent weed germination and establishment. These applications were followed by post-emergence (POST) or post-directed (PDS) treatments to control weeds growing after crop emergence (Price et al. 2011). Use of these systems required a higher level of knowledge and skill than has been required since the advent of HTCs. Before HTCs, farmers had to select carefully among a range of herbicide active ingredients and carefully manage application rates and timing. In addition, they had to integrate chemical and non-chemical practices to control weeds without damaging the crop (Mortensen et al. 2012). Prior to the release of glyphosate-resistant soybeans, weed control in soybeans was typically a two-pass system that utilized a PRE herbicide for grass and limited broadleaf weed control, followed by a selective POST herbicide application (Price et al. 2011). Nonselective herbicides such as glyphosate were rarely used for weed control after crop emergence (Duke 2011).

Since its commercial introduction in 1974, glyphosate has become the dominant herbicide worldwide. It is a highly effective, broad-spectrum herbicide, yet it is toxicologically and environmentally safe. It is relatively slow acting and translocates well, allowing it to move throughout the plant before transport systems are affected. Glyphosate is the only herbicide that targets the 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS) enzyme, which is required for production of aromatic amino acids (Duke and Powles 2008; Schönbrunn et al. 2001). It is considered the world’s most important herbicide (Powles 2008). Until recently there were relatively few reports of weedy plant species which had evolved resistance to glyphosate (Powles 2008). In the 1990s, Monsanto considered the risk of evolution of glyphosate-resistant weeds to be very low (Bradshaw et al. 1997; Owen 2011).

In 1996, glyphosate-resistant (GR) soybeans were commercially introduced, soon followed by GR cotton and corn cultivars (Young 2006; Webster and Nichols 2012). All three glyphosate-resistant crops (GRCs) were very popular and widely adopted because of the utility, reliability and ease of application of broad-spectrum glyphosate (Duke and Powles 2008). The low perceived threat of weed resistance to glyphosate was the rationale for releasing the technology without an integrated weed management (IWM) plan to reduce selection pressure on glyphosate and delay resistance development. IWM requirements were neither mandated nor generally promoted by weed scientists (Bonny 2011). Resistance management practices were not viewed as economical and were not readily used by farmers (Webster and Sosnoskie 2010). Farmers cited two reasons for not adopting IWM practices to manage weed resistance to glyphosate: they believed resistance management practices would be futile, and they believed new technologies would be developed to solve resistance problems (Webster and Sosnoskie 2010). Even with the higher cost of GRC seed, the technology simplified and generally lowered the costs associated with weed management (Duke 2011). Farmer use of GR cotton in the U.S.A. grew to over 70 % of the total farmland planted to cotton in less than ten years (Price et al. 2011).

One immediate effect of widespread farmer adoption of GRCs was a significant expansion in the use of glyphosate and a reduction in the use of other herbicide modes of action (Givens et al. 2009a). Twenty percent of U.S. farmland planted to soybeans was treated with glyphosate in 1995; by 2006, 96 % of U.S. soybeans received glyphosate treatments (Bonny 2011). Use of other herbicides declined. Imazethapyr, for example, was used on 44 % of U.S. soybeans in 1995, but on only three percent of U.S. soybean plantings by 2006 (Bonny 2011). The U.S. patent for glyphosate expired in 2000. Afterward, generic glyphosate was marketed, competition was fierce, and glyphosate became significantly less expensive (Bonny 2011). Chemical/seed companies increasingly consolidated, and it became more difficult for farmers to find high-yielding varieties/hybrids which did not include transgenic herbicide-resistant traits (Mortensen et al. 2012). The increasing use of glyphosate in U.S. agriculture is shown in Table 5.4.

Table 5.4 Tons of glyphosate used in US agriculture

Expansion of conservation tillage was one of the significant benefits of the availability of GRCs and cheap, effective glyphosate. Glyphosate’s broad spectrum of activity gave growers the capacity and confidence to eliminate primary tillage and cultivation as weed management tools (Givens et al. 2009b). The ability to use glyphosate in POST applications to control weeds in GRCs facilitated extensive adoption of conservation tillage in several crops, especially cotton. By 2000, more than 44 million ha (109 million acres) of U.S. cropland had been converted to conservation tillage (Sandretto 2001). Price et al. (2011) reported that 46 million ha (114 million acres) of farmland in the U.S.A. were farmed using conservation tillage by 2010.

Conservation tillage has been thought of primarily as a method of reducing soil erosion by wind and water (Le Bissonnais 1990; Baumhardt and Lascano 1996; Truman et al. 2005). However, there are many other benefits including: increased organic matter at the soil surface (Rasmussen and Collins 1991; Reeves 1994, 1997; Truman et al. 2003), increased diversity and numbers of soil organisms (Kemper and Derpsch 1981; Rasmussen and Collins 1991; Bruce et al. 1992; Heisler 1998; Lupwayi et al. 2001; Kladivko 2001; Holland 2002; Riley et al. 2005; Brévault et al. 2007), reduced runoff (Reeves 1994, 1997; Truman et al. 2003; Banerjee et al. 2009), improved water infiltration (Kemper and Derpsch 1981; Bruce et al. 1992; Truman et al. 2003; Banerjee et al. 2009), improved soil surface sediment, improved soil aggregate stability, reduced soil crust formation (Bruce et al. 1992; Banerjee et al. 2009), reduced chemical runoff (Banerjee et al. 2009), improved water availability and water holding capacity (Hudson 1994; Reeves 1994, 1997; Kaspar et al. 2001), improved biological control of insect pests (Stinner and House 1990; Hammond and Stinner 1999; Kromp 1999), increased carbon sequestration (Baker and Saxton 2007) and reduced carbon emissions (Brookes and Barfoot 2006). Because of these numerous benefits, conservation tillage is a fundamental component of agricultural sustainability (Price et al. 2011).

Conservation tillage generally produces greater economic returns and lower production costs than conventional systems (Raper et al. 1994; Smart and Bradford 1999). Some of the savings are in lower fuel costs, reduced labor costs, and lower machinery inputs (Lithourgidis et al. 2006). In southern U.S. cotton production, the costs of no-till and strip-tillage systems were lower than or equal to those of conventional tillage systems (Schwab et al. 2002). Corn and soybean yields tended to be greater under no-till than under conventional tillage in the southern and western regions of the U.S.A., and similar in the central U.S.A.; in the northern U.S.A. and Canada, no-till systems produced lower yields (DeFelice et al. 2006). Economic analyses indicate that conservation tillage systems are not riskier than conventional tillage systems, even in the short term (Baker and Saxton 2007).

The high level of grower adoption of available GRCs, and reliance on glyphosate alone or with very limited use of alternative weed control practices, placed high selection pressure on weeds to evolve resistance to glyphosate and led to the development of highly problematic, glyphosate-resistant weeds (Duke 2011; Bonny 2011). In southern cotton-growing states, the most serious glyphosate-resistant weed threat is Palmer amaranth, Amaranthus palmeri (Heap 2007; Culpepper et al. 2006, 2007). Since the first confirmed case of glyphosate-resistant Palmer amaranth in Georgia in 2005, resistant biotypes have been reported in Alabama, Arkansas, Florida, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee and Texas (Culpepper et al. 2006; Norsworthy et al. 2008; Steckel et al. 2008; Nichols et al. 2009; Dotray et al. 2012). In addition, glyphosate-resistant populations of common waterhemp, Amaranthus rudis, are present in several southern states (Light et al. 2010). As of 2009, glyphosate-resistant Amaranthus species infested 1.2 million hectares (3 million acres) of farmland in the U.S.A. (Heap 2009; Light et al. 2010). Along with resistant Amaranthus spp., glyphosate-resistant populations of Italian ryegrass, Lolium multiflorum, have been identified in Mississippi, Louisiana, Arkansas, and North Carolina (Bond et al. 2012), and glyphosate-resistant populations of horseweed, Conyza canadensis, occur in many southern states. The problem of glyphosate-resistant weeds became severe enough in 2010 to motivate hearings in the U.S. Congress to assess whether additional government oversight was needed to address the problem of herbicide-resistant weeds (US House Committee on Oversight and Government Reform 2010).

In the wake of grower overuse of glyphosate on GRCs, extension and research weed scientists are working to promote broad-based, multi-tactic IWM systems (Mortensen et al. 2012; Bonny 2011; Harrington et al. 2009). They advocate increased research, alternating herbicide modes of action, alternating crops, use of cover crops and judicious use of tillage (Culpepper et al. 2010, 2011; DeVore et al. 2011; Price et al. 2011; Mortensen et al. 2012). Southern farmers are increasingly using residual PRE herbicides (Steckel 2012). The chemical/seed industry response has been to develop and release crops resistant to multiple broad-spectrum herbicides (Carpenter and Gianessi 2010; Feng et al. 2010; Green and Owen 2011; Adler 2011; Duke and Powles 2008; Gerwick 2010; Mortensen et al. 2012). However, the herbicide-resistant crops being developed and released are tolerant to herbicides with modes of action that have been used for decades (Duke 2011). To achieve more sustainable systems, farmers must reduce selection pressure on any single control tactic or herbicide through the use of multiple tactics. As a part of this strategy, they must utilize herbicide programs which rely on products with different modes of action (Powles 2008; Duke and Powles 2009; Dotray et al. 2012; Mortensen et al. 2012).

GR weeds demonstrate the vulnerability of widely used systems that depend on a single weed control technology. That critical fault now threatens the sustainability of conservation tillage (Culpepper et al. 2006; Price et al. 2011). A decline in farmland under conservation tillage is inevitable unless integrated, effective weed control strategies which include crop and herbicide rotation and use of cover crops are quickly developed and deployed (Price et al. 2011). GR weeds are making tillage more desirable as an additional management tool in weed control systems which utilize herbicide-resistant crops (Duke and Powles 2008).

In spite of the “as-needed” nature of foliar glyphosate use on GRCs, the pre-plant decision to plant a GRC, the widespread use of GRCs, and the intensive use of glyphosate on them have many of the characteristics of area-wide, preventative pest management approaches (Duke 2011; Price et al. 2011). Rapid adoption by American farmers of conservation tillage—primarily due to the availability of the effective GRC/glyphosate weed management system—demonstrates the increasingly area-wide nature of modern farming systems (Powles 2008; Price et al. 2011; Bonny 2011; Webster and Nichols 2012).

7 World Agricultural Challenges and Status

7.1 Challenges—World Population Growth

Earth’s human population reached 7 billion persons in 2011 (James 2011). There is an urgent need to increase the world food and fiber supply, as the population is projected to increase by one billion people every 10–12 years through 2050 (Kang 2005). By the mid-21st century, farmers will be challenged to feed and clothe another 4 billion people. The future security of the food supply will depend on science developing new technology and on IPM practitioners integrating it intelligently into production systems which maximize its effectiveness and longevity. The resulting integrated, multi-tactic systems will enable crop producers to grow crops efficiently and sustainably (Christou et al. 2006). Multi-disciplinary systems approaches will be needed (Kang 2005), and local IPM practitioners who can aid farmers in adopting the best IPM tactics for their farms will be essential. Teams of agricultural specialists will be critically important in helping farmers meet the increasing needs of the human population while minimizing environmental degradation.

7.2 Current Status—Genetically Modified Crops

For the U.S., 2012 was the 17th year of commercialization of genetically modified crops. Worldwide, biotech crops were planted on 160 million hectares in 2011, up 12 million hectares (8 %) from 2010. Adoption of the technology increased from 1.7 million hectares in 1996 to 160 million hectares in 2011, making genetically modified crops the most rapidly adopted crop technology in the history of modern agriculture (James 2011).
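A short calculation makes the pace of that adoption concrete. The sketch below uses only the two area figures cited above to compute the implied fold increase and average annual growth rate; it is illustrative arithmetic, not a figure reported by James (2011).

```python
# Implied average annual growth in worldwide biotech crop area, computed
# from the figures cited above (1.7 million ha in 1996 to 160 million ha
# in 2011). Illustrative arithmetic only.

start_mha, end_mha = 1.7, 160.0
years = 2011 - 1996  # 15 years between the two data points

fold_increase = end_mha / start_mha
cagr = (end_mha / start_mha) ** (1 / years) - 1  # compound annual growth rate

print(f"{fold_increase:.0f}-fold increase")  # ~94-fold
print(f"~{cagr:.1%} average annual growth")  # ~35% per year
```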

Global economic gains at the farm level of approximately US$78 billion were generated by genetically modified crops during the last fifteen years. Forty percent of these gains were from reduced production costs (reduced pesticide use, less labor, less tillage) and 60 % from yield gains totaling 276 million tons (James 2011).

From the environmental perspective, by 2010, worldwide reductions in fossil fuel and pesticide use resulting from widespread adoption of genetically modified crops had reduced CO2 emissions by 1.7 billion kg. In addition, increased use of conservation tillage led to an additional 17.6 billion kg of CO2 sequestered in the soil by 2010 (James 2011; Brookes and Barfoot 2012).

7.3 Changes in IPM Systems

During the first 50 years of IPM, tactics were predominantly applied at the individual field level (Brewer and Goodell 2012). Field-based IPM was effective in encouraging IPM adoption, improving pest management and minimizing adverse environmental effects associated with pesticide use. The adoption of ecologically-based cultural practices, biological control, pest scouting and economic thresholds brought about reduced pesticide use, lower risks to human health and less environmental pollution (Harris 2001; Smith et al. 2002; Benedict and Ring 2004; Brewer and Goodell 2012).

IPM systems have evolved since the early 1960s. Ecologically-based systems with as-needed insecticide applications based on scouting and economic thresholds have given way to increasingly preventative systems implemented on a field-by-field basis. These field-based systems have been adopted on such a wide scale that they have, in effect, become area-wide IPM systems. The successes of boll weevil and pink bollworm eradication programs (Personal communication, L. E. Smith, 2012; Allen 2008) and the adoption of pest/herbicide-resistant crops have had large, area-wide impacts and have dramatically changed IPM in field crops. Other authors have recognized widely adopted Bt technology as area-wide IPM (Carrière et al. 2003; Adamczyk and Hubbard 2006; Naranjo 2011; Hutchison et al. 2010). Other tactics, such as the use of seed treatments, disease-resistant varieties, and atoxigenic A. flavus strains for biological control of aflatoxins, exemplify the continuing evolution of U.S. agriculture in the direction of preventative, area-wide IPM systems. Weed control systems also have elements of area-wide impact due to area-wide planting of GRC seed and area-wide, repeated use of glyphosate (Duke 2011; Price et al. 2011). And widespread adoption of transgenic weed control technology has supported area-wide adoption of conservation tillage practices (Powles 2008; Price et al. 2011; Bonny 2011; Webster and Nichols 2012).

Time savings have been one of the benefits of farmer adoption of transgenic, herbicide-tolerant crops (Bonny 2011). Many southern farmers have invested the time saved by farming GRCs into increasing the land they farm. Farms across the region have expanded; farmers have parked or sold plows and the large tractors used to pull them, and have invested in large, efficient sprayers.

7.4 Impact of Changing IPM Systems on Infrastructure Supporting Crop Production

The author’s initial impression that the IPM infrastructure supporting agricultural producers had changed significantly was the result of observing the changes which have occurred in the Lower Rio Grande Valley (LRGV) of Texas. The area is convenient for study because it is relatively small (three counties) and is isolated from other production regions. The author collaborated in this case study with John Norman, retired IPM Agent and current crop consultant with 37 years’ experience in IPM in the LRGV (Personal communication, J. Norman 2012). The case study compared resources available to growers in 1980 with those available in 2012. In 1980, the Texas Agricultural Research and Extension Center and the USDA Kika de la Garza Subtropical Research Center were fully staffed and conducting extensive agricultural research and extension programming, including significant work related to IPM. By 2012, the USDA facility had closed and the land grant Research and Extension Center was operating at reduced strength. In 1980 there were 35–40 local chemical/seed company field men scouting crops and assisting producers—in 2012 there were 12. In 1980 there were about 18 crop consultants working in the LRGV—in 2012 there were five. In 1980 there were thirty or more aerial spraying services—in 2012 there were five.

The LRGV case study suggested that the infrastructure supporting farmers had diminished significantly over the last 30 years. It was the basis for the hypothesis that the change from major-pest-driven, field-specific IPM to increasingly preventative, area-wide IPM has led to a decrease in the resources supporting field-specific IPM across the southern U.S.A.

Information on the numbers of crop consultants was obtained through state regulatory agency licensing records. Data were available for Louisiana and Arkansas. For Louisiana, records of licensed agricultural consultants were available from 2005 to 2011 (CPARD 2012). In 2005 there were 282 licensed consultants in Louisiana; by 2011 there were 183—a 35 % reduction. In Arkansas, similar records were available from the Arkansas State Plant Board (2012). There were 343 licensed agricultural consultants in Arkansas in 2006, and 248 by 2012—a reduction of 28 %. The author conducted a survey of southern state extension entomologists in September 2012. Twenty-eight surveys were sent and 15 were returned. Forty-seven percent of the respondents indicated that fewer consultants were working in their area or state compared with five years earlier, while 53 % said the number of consultants had not changed. None of the respondents indicated that the number of consultants had increased. Averaged across respondents, the number of consultants reported had decreased nine percent in the last five years.

The 2012 CPARD database, a repository of pesticide applicator information from states in the U.S.A., was used to answer the question, “Have crop production system changes affected the numbers of licensed commercial pesticide applicators?” Data for Alabama, Arkansas, Florida, Georgia, Louisiana, Missouri, North Carolina, Oklahoma, South Carolina, Tennessee, Texas and Virginia were available for the period 2005–2011. In 2005 there were 14,703 registered commercial applicators operating in those states. By 2011 there were 13,684—a reduction of 6.9 % in six years (CPARD 2012). Texas data from the Texas Department of Agriculture (2012) allowed a comparison over a longer window of time. In 2000 there were 2,482 licensed applicators in the crop protection category. In 2011 there were 1,745—a 30 % reduction over eleven years (Texas Department of Agriculture 2012).

Florida, Georgia, Louisiana, South Carolina, Tennessee, Texas and Virginia have separate licensing categories for commercial aerial applicators. There were 1,588 aerial applicators licensed in these states in 2005. By 2011 there were 1,413—an 11 % reduction in six years (CPARD 2012). Texas Department of Agriculture (2012) records showed 746 commercial aerial applicators in 2000 and 543 in 2011—a 27 % reduction over eleven years. The 2012 extension survey was further indicative of changes in the numbers of aerial applicators. Forty-seven percent of respondents indicated there were fewer aerial applicators compared with five years earlier; 53 % said the number of aerial applicators was unchanged over the last five years. None of the respondents indicated that the number of aerial applicators had increased. The average of survey respondents’ estimates indicated an 11 % reduction in aerial applicators in the last five years.
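The percentage reductions reported in this and the preceding paragraphs follow from a simple percent-change calculation. The sketch below applies that calculation to the license counts cited in the text; the helper function and labels are illustrative.

```python
# Percent-change arithmetic behind the licensing figures cited above.
# Counts are taken from the text; the helper function is illustrative.

def pct_reduction(before: int, after: int) -> float:
    """Percent decline from 'before' to 'after'."""
    return (before - after) / before * 100

records = {
    "LA consultants 2005-2011": (282, 183),
    "AR consultants 2006-2012": (343, 248),
    "12-state commercial applicators 2005-2011": (14703, 13684),
    "TX crop-protection applicators 2000-2011": (2482, 1745),
    "7-state aerial applicators 2005-2011": (1588, 1413),
    "TX aerial applicators 2000-2011": (746, 543),
}

for label, (before, after) in records.items():
    print(f"{label}: {pct_reduction(before, after):.0f}% reduction")
# Output matches the reductions cited in the text: 35, 28, 7, 30, 11, 27 %.
```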

Extension resources supporting growers are also in decline. Extension survey respondents unanimously reported that there were fewer extension personnel working on cotton than five years earlier. The average reduction in personnel reported in the 2012 survey was 33 % over the last five years. In Texas, the number of IPM Agents and Extension Entomologists has decreased 45 % during the last 20 years (Personal communication, J. Thomas 2012).

8 The Future—Challenges and Consequences

The sustainability of the highly successful technologies which have delivered the impressive benefits documented in this chapter (and many others which were not discussed) depends on our ability to use technologies wisely. History has repeatedly demonstrated our ability to develop powerful pest protection technologies, adopt them rapidly and use them exclusively, with remarkable impacts on pests and farm economies. And history has repeatedly documented our over-use of new technologies, followed by resistance and other problems within a few years. Again and again we have underestimated the impact of selection pressure on pests. In our enthusiasm to embrace new technology, we have often failed to integrate other management practices which might have reduced selection pressure, shortening the effective life of valuable technologies. Failure to integrate tactics has prevented us from developing sustainable systems consisting of broad suites of tactics which would reduce selection pressure on any single tactic. The number of technologies available to manage pests is limited. We can ill afford to continue to overuse them and, in so doing, strongly select for pests which can survive them—resulting in pest resistance and premature failure of the technology. Well-trained and effective public sector plant protection specialists are sorely needed to work with farmers and educate them about the importance of using integrated tactics for managing pests.

The recent failure of systems involving GRCs has forced farmers to partially or completely revert to crop management systems that were in place prior to the introduction of genetically modified crops in 1996. Use of residual herbicides and tillage is increasing in resistant weed-affected areas of the South. As a result, growers and society stand to lose many of the benefits of conservation tillage. Growers who expanded their farms based on the effectiveness of GRC-based systems and the time savings they provided may find that they must now farm less land. They may have to reinvest in large tractors and plows, and they are likely to face economic losses.

The failure of transgenic insect-resistant crop technology may have even more dire consequences. The human and equipment resources which would be needed to allow growers to revert to the field-specific pest management systems in use prior to the introduction of genetically modified crops are not available. Gone are the days when farmers had sufficient numbers of consultants, extension personnel, pesticide applicators (both aerial and ground), aircraft and other resources to conduct field-specific IPM in the manner it was conducted prior to 1996. Colleges are no longer training sufficient numbers of students in field-specific IPM. Academic departments with a crop protection emphasis have evolved and now emphasize molecular and genetic approaches to IPM. Several years would be needed for colleges to hire faculty with field-specific IPM experience and skills and to train the numbers of students farmers would need to transition back to field-specific IPM as it was conducted prior to 1996. In the meantime, losses would mount and the preventative use of foliar insecticides would increase. As has happened with the development of glyphosate-resistant weeds, economic, human health and environmental costs would escalate. In the absence of sufficient numbers of crop protection specialists, and with high commodity prices associated with the increased demand stimulated by a growing world population, the likely farmer response would be to revert to weekly foliar insecticide applications to protect their valuable crops from damaging insect pests. Under this scenario, pest management systems would revert to the preventative spray technology of the 1940s and 1950s, and—reminiscent of the current conservation tillage situation—the advances of the last 60 years would be lost, and agricultural production might stagnate.

Since the boll weevil crossed the Rio Grande, public sector research and extension scientists with USDA and the land grant universities have led the way, developing and testing new pest control technologies and educating farmers. Extension agents and specialists have guided farmers in the adoption of new management strategies and integrated technologies to help them succeed. Eradication programs, transgenic crops and other technologies have greatly improved agriculture, but much more remains to be done. Highly effective, single-tactic IPM technologies have produced great benefits for American agriculture and the public, but they are unsustainable if they are not integrated broadly into systems which reduce selection pressure on pests. Use of diverse pest management tactics—which include ecologically-based IPM and resistance management strategies—is critical to the long-term stewardship of transgenic and other preventative, area-wide technologies. Integration of components and concepts into effective IPM programs has historically been achieved at the local (county) level by public sector research and extension personnel. Multi-tactic integrated systems of this kind are rarely developed or promoted by the chemical/seed industry because they do not produce corporate profits in the short term. Nor are they often conceived or deployed initially by consultants, whose focus is managing pests in farmers’ fields on a week-to-week basis. The work of developing, testing and deploying integrated IPM systems is most often accomplished by public sector agricultural professionals. Without integration of technologies into multi-tactic IPM systems, transgenic and other area-wide technologies can be expected to fail within a few years.

Public sector agricultural research and extension work (developing and demonstrating IPM and other farming technologies, and providing farmers with opportunities to learn from unbiased information sources) is critically important at this point in history. The growing human population, the risk of pest resistance, and the diminished private sector infrastructure supporting farmers highlight the need for highly efficient crop production systems and increased support for farmers. The number of private consultants is driven by grower demand, but government can and should rebuild public sector crop production and crop protection capabilities within USDA and the land grant universities. American agriculture must be highly efficient if it is to keep pace with the world’s increasing demands. It is doubtful that American farmers can achieve and maintain this level of efficiency without robust research and extension programs. The need for public sector research and extension is as great now as at any time in the past—and funding for these critical services has not kept pace.

American agriculture is held in high regard worldwide. Without strong research and extension programs, our ability to produce at present levels and to increase production to provide for the billions of additional people expected in a few short years is in jeopardy. Change—pest resistance, new and improved technologies, etc.—must be expected. Outstanding technologies promoted and adopted with a short-term profit perspective will quickly fail. Without government investment in research and extension programs (USDA and land grant universities), the balanced, unbiased, public-sector voice will become increasingly silent, to the peril of American and southern farmers, and the world’s ever-growing human population.