
9.1 Introduction

Malaria is the most significant parasitic disease of human beings and remains a major cause of morbidity, anemia, and mortality worldwide. Malaria currently accounts for approximately 200 million morbid episodes and 2–3 million deaths each year, estimates that have been increasing over the last three decades [1]. The disease is caused by protozoan organisms of the genus Plasmodium, which invade and replicate within red blood cells (RBCs), a process resulting in the manifestations of disease, including cyclical fevers, anemia, convulsions, and death. The parasite is transmitted from person to person by biting anopheline mosquitoes. There are four malaria species that infect humans: Plasmodium falciparum, Plasmodium vivax, Plasmodium malariae, and Plasmodium ovale. They are distributed in varying degrees throughout the tropical world and in some more temperate areas, wherever ecological and sociological conditions favor sufficient interactions between humans, mosquitoes, and parasites to maintain transmission. It is, however, important to acknowledge that the majority of acute morbidity and mortality is caused by P. falciparum, and that nearly 90% of all cases and fatalities occur in sub-Saharan Africa. Although all persons are at risk for malaria, in many settings the burden of disease is carried primarily by children below the age of 5 and by pregnant women.

Malaria is a treatable infection, and a variety of antimalarial drugs are available. However, drug resistance has become a major problem, and new effective compounds are needed. Prevention of malaria has focused on reduction of man-mosquito contact by application of insecticides, use of bed nets, and environmental management to reduce mosquito-breeding areas. The development of a malaria vaccine is currently a major focus of research. Clearly, additional low-cost and effective means to assist in the prevention and treatment of malaria are needed.

It has long been acknowledged that populations residing in malarious areas generally live under conditions leading to poor nutritional status. The groups at highest risk for the adverse effects of malaria, children and pregnant women, are also most affected by poor nutrition. Although it has been suspected that nutrition might influence susceptibility to infection by the malaria parasite or modify the course of disease, there have been comparatively few efforts to examine such interactions. Among the studies that have been done, early ones suggested that poor nutritional status was actually protective.

However, more recent work indicates that macronutrient and certain forms of micronutrient malnutrition exacerbate malaria morbidity and mortality [2]. Indeed, several field trials of nutritional supplementation have begun to clarify which nutrients can significantly reduce the burden of malarial disease and which might be exacerbative. What is clear is that nutrition strongly influences the disease burden of malaria, and that malaria itself has a profound effect on host nutritional status. This chapter describes the history and current state of knowledge of malaria and nutrition and develops a rational paradigm for the development of targeted, nutrient-based interventions as adjuncts to current methods of malaria treatment and prevention.

9.2 Historical Background

9.2.1 Historical Overview of Malaria

Human beings have long been afflicted with malaria. Medical writings dating from 2700 BC in China and India described what is most likely malaria, and the disease appears in the writings of Homer [3]. Indeed, the Greeks had known the relation of fever to swamps and low-lying water since the 6th century BC, and Roman efforts to drain large areas of swampland were partially motivated by the desire to reduce malaria.

Effective treatment for malaria was not recognized in the West until the early 1600s, after Jesuit priests in Peru observed Amerindians treating cyclical fevers with a tea made from the bark of the cinchona tree. By 1820, the active ingredient from the bark, quinine, had been isolated, and its use for cyclic fevers became widespread. The malaria parasite itself was discovered in the blood of humans by Laveran in 1880. In 1898, Ross, working in India, and Grassi and colleagues, working in Italy, independently described the life cycle of malaria in birds and humans, respectively. Recognition of the role of mosquitoes in transmission led to efforts to reduce mosquito breeding through drainage and environmental control and reduction in human-mosquito contact through clothing, repellents, and bed nets.

9.2.2 Attempts to Eradicate Malaria

As mentioned, malaria control had initially been limited to reduction of mosquito-breeding habitat, use of bed nets, and treatment of cases with quinine. However, two developments changed the strategies used to combat malaria. First, in the 1930s chloroquine, a synthetic antimalarial related to quinine, was developed; it was cheaper, more effective, and had fewer side effects. Second, around 1940, the insecticidal properties of DDT were discovered; it proved to be the most potent insecticide then available, remaining active for several months after application. In 1955, the World Health Organization (WHO), in response to the optimism engendered by the potential of these discoveries, formulated a plan for worldwide malaria eradication. By the late 1950s, one of the most ambitious health campaigns in history had been launched [4].

Early efforts were enormously successful in some countries, such as the United States, Italy, Malta, and Sri Lanka, but met with limited success in others. Unfortunately, implementation was minimal in Africa, where the disease burden was greatest. By the late 1960s, it was clear that eradication operations were untenable in many countries, and problems were further compounded by the emergence of anopheline resistance to insecticides and parasite resistance to antimalarials [5, 6]. A rapid resurgence of malaria and widespread emergence of chloroquine-resistant parasites followed the subsequent decline of the eradication effort.

9.2.3 Modern Approaches to Malaria Control

By the mid-1970s, the focus of combating malaria had shifted to control rather than eradication. There was also a gradual resurgence in malaria research that was fueled by developments in molecular biology and immunology. Despite major advances in understanding the biology of the parasite, diagnostic techniques, vector control methods, and antimalarial drugs, malaria has steadily increased over the last three decades. The current WHO Global Malaria Control strategy is focused on four goals: (1) provide early diagnosis and prompt treatment; (2) plan and implement selective and sustainable preventive measures, including vector control; (3) provide early detection to contain or prevent epidemics; and (4) strengthen local capacities in basic/applied research to permit the regular assessment of a country's malaria situation, in particular the ecological, social, and economic determinants of the disease.

The WHO Roll Back Malaria program, initiated in 1998, and ongoing funding from the Global Fund to Fight AIDS, Tuberculosis and Malaria exemplify the renewed interest in controlling malaria and, determined not to repeat past mistakes, have prioritized malaria in Africa. The WHO goals mentioned are being pursued through a variety of means, including provision of malaria treatment kits to village-based health workers or pharmacists and focused training and capacity building through networking. The most significant developments, however, have been the resurrection and improvement of an older technology, the bed net, and a focused effort on malaria vaccine development.

9.2.3.1 Insecticide-Treated Bed Nets

The use of curtains or fabric to shield sleeping persons from mosquitoes has been practiced since ancient times. Indeed, the use of mosquito nets was strongly advocated after Ross and Grassi demonstrated that mosquitoes did indeed transmit malaria. Although useful if maintained and used properly, bed nets are easily rendered ineffective if holes develop in the netting or if they are improperly hung. In 1950, researchers observed that if bed nets were dipped in residual insecticides, mosquitoes flying against or landing on the net were exposed to lethal doses and died within minutes. This development led to renewed interest in bed nets, specifically insecticide-treated nets (ITNs). Several trials [7–9] demonstrated that ITNs were very effective in reducing morbidity and mortality. Meta-analysis of these trials indicated overall reductions in malaria morbidity of 48% and in mortality of 20–40% [10].

9.2.3.2 Vaccine Development

Rationale for the belief that a malaria vaccine could be developed was based on the clear epidemiological evidence that humans develop protective immunity against malaria when repeatedly exposed to the infection. In addition, the maturation of molecular biology and modern immunology provided novel tools and hope that malaria vaccines were feasible.

Initial studies focused heavily on the sporozoite, the stage inoculated into humans by the mosquito. Emphasis on the sporozoite as an immunological target for protection was motivated by earlier observations that humans could be protected against a challenge infection after exposure to the bites of irradiated malaria-infected mosquitoes [11, 12]. By the early 1980s, researchers at New York University had cloned the gene for the circumsporozoite protein (CSP), the predominant protein on the surface of the sporozoite and the first gene to be cloned from a human malaria parasite. Following demonstration that rodents could be immunized with recombinant CSP, there was considerable optimism that a vaccine would soon be available. Unfortunately, phase II trials in humans, including experimental challenge infections [13], failed to demonstrate adequate protection. However, research continued, and initial results with newer vaccine formulations based on the CSP, such as RTS,S/AS02A, have been promising [14, 15]; they are reviewed elsewhere [16].

By the late 1980s, the attention of malaria vaccine development had begun to shift to the blood stage of the parasite, with the rationale that the erythrocytic phase is responsible for clinical disease and mortality. Further impetus for this approach came from experiments in 1963 [17] demonstrating that infusion of antibody from resistant adults into children suffering acute clinical malaria resulted in rapid clearance of parasitemia. In the late 1980s, Patarroyo and colleagues in Colombia developed a synthetic vaccine known as SPf66, which was based on several different blood-stage antigens shown to be protective against experimental P. falciparum infection of monkeys [18]. SPf66 became the first vaccine to be tested in large-scale field trials. Although initial results of 30–70% efficacy in South America [19] and Tanzania [20] were promising, additional field trials in the Gambia [21], Thailand [22], and Tanzania [23] failed to demonstrate efficacy. Nevertheless, SPf66 demonstrated that a vaccine could be developed that elicits partial protection in some settings. As such, development of blood-stage vaccines has continued, and several candidates are undergoing field testing in Africa and Papua New Guinea [24]; these are reviewed elsewhere [25].

There are currently at least 23 vaccine candidates in various stages of development. These include, but are not limited to, CSP, merozoite surface protein 1 (MSP-1), erythrocyte-binding antigen 175 (EBA-175), apical membrane antigen 1 (AMA-1), the gametocyte antigen Pfs25, and the pre-erythrocytic liver-stage antigen 3 (LSA-3). As insights are gained through ongoing immunological and field studies, it is anticipated that a vaccine with at least partial efficacy will be available within a decade.

9.3 Epidemiology

The starting point for understanding malaria as a disease, and the rationale behind control programs, must be an in-depth understanding of the intricate and often village-specific aspects of ecology, biology, and epidemiology of malaria. Such knowledge is also the basis for understanding nutritional modulation of malaria morbidity and mortality.

9.3.1 Geographic Distribution and Disease Burden

The relative importance of malaria varies greatly in different geographical areas of the world. As mentioned, nearly 90% of life-threatening P. falciparum-related disease continues to be in Africa, with the remaining 10% occurring primarily in Southeast Asia and India, followed by South America [1, 3, 26].

Both incidence and seriousness of disease define its public health significance. Plasmodium falciparum causes a variety of pathophysiological and potentially lethal conditions such as cerebral malaria and severe malarial anemia. Additional complications include splenomegaly and renal and pulmonary pathology. In much of tropical Africa, malaria is the leading disease burden on the population. In Ghana, for example, P. falciparum has long accounted for nearly 10% of all healthy life-years lost [27, 28], making it the greatest single health threat to the population. In many cases, malaria is a contributing factor to death even though the final cause may be attributed to another disease, such as diarrhea or pneumonia. Community-based intervention studies [29, 30] indicated that malaria may account for nearly half of under-5 mortality. When malaria is controlled, nonmalaria mortality also decreases. A report on the global burden of disease indicated that malaria was responsible for 18% of all childhood deaths in Africa, where 94% of all malaria deaths occur [31].

Plasmodium vivax, although not as overtly pathogenic as P. falciparum, continues to be a major cause of morbidity in parts of China, India, Southeast Asia, Polynesia, and South America. It is a significant cause of morbidity and anemia, and these effects may indirectly contribute to all-cause mortality. Likewise, P. malariae is not as pathogenic as P. falciparum, although an unusual and highly lethal nephrosis can occur. Plasmodium ovale is a relatively uncommon infection, and its contribution to overall malaria morbidity is not substantial.

In addition to the proximate effects of morbidity and death, malaria results in chronic effects of persistent anemia, long-term disability, poor educational and work performance, and the cost of coping with illness and death within the family and community. In many cases, severe anemia from malaria requires blood transfusion, which in turn has been closely associated with transmission of HIV and hepatitis B virus. Last, malaria during pregnancy, particularly the first pregnancy, places the woman at special risk for severe anemia and death from malaria [32]. In addition, prenatal malaria can result in intrauterine growth retardation, low birth weight (LBW), premature delivery, fetal death, and miscarriage.

9.3.2 Life Cycle of the Malaria Parasite

The complex life cycle of Plasmodium spp. is shown in Fig. 9.1. The parasite undergoes two developmental stages in the human host, involving asexual reproduction, and three in the mosquito, resulting from sexual reproduction. The parasite is transmitted to humans as a sporozoite in the saliva of an infected female anopheline mosquito taking a blood meal. Sporozoites enter the venous circulation through the capillary beds and invade liver cells within minutes. Over the next 5–15 days, each sporozoite replicates to produce about 40,000 daughter parasites, called merozoites. In the case of P. vivax and P. ovale, dormant forms known as hypnozoites sometimes develop in the liver cells, remaining viable for up to 50 years [33]. When released from liver cells, merozoites invade erythrocytes. These intraerythrocytic merozoites differentiate into trophozoites, which consume the intracellular hemoglobin and mature into schizonts that give rise to 6–24 daughter merozoites. The red cell eventually ruptures, releasing these merozoites to invade new erythrocytes and perpetuate the cycle.

Fig. 9.1. Life cycle of Plasmodium.

Each erythrocytic cycle requires 48 hours for P. falciparum, P. vivax, and P. ovale but 72 hours for P. malariae [34, 35]. The different species of parasites also have different preferences for certain erythrocytes. Plasmodium vivax and P. ovale prefer young red cells known as reticulocytes, whereas P. malariae prefers older red cells.

Plasmodium falciparum has a marginal preference for younger cells but will readily infect all erythrocytes [34, 35]. In addition, P. falciparum-infected erythrocytes develop knob-like structures on their surface, which, along with several protein structures, lead to adherence of infected erythrocytes to the postcapillary venous endothelium [6, 36]. This results in sequestration of infected erythrocytes in the capillary beds, thereby preventing circulation of infected erythrocytes to the spleen, which is a major site of parasite removal. Moreover, rupture of infected erythrocytes occurs in an environment of tightly packed cells that facilitates reinvasion by daughter merozoites. For these reasons, as well as others, P. falciparum achieves the highest levels of erythrocytic infection and results in the greatest degree of pathology.

Within the erythrocyte, some merozoites differentiate into sexual forms known as macrogametocytes (female) and microgametocytes (male). When taken up in a blood meal, gametocytes emerge in the mosquito gut and begin sexual reproduction, leading to sporogonic development. The male and female gametes fuse, providing for genetic recombination, to form a zygote that transforms to the ookinete, which penetrates the gut wall and attaches to the epithelium. Over the next 7–12 days, the oocyst enlarges, forming about 10,000 sporozoites, and ruptures into the coelomic cavity. The sporozoites then migrate to the salivary glands and are ready to be transmitted to the human host, thus completing the cycle.

The actual time required for replication in the mosquito depends on the species of parasite, mosquito, and particularly on the ambient temperature. Under optimal conditions at 30°C, P. falciparum requires 9 days, but at 20°C it takes 23 days [6]. With an average life span for most anophelines less than 3 weeks, ambient temperature is critical to transmission.
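One widely used way to formalize this temperature dependence is a degree-day model of the Moshkovsky/Detinova type. The sketch below uses constants from the general malariology literature (roughly 111 degree-days above a 16°C threshold for P. falciparum), which are assumptions not stated in this chapter; they yield durations broadly comparable to the 9- and 23-day figures quoted above.

```python
# Hedged sketch of a classical degree-day approximation for sporogony.
# The constants (~111 degree-days above 16 C for P. falciparum) come
# from the general literature, not from this chapter.

def sporogony_days(temp_c: float, degree_days: float = 111.0,
                   t_min: float = 16.0) -> float:
    """Approximate sporogonic development time at a constant temperature."""
    if temp_c <= t_min:
        return float("inf")  # development never completes below threshold
    return degree_days / (temp_c - t_min)

print(round(sporogony_days(30.0), 1))  # ~7.9 days at 30 C
print(round(sporogony_days(20.0), 1))  # ~27.8 days at 20 C
```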

Once infected, the mosquito transmits sporozoites with each blood meal [6]. Fifty to sixty species of the genus Anopheles are known to be capable of transmitting malaria to humans. There is great variation in different species in their host feeding preferences and ability to support malaria replication. Anopheles gambiae is the most important vector in Africa and among the most efficient for malaria.

9.3.3 Classification Schemes of Malaria Endemicity

The frequency of inoculation of the sporozoite into the human host is, of course, a primary determinant of infection. This is measured by determining the biting rate of anopheline mosquitoes and the proportion of mosquitoes that are infected with the parasite. The entomological inoculation rate (EIR) is the product of these numbers and represents the frequency at which the typical person is directly exposed to infection. The EIR ranges from a few infective bites per year to over 300 in high-transmission areas.
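To make the arithmetic concrete, a minimal sketch of the EIR calculation follows; the function name and example values are illustrative, not taken from the chapter.

```python
# Minimal sketch: EIR as the product of the human biting rate and the
# sporozoite rate, scaled to a year. Example values are invented.

def annual_eir(biting_rate_per_night: float, sporozoite_rate: float) -> float:
    """Infective bites per person per year.

    biting_rate_per_night -- anopheline bites per person per night
    sporozoite_rate       -- proportion of mosquitoes carrying sporozoites
    """
    return biting_rate_per_night * sporozoite_rate * 365.0

# Ten bites per night with 3% of mosquitoes infected gives a
# high-transmission EIR of about 110 infective bites per year.
print(annual_eir(10.0, 0.03))  # -> 109.5
```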

Malaria endemicity has been classified into four broad categories based on the percentages of children, ages 1–10, with enlarged spleens and the prevalence of circulating malaria parasites [37, 38]: (1) Holoendemic areas refer to areas with a constantly high EIR and with parasite prevalence and spleen rates exceeding 75%. Immunity typically develops rapidly, such that by age 10, acute malaria morbidity and mortality are low. (2) Hyperendemic areas include regions with regular, often seasonal transmission, with spleen and parasite rates from 50% to 75%. Immunity is less developed, and adults also experience significant illness. (3) Mesoendemic areas experience fairly regular malaria transmission but at much lower levels, with spleen and parasite rates ranging from 10% to 50%. Such areas are prone to occasional epidemics involving those with little immunity, which may result in fairly high mortality. (4) Hypoendemic areas have limited malaria transmission. The population will have little or no immunity, and severe malaria epidemics involving all age groups can develop.
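These four categories amount to a simple threshold rule. The hypothetical helper below encodes the spleen/parasite-rate cutoffs quoted above; it is a convenience sketch, not an established epidemiological tool.

```python
# Hypothetical classifier for the four-category endemicity scheme,
# using the childhood (ages 1-10) spleen or parasite rate in percent.

def classify_endemicity(rate_percent: float) -> str:
    if rate_percent > 75:
        return "holoendemic"   # constantly high EIR, rates > 75%
    elif rate_percent >= 50:
        return "hyperendemic"  # regular, often seasonal transmission
    elif rate_percent >= 10:
        return "mesoendemic"   # regular but lower-level transmission
    else:
        return "hypoendemic"   # limited transmission, little immunity

print(classify_endemicity(82))  # -> holoendemic
```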

9.3.4 Clinical Disease

Disease processes in malaria result from the erythrocytic cycle of invasion and hemolysis. Along with the liberation of the merozoites, hemolysis releases several pyrogenic compounds from infected red cells; these stimulate the clinical paroxysms of fever and chills. In some patients, the erythrocytic cycle becomes synchronized so that virtually all merozoites are released every 48 or 72 hours. This accounts for the periodicity of these symptoms and is the basis for the older classification of malaria into tertian (every third day) and quartan (every fourth day) disease. However, clinical manifestations of infection, particularly in children, can range from the totally asymptomatic to severe disease and rapid death. Children may present with symptoms that include listlessness, pyrexia, abdominal cramping, difficulty breathing, mental disorientation, or convulsions.

The distinction between infection and disease is particularly important in malaria. Infection with the malaria parasite does not necessarily result in disease. In highly endemic areas, childhood prevalence rates exceed 50%, but few will have acute symptoms. Typically, in hyperendemic areas the prevalence and density of infection with P. falciparum peak in early childhood and decline thereafter, with density receding prior to prevalence. The density of infection at which symptoms appear is also greater in early life and becomes lower with age [39]. This indicates that separate immunological effectors modulate infection, parasite density, and febrile responses to the parasites.

It is also of interest that, even in areas of perennial transmission, the incidence of malarial disease is substantially greater with seasonal increases in transmission, suggesting that recent inoculation more frequently leads to disease [40]. Indeed, it appears that many individuals who become symptomatic have encountered a new parasite variant [41].

In the case of P. vivax and P. ovale, reactivation of dormant hypnozoites can result in long-term relapses. Although P. falciparum and P. malariae do not produce hypnozoites, untreated or inadequately treated infections may result in persistent low-grade parasitemia, leading to recrudescent disease [42].

9.3.5 Epidemiology of Severe Malaria

As the intensity of transmission increases, the proportion of severe malaria cases and mortality is concentrated in lower age groups. In high-transmission areas, severe malaria and death are generally restricted to those below 5 years of age, and most clinical disease occurs below 15 years of age [43, 44]. Severe malaria can include a variety of life-threatening manifestations but usually refers to either severe anemia or conditions with cerebral or neurological involvement [29, 43, 45, 46]. These are distinct clinical sequelae with different epidemiological patterns that are altered by transmission intensity.

In high-transmission areas (e.g., EIR > 100/year), severe anemia predominates in the youngest children from 6 to 24 months of age, whereas cerebral malaria is more common in older children from 36 to 48 months of age. In lower transmission areas (e.g., EIR < 10), severe disease is predominantly cerebral malaria in older children. Interestingly, despite the different manifestations, the number of children experiencing severe disease is similar [44, 47, 48].

These age-specific effects may be related to a combination of increased exposure as well as age-dependent maturation of the immune system [49]. In addition, constancy of transmission appears to influence severe disease. Areas of intense perennial transmission tend toward severe anemia, with cerebral malaria more frequent in areas of seasonal transmission [50]. There is also space-time clustering of severe malaria [51, 52], suggesting that severe malaria occurs in localized microepidemics.

9.3.6 Diagnosis and Drug Treatment of Malaria

9.3.6.1 Diagnosis

Diagnosis of malaria infection is generally made by microscopic detection of the parasite in a Giemsa-stained finger-prick blood sample smeared on a glass slide. A thick smear is usually viewed to detect and enumerate parasites, and a thin smear is used for speciation. The ability to detect low-grade infections depends on the number of fields examined and the experience of the technician viewing the slide. Although technologically simple, microscope-based diagnosis requires a trained technician to obtain reliable results.
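For concreteness, the sketch below shows the thick-film quantification convention used in many laboratories, in which parasites are counted against 200 leukocytes and converted to a density assuming 8,000 white cells per microliter of blood; this convention is drawn from general practice and is not stated in the chapter.

```python
# Sketch of the common thick-smear density convention (an assumption
# from general practice, not from this chapter): parasites are counted
# against 200 white blood cells and scaled by an assumed WBC count.

def parasites_per_ul(parasites_counted: int, wbc_counted: int = 200,
                     assumed_wbc_per_ul: int = 8000) -> float:
    """Estimate parasite density (parasites/uL) from a thick smear."""
    return parasites_counted * assumed_wbc_per_ul / wbc_counted

print(parasites_per_ul(50))  # 50 parasites per 200 WBC -> 2000.0/uL
```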

Additional, although often more expensive, techniques have been developed that allow detection of the parasite by unskilled personnel [53]. Several of these tests, such as the ParaSight-F test, detect circulating malaria antigen in the blood and are available as a dipstick [54]. Others, such as the QBC column, concentrate parasites into a thin band in a centrifuged capillary tube, which is then viewed under a microscope.

9.3.6.2 Drug Treatments

In non-Western countries, the history of herbal drug use goes back hundreds of years before any effective drugs were available in Europe. In China, extracts of the qing hao plant (Artemisia annua), the source of qing-hao-su (artemisinin), have been in use for centuries, and pre-Columbian Peruvian healers used tea made from the bark of the cinchona tree. The active ingredient of the cinchona bark, quinine, was extracted in 1820 in Europe. Since then, several drugs have been developed and are in use throughout the world. While a brief synopsis of several drugs is provided here, refer to an informative review for extensive coverage of this topic [55].

Chloroquine, a synthetic antimalarial related to quinine, was developed in the late 1930s and is active against asexual stages of all human malaria except drug-resistant strains of P. falciparum. It is well tolerated, even when taken chronically for long periods, and is safe for young children and pregnant women. It has a prolonged half-life of 33 days, which facilitates its use as a prophylactic drug. The most important side effect is pruritus, reported almost exclusively, but frequently, by black Africans. Owing to its low toxicity and cost and high effectiveness, chloroquine remains the first-line drug of choice in many areas. The additional drugs described next offer various advantages and disadvantages relative to chloroquine.

Quinine continues to be an important therapeutic agent, especially for drug-resistant P. falciparum malaria. However, it has a relatively short half-life and causes more adverse reactions than chloroquine.

Artesunate and related compounds, derived from qing-hao-su, have demonstrated a high degree of effectiveness with relatively low rates of adverse reactions. The mechanism of action involves rapid oxidative damage to the parasite and is distinct from that of quinine and related compounds. A rapidly absorbed suppository formulation of artesunate for young children is also available. Although these drugs are relatively new, resistance has been observed in some patients.

Pyrimethamine and proguanil are effective drugs, but resistance generally occurs relatively quickly.

Sulfadoxine-pyrimethamine (SP), a folate antagonist originally developed for efficacy against chloroquine-resistant P. falciparum, is widely used in areas of drug resistance. However, prophylactic use has been associated with severe and often fatal adverse reactions. Increasing reports of SP-resistant P. falciparum are further limiting its usefulness [26].

Mefloquine, another quinine derivative, is widely used for prophylaxis in areas with chloroquine resistance. It is more expensive and has more adverse reactions than chloroquine.

Primaquine is unique in that it has activity against all forms of the malaria parasite occurring in humans. Because of its effect on sporozoites and the hepatic forms, it can be used to prevent the establishment of infection in the liver. However, primaquine also causes hemolysis in those with glucose-6-phosphate dehydrogenase (G6PD) deficiency, a common trait in Africans.

Note that recent developments in chemotherapeutic agents for malaria involve multiple-drug therapy. This is believed to provide optimal therapy because drugs can be combined that affect different aspects of parasite metabolism; it also limits the emergence of drug-resistant parasites. Combinations currently in use include artesunate-mefloquine and artesunate-sulfadoxine-pyrimethamine.

9.3.7 Host-Parasite Interactions and Immunity

Malaria is a very intense stimulator of the human immune system. Many biological defense systems are activated in response to malaria infection: the reticuloendothelial system, with greatly enhanced phagocytic activity in the spleen, lymph nodes, and liver to remove altered RBCs and other debris; an intense activation of humoral defenses, such that humans can develop several grams per liter of immunoglobulin directed against malaria; and a broad range of cellular immune and cytokine cascade responses. Some of these responses are protective, and others may contribute to the pathology.

Despite this intense immune activation, or perhaps because of it, humans do not develop complete immunity. Over a period of years of exposure to the parasite, antidisease immunity may develop in persons living in endemic areas. This is characterized by asymptomatic infections with relatively low levels of circulating parasites. It is currently thought that antibody, and possibly cell-mediated, responses are responsible for this. Individuals may also develop anti-infection immunity that results in a decreased incidence of new infections. In this case, cell-mediated immunity against the stages that infect liver cells is thought to be the primary effector. It is noteworthy that exposure-induced immunity is species, and often strain, specific and fades within months if exposure to the parasite is interrupted. While immunity is poorly understood, as mentioned above, antibodies to certain parasite proteins have been associated with protection against blood-stage parasites. Moreover, antibodies against toxic factors produced by the parasite, such as the glycosylphosphatidylinositol moiety, have been associated with protection from clinical illness.

As mentioned, P. falciparum-infected red cells undergo complex changes, including the expression of protein knobs on the red cell surface [6, 36]. These proteins are highly variable in their antigenic expression and are the products of highly variable “var” genes. Up to 150 different var genes have been identified, which occupy up to 5% of the malaria genome [6]. This commitment to antigenic diversity of the knobs reflects the key role they play in the survival of P. falciparum. The var gene proteins also confer adherence to noninfected red cells, a phenomenon known as rosetting. These processes facilitate sequestration of the parasite in the microvasculature and may facilitate maturation by keeping parasites in their preferred less-oxygenated blood.

Proteins expressed on vascular endothelium serve as receptors for attachment of infected erythrocytes. Immune mediators such as tumor necrosis factor-α (TNF-α) released during infection may activate some of these. Much work has gone into efforts to relate specific phenotypic parasite characteristics (such as the capacity to stimulate release of cytokines including TNF-α, the production of nitric oxide, and the cytoadherence and rosetting phenotypes and their underlying genotypes) to the diversity of clinical events provoked by P. falciparum that lead to severe disease [56–59]. Although the rosetting phenotype has been reported to be associated with cerebral malaria, no clear association has so far been found between any particular cytoadherence phenotype and cerebral malaria or other forms of severe malaria [44, 60]. There is also evidence that parasites stimulate TNF-α, but thus far no consistent pattern has been discerned. What is clear is that great antigenic diversity exists within any given P. falciparum population, within humans or within vectors [61–63].

These factors underscore the challenges, discussed above, of developing a vaccine against malaria. The relative importance of antibody- and cell-mediated immunity is unclear, but both appear to be involved. Moreover, as mentioned, malaria can vary some of its surface antigens, and different strains of the parasite carry different antigens.

It is anticipated that an effective vaccine against malaria would need to induce both humoral and cellular immunity, possibly against multiple stages of the parasite.

9.3.8 Modulating Factors of Malaria Morbidity and Mortality

The formation of small towns arising in concert with agricultural development in much of Africa, Asia, and South America has facilitated malaria transmission by concentrating populations in relatively confined areas near water supplies. The development of small dams and irrigation schemes has also added to anopheline breeding. In addition, the agricultural use of pesticides has been a factor in the development and spread of insecticide-resistant vectors.

From the biomedical perspective, the heavy morbidity and mortality caused by malaria have led to the selection of multiple genetic traits that confer some degree of protection. Most of these involve the red cell and include structural variants in hemoglobin. Variants in the β-globin chain of hemoglobin include hemoglobin S (sickle-cell trait) and hemoglobins C and E. Altered α- and β-chain production results in the α- and β-thalassemias. Other traits include erythrocyte G6PD deficiency, red cell cytoskeletal abnormalities such as ovalocytosis, and loss of red cell membrane proteins such as the Duffy blood group factor [64–66]. In some cases, selection for polymorphisms of immunological effectors, such as TNF-α, has occurred [63]. Most of these genetic polymorphisms are somewhat deleterious to overall health but provide a survival advantage in malarious areas.

9.4 Effects of Nutrition on Malaria

9.4.1 Early Perceptions of the Impact of Nutrition on Malaria

Prior to 1950, it was widely accepted that malnutrition led to greater susceptibility to malaria. The Indian Famine Commission in 1898 reported that malaria was more frequent and fatal in those suffering from poor diets [67]. Likewise, historical accounts from the late 19th and early 20th centuries indicated that famines and poor economic conditions in north India and Sri Lanka tended to precipitate malaria epidemics, and that the poorer classes experienced greater mortality [68, 69]. Several reports from 1920 to 1940 in Corsica [70], Algeria [70], Vietnam [71], Turkey [72], and Ghana [73] stated that malaria was more frequent and severe among groups and individuals who were undernourished. In 1954, Garnham, a prominent malariologist of his time, stated that in Africa the clinical effects and mortality of malaria were more severe when superimposed on malnourished children [74]. Still, there were reports to the contrary. In 1897, an Italian industrialist unsuccessfully attempted to exploit the fertile but malaria-infested Pontine marshes near Rome by protecting farmers with generous provisions of food and quinine [75]. Some claimed there was no association between nutritional status and malaria morbidity [76], whereas others recounted that increases in food consumption following famines actually exacerbated malaria [77].

Unfortunately, most of these reports were based on qualitative clinical and epidemiological observations or even anecdotal information. Little, if any, quantitative data or methodological information was published to substantiate the conclusions. By the early 1950s, however, clinicians and malariologists began making attempts to quantify more carefully the interactions between nutrition and malaria. Three studies from Ghana and Nigeria published between 1954 and 1971 [78–80] were particularly influential and strongly promoted the notion that malnutrition was in fact protective for malaria. This idea was reinforced by a series of studies by Murray et al. [81–84] from 1975 to 1980 on refeeding and malaria in famine victims from Niger and Sudan. Animal studies appeared to support these reported malaria-suppressive effects of a poor diet, leading to the perception that malnourished children were less susceptible to malaria infection, morbidity, and mortality [85–88]. The following section of the chapter critically assesses these studies, reviews more recent data from humans, and reexamines data from the animal studies.

9.4.2 Malnutrition and Malaria: Synergism or Antagonism?

9.4.2.1 Malnourished Individuals and Malaria Morbidity and Mortality

Several studies have examined the association between malnutrition, usually protein-energy malnutrition (PEM), and malaria morbidity and mortality. These are presented in Tables 9.1 and 9.2. Some studies were based on clinic outpatients, whereas others were hospital admissions or community-based cross-sectional studies. Most were case-control studies, but some were longitudinal surveillance of cohorts. Studies with results consistent with the idea that malnutrition exacerbates malaria are labeled as synergistic, whereas those indicating malnutrition protects against malaria are referred to as antagonistic.

Table 9.1 Interaction between malnutrition and Plasmodium falciparum (Pf): clinic and hospital-based studies
9.4.2.1.1 Clinic-Based Studies

Among the first studies of nutrition and malaria was a large-scale clinic-based study in Uganda [89, 90] (Table 9.1), which concluded that no association existed between nutritional status and malaria mortality. This study was less than ideal, as nutritional status was based on qualitative and subjective indicators such as thin or pale hair or being “very” thin, malaria diagnosis was based only on spleen enlargement, and mortality risk was estimated by the presence or absence of sibling mortality. Another smaller clinic-based study in India reported progressively increasing parasite density with improved nutritional status [91], suggesting that malnutrition was protective. However, two additional studies, one in Brazil [92] and one done on Soviet army personnel [93], reported greater frequency or more severe malaria in those who were malnourished.

9.4.2.1.2 Early Hospital-Based Studies of Severe Malaria

As mentioned, early hospital-based studies strongly influenced current perceptions of malaria and malnutrition. First, in 1954, an autopsy report by Edington [78] indicated that four Ghanaian children who died from cerebral malaria were all well nourished. Other accounts from South Africa in 1960 [94] reported that malaria was rarely seen in malnourished children. This was followed by another qualitative report from Edington (1967) [95] in Nigeria stating that children dying of cerebral malaria were usually well nourished, and that cerebral malaria was rare in children suffering from kwashiorkor. Case-control studies from Nigeria by Hendrickse in 1967 [79] and 1971 [80] concluded that children suffering from malaria were less likely to be malnourished or have convulsions. Hendrickse also reaffirmed the apparent protection owing to kwashiorkor [79]. A subsequent autopsy report from Nigeria of 25 malnourished children indicated that only 2 had died of malaria [96].

9.4.2.1.3 Critical Analysis of the Early Hospital-Based Studies

Although these reports appeared convincing of a protective effect of malnutrition on malaria, several characteristics of the studies weaken such conclusions. Most important, the study populations consisted of clinic cases or malnourished children, and comparisons were made between those with or without malaria. In the absence of healthy community controls, one can only conclude that malaria is less exacerbated by malnutrition than other conditions. The overall prevalence of malnutrition among malaria cases in these studies was remarkably high, suggesting possible synergy with malnutrition. A more informative analysis would have included the relationship between the degree of malnutrition among malaria cases and the risk of malaria mortality. Unfortunately, such analyses were not done, partly because relatively nonstandardized qualitative descriptors rather than quantitative assessments were used to categorize malnutrition. In some studies, incomplete analyses of the existing data were made. For example, Hendrickse reported decreased risk of convulsions in malnourished malaria patients, but the same decline in convulsion risk was also observed for malnourished nonmalaria patients [97]. Additional caveats reside in the lack of information on socioeconomic status (SES) or residence of the cases. Well-nourished cases may tend to come from urban areas and have less acquired immunity, whereas malnourished cases could come from outlying higher-transmission areas, leading to greater immunity. Indeed, Edington reported that the children suffering from cerebral malaria tended to have less hookworm [95], an observation possibly related to SES. Last, the conclusions based on patients suffering from kwashiorkor may not be generalizable to overall malnutrition because aflatoxins, a causative agent of kwashiorkor, are toxic to malaria parasites in vitro and in vivo [97, 98].

9.4.2.1.4 Modern Hospital-Based Studies of Severe Malaria

In the last 15 years, additional studies have been completed on the relationship between malnutrition and malaria. These have tended to be larger studies that more carefully documented nutritional status against reference standards using height, weight, and age and that evaluated malnutrition as a risk factor for malaria mortality among hospital admissions. Studies (Table 9.1) conducted in Madagascar [99, 100], Nigeria [101], Senegal [102], Chad [103], the Gambia [104], and Ghana [105] indicated that malnourished patients are 1.3–3.5 times more likely to die or suffer permanent neurological sequelae than normally nourished malaria patients. In addition, the study from the Gambia indicated that malaria patients typically weighed 350 g less than healthy control children [104]. A recent study from Kenya among hospital admissions for severe malarial anemia indicated that wasting was strongly associated with this outcome [106].

It is also important to note that, in many of these studies, as Hendrickse had observed, malnourished hospital patients were less likely to be suffering from malaria as compared to other conditions. This is consistent with the notion that malaria may be exacerbated by malnutrition to a lesser degree than other diseases. Indeed, additional analyses in the Gambia study [104] confirmed the greater impact of malnutrition on risk of death from diarrhea and pneumonia.

In contrast to these reports, a study of 60 hospital patients in India indicated that parasitemia tended to increase with improving nutritional status; however, no data on clinical outcomes were presented [107]. A more recent hospital-based case-control study from Thailand indicated that adult patients suffering from cerebral malaria were more likely to be well nourished than those admitted for nonsevere malaria [108]. Unfortunately, because the controls were not systematically selected from the same geographical area as the cases, differences in parasite strains or other factors could not be excluded as contributors to this finding. Alternatively, one interpretation may be that the relationship between malnutrition and malaria is modulated by the local epidemiology of malaria and perhaps of malnutrition. As such, malnutrition could possibly be protective in some areas and conditions and exacerbative in others. However, considering the available data in toto, it would appear that malnutrition considerably increases the risk of severe malaria morbidity and death.

It should also be appreciated that certain age-dependent relationships may underlie some inconsistencies linking malnutrition and malaria in humans. Severe clinical malaria is more frequent in young children (i.e., < 3 years) and tends toward severe malaria anemia. In contrast, for older children (i.e., > 3 years), cerebral malaria is more frequently encountered. Likewise, various indicators of nutritional status have age-dependent distributions. Wasting (low weight-for-height) is seen more frequently in young children than in older children, for whom stunting (low height-for-age) is generally more prevalent. Thus, if such age-based differences were not taken into account, a clinic or hospital-based study of severe malaria cases could erroneously conclude that wasting was protective for cerebral malaria.
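Because these anthropometric indicators recur throughout this section, a brief sketch of how they are computed may help: each is a z-score of the observed measurement against a reference median and standard deviation, with a conventional cutoff of -2 SD defining wasting or stunting. The reference values below are placeholders, not actual WHO tables.

```python
# Hedged sketch of an anthropometric z-score. The reference median and
# SD here are invented placeholders, not real WHO reference values.

def z_score(observed: float, ref_median: float, ref_sd: float) -> float:
    """Standard deviation score relative to a reference population."""
    return (observed - ref_median) / ref_sd

# A weight-for-height z-score below -2 is conventionally "wasted";
# a height-for-age z-score below -2 is conventionally "stunted".
whz = z_score(observed=9.0, ref_median=11.0, ref_sd=0.9)
print(round(whz, 2), "wasted" if whz < -2 else "not wasted")
```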

Additional study of the epidemiology of hospital-based severe malaria would be useful to examine confounding factors such as age, SES, immunity, and disease ecology.

9.4.2.1.5 Cross-Sectional Studies of Malariometric Indicators

Several cross-sectional surveys (Table 9.2) also favor a synergistic relationship between malnutrition and malaria. Studies carried out in Malawi [109], Zambia [110], Papua New Guinea [111], Sudan [112], Chad [113], Zaire [114], and Tanzania [115] indicated greater risk for infection [109, 110, 113, 114], malaria illness [112, 115], or spleen enlargement [111] among malnourished children. A study in Colombia indicated that malnourished children had lower antimalaria antibody levels [116]. This could be interpreted as a synergistic effect if malnutrition suppresses antibody response to malaria or possibly as antagonistic if malnutrition protects against infection, thereby precluding formation of parasite-specific antibody. In Tanzania, there was no effect of nutritional status on antiparasite antibody levels [117], and a study in Burkina Faso [118] found no association between malaria infection and nutritional status.

Several additional reports indicated significant associations between malnutrition and malaria. A survey of more than 4,000 children in northern Ghana indicated that malnutrition was associated with clinical malaria (odds ratio [OR] 1.67; 95% confidence interval [CI] 1.10–2.50) [119]. Similarly, a survey in Tanzania, where more than 50% of the infants are infected with P. falciparum, indicated that stunting was associated with malaria [120], and in Kenya stunted preschool children were about 1.5–2.5 times more likely to be infected and to have high-density parasitemia, clinical malaria, or severe malarial anemia [121].

9.4.2.1.6 Longitudinal Cohort Studies and Effects of Nutrition on Drug-Resistant Malaria

Last, longitudinal cohort studies in Tanzania [122], Vanuatu [123], and Congo [124] indicated that malnutrition predisposes children to malaria illness. Another longitudinal study from the Gambia also indicated that stunted, but not underweight or wasted, preschool children were at increased risk for malaria episodes [125]. In contrast, an earlier report from the Gambia [126] indicated little effect of malnutrition on malaria attacks. Another report from Papua New Guinea suggested that stunted children may be more resistant to malaria attacks [127], although this protection was not seen in underweight children. In this study, it is of interest that the stunted children also exhibited increased immune responses to malaria antigens, suggesting increased exposure, whereas the wasted children had suppressed responses. One additional semilongitudinal study (multiple cross-sectional cohort surveys) in Zaire [128] reported no association between nutritional status and malaria mortality. It should, however, be noted that no causes of death were actually established, and the authors assumed that deaths were due to malaria. Another longitudinal follow-up study of daily active case detection from Burkina Faso found no association between malnutrition and clinical malaria [129]. The highly sensitive surveillance methodology used in this study would be biased toward detecting differences in the onset of a malaria episode and not necessarily the subsequent severity. Still, it may be of interest that the cross-sectional study from Burkina Faso mentioned also found no association between malaria and malnutrition.

An interesting study was published that estimated the proportion of global malaria morbidity and mortality attributable to malnutrition [130]. For malaria morbidity, the authors pooled longitudinal follow-up data from two field studies [123, 126] and interpreted the effects in the context of the global prevalence of underweight children. A similar process was performed for malaria deaths as determined by verbal autopsy from three studies [131–133]. The authors estimated that 8.2% of global malaria morbidity and 57% of malaria mortality were due to children being underweight. Despite the caveats of limited data, the pooling of morbidity data for P. falciparum and P. vivax, and reliance on verbal autopsies, the results provide a general estimate of the overall impact of malnutrition on malaria.
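Estimates of this kind rest on the population attributable fraction. The sketch below applies Levin's formula with invented inputs to show the shape of the calculation; it is not a reconstruction of the cited analysis [130].

```python
# Levin's population attributable fraction (PAF). The prevalence and
# relative-risk inputs below are illustrative, not data from ref [130].

def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Fraction of cases in a population attributable to an exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# e.g., if 30% of children were underweight and underweight carried a
# hypothetical RR of 1.3 for clinical malaria:
print(round(attributable_fraction(0.30, 1.3), 3))  # -> 0.083 (~8%)
```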

Additional evidence of an exacerbative role of malnutrition on malaria can be seen in several longitudinal drug resistance studies. Malnourished Rwandan refugees had slower parasite clearance, higher parasite titers at presentation, and more severe drug resistance [134]. Likewise, in the Solomon Islands, malnourished children were 3.6 times more likely to have drug-resistant malaria [135, 136]. More recent work in Malawi showed that severely malnourished children were three times more likely to experience treatment failures than those better nourished [137].

9.4.2.2 Studies in Famine Relief

The Murray family examined the presence of malaria in famine victims during nutritional rehabilitation in a series of studies. During the Sahelian famine in Niger, victims were admitted to a hospital for refeeding, and it was observed that P. falciparum malaria developed in many of these individuals within a few days [82], often resulting in cerebral pathology. Because there was no transmission of malaria at the hospital, it was believed that feeding had provided essential nutrients for sequestered parasites, leading to recrudescent infection [81, 82]. In another study, famine victims were given either grain or milk for rehabilitation, and it was observed that those given grain were more likely to experience recrudescent cerebral malaria [83]. These studies suggested that the quality as well as the quantity of the diet is an important determinant of malaria morbidity. A 1945 report of the Bengal Famine Commission similarly stated that refeeding tended to precipitate malaria disease in those carrying low-grade infections [77]. The Murrays concluded that the interaction between poor diet and malaria is part of an ecological balance between humans and malaria, which they interpreted as a beneficial aspect of malnutrition.

9.4.2.3 Studies in Animals

A variety of animal experiments have also contributed to the idea that PEM reduces malaria morbidity. Early work showed that monkeys maintained on a low-protein diet did indeed have lower parasitemia [138–141]. However, the animals either were unable to clear the infection, resulting in multiple recrudescences [139], or parasitemia appeared earlier and lasted longer [141]. Immune responses were also suppressed [141]. Among monkeys suffering from cerebral malaria, protein-deprived animals had fewer parasitized erythrocytes in the cerebral capillaries and did not develop the disrupted endothelium seen in normally fed monkeys. Still, cerebral and pulmonary edema was present in all animals irrespective of dietary regimen [142].

The primate experiments were complemented by a variety of informative data from studies of rodent malaria. A comprehensive series of investigations by Ramakrishnan et al. in the early 1950s indicated that malaria parasitemia was less severe in protein-deprived rats, and that survival was enhanced [143–147]. They also showed that methionine and para-aminobenzoic acid (PABA) promoted infection in starving rats [143]. Importantly, it was also clear that protein-deprived animals were unable to clear the infection [147], and that protein restriction in young rats exacerbated malaria parasitemia and mortality [144]. Moreover, parasite densities were higher and more lethal during relapses in protein-deprived animals [147]. Last, starved animals experienced strong relapse infections when food was given [144].

Additional studies by Edirisinghe et al. [87, 148–151] documented that acute and chronic protein deprivation depressed peak parasitemia by more than 75% and prevented death. However, as shown in previous work, the animals were unable to clear the infection [150], and antibodies preventing parasite growth did not adequately develop. Elegant work by Fern et al. [151] then demonstrated that readdition of threonine to a low-protein diet restored susceptibility, and that this effect was enhanced by valine, isoleucine, and methionine. However, phenylalanine, tyrosine, lysine, histidine, and tryptophan did not appear to have this promoting effect.

Subsequent studies in rats and mice confirmed that low-protein diets suppressed parasitemia [152–156] and inhibited cell-mediated immunity [152, 153], and that these effects were reversible by addition of PABA [153]. Effects on mortality were, however, less consistent. In some cases, low-protein diets suppressed parasitemia, but mortality was higher, albeit delayed [157]. Addition of threonine and methionine together to the low-protein diet decreased mortality [157], although methionine or threonine alone had no effect when added to the deficient diet. Others observed no effect on mortality in moderately malnourished mice, but increased death was seen in severely malnourished animals [158]. Protein-deficient diets were, however, consistently protective against rodent cerebral malaria [155, 156, 159].

9.4.2.4 Synthesis of Data Concerning Effects of Protein-Energy Malnutrition on Malaria

The considerable body of data from humans and animals, though complex, provides ample evidence for some conclusions regarding the interaction between malnutrition and malaria. Although it has frequently been stated that malnutrition is protective for malaria [85–88], more recent studies, along with careful reexamination of the older human studies and of the animal data, indicate that malnutrition does indeed exacerbate malaria and considerably increases the likelihood of mortality.

The human hospital-based studies suggesting a protective effect of malnutrition are inconclusive owing to the many methodological and design issues mentioned. Similarly, the animal-based data, often cited as supportive evidence that malnutrition is protective, are not so clear when carefully examined. Closer inspection reveals that although parasitemia tends to be lower in poorly fed animals, they are unable to clear the infection, and immune responses to the parasite are suppressed. This leads to more chronic infections and more severe relapses. Also, the observation that malnutrition is particularly deleterious for malaria in younger animals is important. In cerebral malaria, poor diets appear protective for animals, but human data indicate that malnourished children are more likely to die from cerebral malaria. This discrepancy may be rooted in differences in the etiology of cerebral pathology in animals and humans.

The famine or starvation situation is, however, a special case, and it is consistently observed in humans and animals that refeeding an infected starved host reactivates low-grade infections. The implication is that antimalarial measures should be included during nutritional rehabilitation of famine victims.

9.4.3 Impact of Malaria on Growth

Although malnutrition appears to exacerbate malaria, it is also true that malaria itself results in growth failure and is a contributing factor to malnutrition. Several reports from Africa have noted transient weight loss in young African children following a malaria attack [160–162]. In the Gambia, two longitudinal cohort studies indicated that P. falciparum malaria was significantly related to lower weight gain and growth faltering, particularly in children below 36 months of age [163, 164]. Other studies have attempted to compare weights in different communities with different levels of malaria. In El Salvador, no differences in weight or height were observed between areas with low or high transmission of P. vivax [165]. In contrast, researchers in Papua New Guinea found a significantly greater proportion of malnourished individuals in villages with high P. falciparum transmission intensity compared to control villages with lower transmission [111], and longitudinal follow-up of preschool children in Kenya indicated that P. falciparum infection increased the risk of being underweight or stunted, particularly in children up to 2 years of age [166]. An interesting recent longitudinal study of male adolescents and adults in Kenya indicated that malaria infection and the related production of TNF-α, a proinflammatory cytokine, were associated with reduced weight gain [167].

Studies of chemoprophylaxis provide more definitive evidence for the effects of malaria on growth. A 1-year placebo-controlled trial of pyrimethamine prophylaxis in 176 Ghanaian 7-year-old children resulted in a nonsignificant 78-g excess weight gain in those taking pyrimethamine [168]. In a small 2-year study in the Gambia of 52 children randomized at birth to chloroquine prophylaxis or placebo [169], unprotected children weighed significantly less between 6 and 24 months of age. Another 2-year study in Nigeria, which followed 198 children given chloroquine prophylaxis or placebo shortly after birth, found that protected children tended to have greater height, weight, mid-upper arm circumference (MUAC), and mean serum albumin levels, although the differences were relatively small [170]. Interestingly, only one child given chloroquine died from malnutrition, compared with six such deaths in the control group.

Moderate effects on nutritional status have also been observed following other malaria interventions. The Garki Project (1980) examined the effects of prolonged and large-scale insecticide spraying and chemoprophylaxis on nutritional parameters in a northern Nigerian community and observed small but significant changes in weight gain and MUAC [171]. However, similar interventions in Tanzania found no effect in 2- to 18-month-old children after malaria control [172]. Snow et al. in Kenya followed 1,500 children 1–11 months of age, half of whom slept under insecticide-treated bed nets (ITNs) [173]; 25% fewer children were classified as malnourished among those using ITNs, and MUACs were also greater. A similar study of bed nets and treatment on demand in Tanzania found that protected children gained more weight, with the strongest effects seen in those less than 18 months of age [174]. Likewise, bed nets were associated with a 1.2% increase in percent lean body mass in Kenyan children less than 13 years of age [175]. A study from Vietnam, in which multiple annual anthropometric surveys were performed alongside integrated malaria control measures over a 4- to 5-year period, revealed an annual increase in height-for-age Z-score of 0.11–0.14, an effect extending into preadolescence [176].
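For reference, the height-for-age Z-score (HAZ) used in such surveys expresses a child's height relative to a reference population of the same age and sex; in its simplest form (reference growth standards in practice apply more refined, skewness-adjusted methods),

$$\mathrm{HAZ} = \frac{\text{observed height} - \text{median reference height for age and sex}}{\text{SD of the reference population}},$$

so the reported gain of 0.11–0.14 Z-score per year corresponds to roughly one tenth of a standard deviation of the reference distribution annually.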

9.4.4 Influence of Specific Nutrients on Malaria Morbidity

9.4.4.1 Iron

Iron deficiency affects nearly 2 billion people worldwide, resulting in over 500 million cases of anemia [177]. Additional sequelae include poor neurological development, lower work capacity, LBW, and increased maternal and infant mortality [178, 179]. The burden of both iron deficiency and malaria falls primarily on preschool children and pregnant women [180, 181], and iron supplementation of these groups is the primary means of prevention and treatment of anemia. Multiple studies have attempted to evaluate the benefit of iron supplementation in malaria-endemic areas [84, 182–202]. Some of these studies reported that iron supplementation increased the risk of developing or reactivating malarial illness [84, 182, 185], whereas others reported no significant adverse effects [190, 193, 203]. To resolve this issue, a systematic review and meta-analysis of controlled trials of iron supplementation was completed [204]. A search produced 13 trials [84, 182–193], totaling 5,230 subjects, from which data were pooled to obtain composite effects of iron supplements on malaria attack rates, parasite prevalence, parasite density, prevalence of enlarged spleens, hemoglobin levels, and anemia.

Iron supplementation resulted in a nonsignificant 9% (relative risk [RR] 1.09, 95% CI 0.92–1.30, n = 8) increase in the risk of a malaria attack. End-of-trial cross-sectional data indicated a 17% (RR 1.17, 95% CI 1.08–1.25, n = 13) greater risk of infection in those given iron. Qualitative assessment of parasite density suggested a tendency toward higher levels in those receiving iron. However, hemoglobin levels improved by 1.2 g/dL (95% CI 1.2–1.3, n = 11) following iron supplementation, and the risk of anemia was reduced by 50% (RR 0.50, 95% CI 0.45–0.54, n = 4). Thus, there were substantial benefits as well.
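As a reminder of how such pooled estimates are read, the relative risk for a single two-arm trial, and its confidence interval computed on the log scale, take the standard form (a generic sketch, not the specific pooling or weighting scheme used in the meta-analysis [204]):

$$\mathrm{RR} = \frac{a/n_1}{b/n_2}, \qquad \mathrm{SE}\!\left(\ln \mathrm{RR}\right) = \sqrt{\frac{1}{a} - \frac{1}{n_1} + \frac{1}{b} - \frac{1}{n_2}}, \qquad 95\%\ \mathrm{CI} = \exp\!\left(\ln \mathrm{RR} \pm 1.96\,\mathrm{SE}\right),$$

where $a$ of $n_1$ supplemented subjects and $b$ of $n_2$ controls experience the outcome. A CI that includes 1.0, as for the malaria attack estimate above (0.92–1.30), indicates a nonsignificant effect, whereas the infection-prevalence CI (1.08–1.25) excludes 1.0 and is therefore significant.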

Although the meta-analysis indicated exacerbative effects of iron on malaria and reductions in anemia, it was unable to address the issue of severe malaria. However, a large-scale trial in Tanzanian preschool children indicated that daily oral iron supplementation at current WHO recommended doses resulted in an 11% (1–23%, p = .03) increased risk of all-cause hospitalization and a nonsignificant 15% (−7 to 41%, p = .19) increase in all-cause mortality [205]. These effects were not observed in a similar trial in Nepal, where malaria transmission is relatively low [206], suggesting a substantial role for P. falciparum in the excess morbidity and mortality. It should be noted that the children received folic acid along with the iron, and this may also have influenced the results. However, these data are consistent with other recent studies indicating adverse effects of iron supplementation on P. vivax [207, 208] and a report from Kenya indicating that iron-deficient children were less likely to experience P. falciparum morbidity [209].

The mechanisms for this effect on malaria remain incompletely understood. However, recent work has indicated immunomodulatory effects of non-transferrin-bound iron (NTBI) [210]. Specifically, NTBI increased the expression of the adhesion molecules intercellular adhesion molecule 1 (ICAM-1), vascular cell adhesion molecule 1 (VCAM-1), and E-selectin, indicating a role in the adhesion-mediated processes implicated in the pathology of P. falciparum malaria. In addition, it has been demonstrated that anemic erythrocytes containing elevated zinc protoporphyrin IX can inhibit parasite replication [211]; iron supplementation may therefore create conditions more favorable for parasite growth. Additional pathways may involve the recently discovered iron-regulatory peptide hepcidin [212].

As a body of evidence, current data indicate that routine iron supplementation of preschool children in malarious areas can result in an increased risk of severe illness and death. As such, current guidelines for universal iron supplementation may need to be revisited. If iron supplementation is undertaken, co-implementation with antimalarial activities is strongly warranted. The role of iron-fortified foods or food additives in improving iron status should also be evaluated for potential adverse effects on malaria.

9.4.4.2 Zinc

Zinc is essential for normal immune function [213] and has been shown to reduce the incidence of diarrhea and pneumonia [214]. Indeed, zinc is required for a variety of lymphocyte functions implicated in resistance to malaria, including production of immunoglobulin G (IgG), interferon-γ (IFN-γ), and TNF-α, as well as for the microbicidal activity of macrophages [213, 215].

Cross-sectional studies among school-aged children in Papua New Guinea [216] and pregnant women in Malawi [217] have reported inverse associations between measures of zinc status and P. falciparum parasitemia. In addition, a placebo-controlled trial of zinc supplementation in preschool children in the Gambia documented a 30% reduction in health center attendance owing to P. falciparum [218], although this was not statistically significant. Last, mildly zinc-deficient mice experienced mortality from a normally nonlethal strain of P. yoelii [219], and zinc supplements decreased markers of oxidative stress during infection with P. berghei [220].

A placebo-controlled trial of zinc supplementation of preschool children in Papua New Guinea provided additional evidence for the role of zinc in malaria. The study indicated that zinc supplementation reduced the frequency of health center attendance owing to P. falciparum malaria by 38% (95% CI 3–60, p = .037). Moreover, a 69% (95% CI 25–87, p = .009) reduction was observed for malaria episodes accompanied by high levels of parasitemia (i.e., ≥100,000 parasites/μL), suggesting that zinc may preferentially protect against more severe episodes [221]. A subsequent trial of daily zinc supplementation of preschool children in Burkina Faso indicated no protective effect against P. falciparum [222]. However, the active daily case detection and rapid treatment, coupled with high levels of asymptomatic parasitemia, may have masked an effect of zinc on malaria attacks, particularly more severe ones; as such, these results are difficult to compare with those from the clinic-based surveillance in the trials from Papua New Guinea and the Gambia. A subsequent randomized controlled trial of daily zinc and iron supplementation in the Peruvian Amazon indicated that zinc, both alone and with iron, decreased P. vivax attacks by more than 50% among children younger than 5 years [208]. Most recently, a large-scale trial in Tanzania evaluated the impact of daily zinc supplementation on mortality in preschool children and indicated a tendency for zinc supplements to reduce malaria mortality [223].

Given the initial reports of the effects of routine zinc supplementation on malaria, a multicenter trial was carried out to examine whether zinc given as an adjunct to treatment would reduce malaria morbidity. The results of this clinic-based study indicated that although a 4-day course of zinc improved plasma zinc status, there was no impact on parasitological or clinical outcomes for uncomplicated P. falciparum malaria [224]. It may be that the mechanism by which zinc affects malaria requires more than a few days of supplementation; if so, an effect on the treatment of acute malaria would not be expected.

While additional information is needed to define the geographic regions and conditions of malaria transmission in which zinc might be effective, evidence is growing for a protective role of zinc against malaria. Indeed, it has been estimated from existing data that zinc deficiency contributes about 20% of the global burden of malaria morbidity and mortality [130]. This provides considerable motivation for continued assessment of the impact of zinc on malaria and for exploring the potential of improved zinc status as part of integrated efforts to reduce the global burden of malaria.
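Burden estimates of this kind are typically derived from a population attributable fraction. As a generic sketch (the precise inputs used in [130] are not reproduced here), with $P$ the prevalence of zinc deficiency and $\mathrm{RR}$ the relative risk of malaria morbidity or mortality in deficient relative to zinc-replete individuals,

$$\mathrm{PAF} = \frac{P\,(\mathrm{RR} - 1)}{1 + P\,(\mathrm{RR} - 1)}.$$

With purely illustrative numbers, a deficiency prevalence of 40% combined with a relative risk of about 1.6 would yield a PAF near 20%.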

9.4.4.3 Vitamin A

Vitamin A is essential for normal immune function [225], and several studies suggest it plays a role in potentiating resistance to malaria. Early work indicated that vitamin A deficiency exacerbated malaria in ducks but had little effect in chicks [226]. Further studies in vitamin A-deficient rats and mice showed an increased susceptibility to malaria that was readily reversed by supplementation [227, 228]. In addition, a genetic locus that includes cellular retinol-binding protein 1 has been shown to modulate malaria mortality and parasitemia in mice [229]. In vitro, addition of free retinol to P. falciparum cultures reduced parasite replication in two studies [230, 231], although this was not seen in a third [232].

In humans, cross-sectional studies in preschool children and adults have reported inverse associations between plasma vitamin A levels and P. falciparum parasitemia [233–238]. However, this observation could be due to the acute-phase response following infection, which tends to lower plasma vitamin A levels. One study observed that low baseline vitamin A status was associated with increased risk of parasitemia, but confounding by age could not be excluded [239]. Subsequently, a substudy of a vitamin A trial in preschool children in Ghana reported no statistically significant effects of vitamin A on P. falciparum morbidity or mortality [240], although longitudinal surveillance of slide-confirmed malaria morbidity was not conducted [241]. Another trial of vitamin A supplementation in children reported no effect on malaria parasitemia [242]. In contrast, additional evidence for a role of vitamin A in malaria infection includes the selective depletion of plasma-borne provitamin A carotenoids during acute malaria attacks [233, 243] and some indication of modest protection against malaria morbidity from consumption of carotenoid-rich red palm oil [244].

The most definitive study to date of the effects of vitamin A on malaria was completed in Papua New Guinea [245]. The study, a double-blind, placebo-controlled trial, indicated that vitamin A supplementation reduced the frequency of P. falciparum episodes by 30% (95% CI 14–43, p = .0013) among preschool children. At the end of the study, geometric mean parasite density was 36% lower in the vitamin A than the placebo group, and the proportion of children with spleen enlargement was reduced by 11%, although neither difference was significant. However, it was clear that children aged 12–36 months benefited most, having 35% (95% CI 14–50, p = .0023) fewer malaria attacks, 26% fewer enlarged spleens, and a 68% reduction in parasite density.
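Expressed as a ratio rather than a percent reduction (simple arithmetic on the figures above, not an additional analysis from the trial report),

$$\text{reduction} = (1 - \mathrm{RR}) \times 100\%,$$

so the 30% (95% CI 14–43%) reduction in episode frequency corresponds to a risk ratio of 0.70 with a 95% CI of approximately 0.57–0.86.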

These data have subsequently been supported by reports indicating that vitamin A supplements can offset the adverse effect of malaria on childhood growth [242] and may protect pregnant women against malaria [246]. There is also preliminary evidence that vitamin A may reduce the severity of malaria when given as an adjunct to treatment [247]. While the mechanisms underlying the effects of vitamin A on malaria are not fully characterized, one study indicated that vitamin A may upregulate CD36 expression, thereby facilitating phagocytosis of parasitized erythrocytes while limiting pathological processes related to cerebral malaria [248].

Overall, the evidence indicates that vitamin A is important in resistance to malaria in humans, and quantitative estimates of disease burden suggest that vitamin A deficiency may account for about 20% of global malaria morbidity and mortality [130]. As with zinc, this provides impetus to improve vitamin A status as part of overall efforts to reduce the disease burden of malaria.

9.4.4.4 B Vitamins

9.4.4.4.1 Folate.

Folate is essential for DNA and protein synthesis and for cellular growth, including cell-mediated immunity, and it is crucial for erythrocyte production. As such, folate supplements are often given in conjunction with iron to treat or prevent anemia; for this reason, and for its role in preventing neural tube defects, folate supplementation along with iron is part of routine prenatal care in most countries. Given that parasite folate metabolism is the target of several antimalarial drugs, such as SP, the interactions between host folate status and malaria are of interest. Indeed, the observation that some parasite strains can access exogenous folate, thereby bypassing drug-impaired folate synthesis, places greater importance on host folate status [249].

Initial studies indicated that folate deficiency enhanced susceptibility to avian malaria [250]. In contrast, primate malaria species were unable to survive in severely folate-deficient rhesus monkeys [251]. This protective effect of folate deficiency differs from observations in humans: low infection rates have been reported in pregnant women consuming a diet high in folate [252], and greater infection rates have been reported in those suffering from megaloblastic anemia [253], a sign of folate deficiency. However, these findings may be influenced by the observations that malaria itself may induce folate deficiency [253–255] and that folate utilization is dysregulated in infected erythrocytes [256].

A trial of prophylactic folate supplementation in preschool children in the Gambia [257] showed no adverse effects on malaria. A trial of folate supplements in pregnant women [258] likewise showed no adverse effect on parasitemia, even though reticulocyte counts did increase. Two separate trials reported that development of P. falciparum in vivo was not affected by folate supplements given with pyrimethamine [258, 259]; in one case, the folate dose given was sufficient to reverse the side effects of high-dose pyrimethamine [260]. These studies led to the notion that folate supplementation affected neither malaria disease outcome nor the efficacy of pyrimethamine treatment, and the routine use of folate supplements in malarious areas was accordingly advocated [253, 261].

However, in the mid-1990s a randomized controlled trial from Africa indicated greater treatment failure for SP when folate supplements were given [262]. This finding, particularly the delayed clearance of parasitemia, was confirmed in two trials in Kenya and Zambia [263, 264] in which folic acid was given along with SP. However, it has also been demonstrated that intermittent prophylactic therapy with SP for pregnant women was not adversely affected by consumption of folic acid supplements [265].

As a whole, these data indicate that routine use of folic acid supplements in malarious areas may be contraindicated if SP is the primary treatment regimen, and that therapeutic supplementation with folic acid to treat malaria-related anemia may best be delayed until parasite clearance is achieved. Formal review of the existing evidence and subsequent recommendations are warranted.

9.4.4.4.2 Riboflavin.

Riboflavin status also influences malaria morbidity. The relationship appears to be one of antagonism such that deficiency confers a degree of protection. Reports from Papua New Guinea [266, 267] indicated that riboflavin-deficient infants were less likely to be infected. Similar observations were made in India [268, 269] and the Gambia [270]. In India, clinical malaria was also less severe in riboflavin-deficient individuals [271]. However, a more recent study in Gabon using high-performance liquid chromatography to assess riboflavin status, rather than the traditional erythrocyte glutathione reductase assay, did not find an association between riboflavin status and parasitemia [272]. A study in which both methods are used would be useful, as would assessment of the effects of riboflavin on malaria in the context of a randomized trial.

Because riboflavin (as FAD) is an essential cofactor for glutathione reductase, the enzyme that regenerates the reduced glutathione consumed by antioxidative enzymes such as glutathione peroxidase, it has been proposed that riboflavin deficiency promotes an oxidative environment that leads to destruction of the parasite. Indeed, lipid peroxidation was increased in riboflavin-deficient children with malaria infection [273], and reduced glutathione peroxidase activity was observed in red cells from riboflavin-deficient infected individuals [274]. Consistent with this notion is the observation that low glutathione reductase activity persists in some populations residing in malarious areas despite adequate riboflavin intake [275], suggesting that enzyme variants with reduced activity confer resistance to malaria.

There is evidence for other mechanisms as well. Plasmodium falciparum-infected erythrocytes have an increased requirement for riboflavin [276]. Moreover, riboflavin analogs inhibit the growth of parasites in vitro [277, 278] and in vivo in experimental murine malaria [278]. In some cases, these activities also correlated with reduced activity of glutathione reductase [277]. Riboflavin-deficient rats are also more resistant to malaria [279]. However, riboflavin-deficient chicks are more susceptible [280]. Interestingly, additional work in rats suggested that the protective effect may be mediated by mechanisms that do not involve increased susceptibility of erythrocytes to oxidative damage, hemolysis, or erythropoiesis [281].

Although human and animal data suggest that riboflavin deficiency is protective, in vitro studies suggested that high doses of riboflavin suppress parasite growth by preventing the oxidation of hemoglobin needed by the parasite [282]. Thus, high-dose riboflavin therapy could possibly be of benefit. Interestingly, the protective and exacerbative effects of riboflavin appear to arise from the same antioxidant properties acting at different sites. This again emphasizes the complex pathways through which nutrients may influence malaria parasites and host morbidity.

9.4.4.4.3 Thiamine.

Reports from Thailand indicated that poor thiamine status is associated with greater risk of both severe and uncomplicated clinical malaria [283]. This is consistent with early experiments in which thiamine-deficient ducks were more susceptible to avian malaria [284]. There are also reports that acute cerebellar ataxia following malaria can be treated with thiamine [285].

It is also of interest that the malaria parasite has been shown to synthesize thiamine [286] from proximal precursors. The significance of this vis-à-vis host thiamine status remains to be studied. In addition, it has been demonstrated that another B vitamin, pyridoxine [287], can be synthesized by P. falciparum. These pathways present potential new targets for the development of novel chemotherapeutics against malaria.

9.4.4.5 Vitamin E and Other Antioxidants

Several reports indicated that deficiencies of vitamin E and other antioxidants tend to protect against malaria infection [288]. As discussed, the absence of antioxidants makes the parasite more vulnerable to damage by oxygen radicals produced by the immune system. In humans, it was initially proposed that the exacerbation of cerebral malaria following refeeding of famine victims with grain [83] was caused by the vitamin E content of the grain, which would be absent from the diets of those who received milk [289, 290].

The exacerbative effect of vitamin E on malaria was first described by Godfrey in 1957, who demonstrated that the antimalarial effects of cod-liver oil in mice were reversible by giving vitamin E [291]. Multiple studies in rodent systems confirmed the protective effects of vitamin E deficiency [292–296] and the ability of vitamin E to abrogate the protective effects of prooxidant compounds, such as peroxidizable fatty acids, on malaria [293–296]. Interestingly, however, vitamin E deficiency was also protective against murine cerebral malaria [297], in which oxidative damage plays a significant role. Studies of avian malaria in the duck also observed a protective effect of vitamin E deficiency [298].

With regard to selenium, there are no human studies addressing its role in malaria. The few animal studies that have been published indicated that selenium has little role in modulating rodent malaria [299, 300], although selenium-deficient ducks were more susceptible to avian malaria [298].

Vitamin C also has been studied in animals, but little has been done in humans. Experiments in monkeys indicated that vitamin C deficiency exacerbated malaria [301]. In mice, however, results have been mixed. Godfrey [291] indicated that large doses of vitamin C, as with vitamin E, could abrogate the protective effect of cod-liver oil. This was not the case, however, when lower doses were used in conjunction with vitamin E-deficient mice [288], and vitamin C supplements did not modify the course of parasitemia in normal mice.

These data indicated that, although antioxidant vitamins may have an exacerbative role under some conditions, it is difficult to predict the effect of a nutrient on malaria based on its antioxidant properties alone. For example, a study from Uganda indicated that the antioxidant lycopene was associated with more rapid parasite clearance following treatment [302]. In general, data are lacking for the effects of antioxidant nutrients on malaria morbidity, pathology, and mortality in humans. Additional studies in this area are clearly warranted.

9.5 Conclusions and Recommendations

Malaria remains a very significant public health problem throughout the tropical world. The future success of malaria control lies in the ability to implement multiple, effective interventions that are technologically and economically sustainable. The current focus on early detection and treatment, insecticide-treated bed nets, and vector control through environmental management provides useful tools. There is also reason for optimism that a malaria vaccine with at least partial efficacy will be available for endemic areas within the coming decade.

This chapter elucidated the strong role that nutrition plays in modulating malaria morbidity and mortality and the potential of nutrient-based interventions in combating malaria. Indeed, given the clearly deleterious effects of poor nutritional status on malaria mortality, general improvements in dietary intake through improved childhood nutrition and economic development are likely to have a very large impact on reducing the disease burden of malaria. The observation that selective supplementation with vitamin A or zinc can substantially lower malaria attack rates suggests that targeted nutrient-based interventions can serve as useful adjuncts to malaria-control programs. At US $0.12 for a 1-year supply [303], vitamin A supplementation would rank among the more cost-effective interventions for malaria [304]. Moreover, both vitamin A and zinc supplementation have been demonstrated in several settings to substantially reduce morbidity from other infectious diseases [305–307].

The effects of other nutrients also require examination, as indicated by the reported predisposition to severe and uncomplicated malaria associated with thiamine deficiency and by the possible modulating effects of carotenoids, unsaturated fatty acids, and other nutrients. In general, the low cost, high safety, and potential efficacy of targeted nutritional supplementation or fortification suggest that a rational approach to the development of such interventions might prove useful for prevention or as adjunctive therapy for P. falciparum malaria. Other benefits may also be gained; for example, nutrient supplementation may mitigate the delay in acquired immunity associated with bed nets [308] and chemoprophylaxis [180].

Although there is reason for concern over the exacerbative effects of some nutrients on malaria morbidity, notably iron and possibly folate and certain antioxidants, this should be taken as a challenge to scientists and public health professionals to develop novel approaches that secure the benefits of key nutrients while mitigating any adverse effects. Integrated approaches to malaria control and nutritional improvement should be adopted.

In addition, more detailed investigations should be undertaken to clarify the effects of specific micronutrients, as well as macronutrients, on malaria morbidity, pathology, immunity, and mortality. Although these should focus primarily on P. falciparum in Africa, other geographic areas and species should not be neglected; for example, investigations of the role of nutrition in P. vivax malaria in Asia and Latin America are likely to prove informative. Specifically, there is a need for well-designed longitudinal and clinic-based studies to determine the mechanistic basis of how nutrients influence malaria. Additional public health issues include the differing nutritional requirements of adults and children with respect to malaria and the specific physiology of malnutrition and malaria in severely malnourished individuals, such as those encountered under famine conditions. It would also be useful to understand how malaria affects dietary intake and dietary patterns, and the food beliefs surrounding malaria illness.

Last, the nutritional modulation of malaria morbidity and mortality highlights the complex nature of resistance to malaria. It is clear that different nutrients, such as vitamin A, zinc, and iron, selectively modify different aspects of malaria immunity and pathology. Study of these effects and their underlying mechanisms may yield important insight into host–parasite interactions, possibly leading to new therapies or vaccines.