Antimalarial drug resistance
Nicholas J. White
Review Series | 10.1172/JCI21682
1Faculty of Tropical Medicine, Mahidol University, Bangkok, Thailand. 2Centre for Vaccinology and Tropical Medicine, Churchill Hospital, Oxford, United Kingdom.
Address correspondence to: Nicholas J. White, Faculty of Tropical Medicine, Mahidol University, 420/6 Rajvithi Road, Bangkok 10400, Thailand. Phone: 662-354-9172; Fax: 662-354-9169; E-mail: nickw@tropmedres.ac.
Published April 15, 2004
Malaria, the most prevalent and most pernicious parasitic disease of humans, is estimated to kill between one and two million people, mainly children, each year. Resistance has emerged to all classes of antimalarial drugs except the artemisinins and is responsible for a recent increase in malaria-related mortality, particularly in Africa. The de novo emergence of resistance can be prevented by the use of antimalarial drug combinations. Artemisinin-derivative combinations are particularly attractive, since the artemisinins act rapidly, are well tolerated, and are highly effective. Widespread use of these drugs could roll back malaria.
Malaria is a hematoprotozoan parasitic infection transmitted by certain species of anopheline mosquitoes (Figure 1). Four species of plasmodium commonly infect humans, but one, Plasmodium falciparum, accounts for the majority of instances of morbidity and mortality. There has been a resurgence of interest in malaria in recent years as the immensity of the burden it imposes on poor countries in the tropics has become apparent, and as efforts at control have foundered after the failure of the global eradication campaign in the 1960s. Control has traditionally relied on two arms: control of the anopheline mosquito vector through removal of breeding sites, use of insecticides, and prevention of contact with humans (via the use of screens and bed nets, particularly ones that are impregnated with insecticides); and effective case management. A long-hoped-for third arm, an effective malaria vaccine, has not materialized and is not expected for another decade. Case management has relied largely on antimalarials (mainly chloroquine, and more recently sulfadoxine-pyrimethamine [SP]), which are inexpensive and widely available and are eliminated slowly from the body. Together with antipyretics, antimalarials are among the most commonly used medications in tropical areas of the world. Misuse is widespread. In many parts of the tropics, the majority of the population has detectable concentrations of chloroquine in the blood. The extensive deployment of these antimalarial drugs over the past fifty years has provided tremendous selection pressure on human malaria parasites to evolve mechanisms of resistance (Table 1). The emergence of resistance, particularly in P. falciparum, has been a major contributor to the global resurgence of malaria in the last three decades (1). Resistance is the most likely explanation for a doubling of malaria-attributable child mortality in eastern and southern Africa (2).
P. falciparum is now highly resistant to chloroquine in most malaria-affected areas. Resistance to SP is also widespread and has developed much more rapidly. Resistance to mefloquine is confined to areas where it has been used widely (Thailand, Cambodia, and Vietnam) but arose there within six years of systematic deployment (3). The epidemiology of resistance in Plasmodium vivax is less well studied; chloroquine resistance is serious only in parts of Indonesia, Papua New Guinea, and adjacent areas. SP resistance in P. vivax is more widespread.
Unfortunately, most malaria-affected countries have less than $10 per capita annually to spend on all aspects of health, and so, for a disease that is one of the most common causes of fever, a treatment cost of more than 50 cents becomes prohibitive. As a result, the nationally recommended treatment in most countries remains an antimalarial drug (i.e., chloroquine or SP) that is partially or completely ineffective. The effects of resistance on morbidity and mortality are usually underestimated (4, 5). Predicting the emergence and spread of resistance to current antimalarials and newly introduced compounds is necessary for planning malaria control and instituting strategies that might delay the emergence of resistance (6). Resistance has already developed to all the antimalarial drug classes with one notable exception: the artemisinins. These drugs are already an essential component of treatments for multidrug-resistant falciparum malaria (7). If we lose the artemisinins to resistance, we may be faced with untreatable malaria. In this review, the emergence of resistance to current antimalarial drugs is considered in two parts: first, the initial genetic event that produces the resistant mutant, and second, the subsequent selection process in which the survival advantage in the presence of the antimalarial drug leads to preferential transmission and the spread of resistance (8).
Most symptomatic malaria infections are uncomplicated and manifest as fever, chills, malaise, often abdominal discomfort, and mild anemia. In falciparum malaria, the mortality associated with this presentation is approximately 0.1% if effective drugs are readily available. In a small proportion of P. falciparum infections, untrammeled parasite multiplication leads to heavy parasite burdens, which produce vital-organ dysfunction with impairment of consciousness, acidosis, and more severe anemia. Seizures, hypoglycemia, and severe anemia are common manifestations of severe malaria in children, whereas jaundice, pulmonary edema, and acute renal failure are more common in adults. The mortality despite treatment rises to 15–20%. As death in severe malaria usually occurs within 48 hours of presentation, i.e., within one asexual cycle of the blood-stage infection, it is mainly the current generation of P. falciparum parasites (i.e., those present when the patient comes to medical attention) that determines whether the patient lives or dies, and so prevention of their maturation from the less pathogenic circulating ring stages (0–16 hours) to the more pathogenic sequestered stages is important. Stage specificity of drug action is therefore an important consideration. In uncomplicated malaria, by contrast, inhibition of parasite multiplication has greater importance, as this prevents progression to severe disease and leads to resolution of fever and other symptoms. Inhibition of parasite multiplication is a first-order process, which leads to a log-linear reduction in parasite numbers with time (9). Uninhibited blood-stage multiplication at 100% efficiency results in a parasite multiplication rate (PMR) equal to the median number of viable merozoites liberated by rupturing schizonts (5). In vivo efficiencies may exceed 50% in nonimmune patients, resulting in PMRs of approximately 10 per asexual cycle (10). Antimalarial drugs convert this expansion into a contraction, resulting in PMRs between 10⁻¹ and 10⁻⁴ per cycle. The reciprocals of these fractional PMRs are the parasite killing rates or parasite reduction ratios (11). The lowest killing rates (PMRs closest to 1) are obtained after therapy with drugs with relatively weak antimalarial activity, such as the tetracyclines, and the highest killing rates (PMRs approaching 10⁻⁴, i.e., parasite reduction ratios of about 10,000) are obtained with the artemisinin derivatives (Figures 2 and 3). Drug resistance to an anti-infective compound is defined by a right shift in the concentration-effect (dose-response) relationship (Figure 4). For uncomplicated malaria this refers to prevention of multiplication, and so, for any given free plasma concentration of antimalarial drug, there is less inhibition of parasite multiplication as resistance increases.
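To make this first-order kinetics argument concrete, the sketch below (illustrative only; the biomass and PMR values are round numbers consistent with the ranges quoted above, not data from any particular study) computes how parasite numbers change per 48-hour cycle and how many cycles of effective drug exposure are needed to clear a heavy infection:

```python
import math

def parasites_after_cycles(n0, pmr, cycles):
    """Total parasites after a number of 48-hour asexual cycles,
    assuming a constant parasite multiplication rate (PMR) per cycle."""
    return n0 * pmr ** cycles

def cycles_to_clear(n0, pmr):
    """Asexual cycles needed to bring the biomass below 1 parasite
    (meaningful only when PMR < 1, i.e., net parasite killing)."""
    assert 0 < pmr < 1, "clearance requires net killing (PMR < 1)"
    return math.log(n0) / -math.log(pmr)

n0 = 1e12  # illustrative biomass of a heavily parasitemic adult (~2% parasitemia)

# Untreated, efficient multiplication (PMR ~10 per cycle)
print(f"{parasites_after_cycles(n0, 10, 2):.0e}")  # ~1e+14 after 2 cycles (4 days)

# Weak antimalarial activity (tetracycline-like, PMR ~0.1): slow clearance
print(f"{cycles_to_clear(n0, 1e-1):.0f} cycles")   # ~12 cycles, i.e., ~24 days

# Artemisinin-like killing (PMR ~1e-4, PRR ~10,000 per cycle): rapid clearance
print(f"{cycles_to_clear(n0, 1e-4):.0f} cycles")   # ~3 cycles, i.e., ~6 days
```

Even at the highest killing rates, clearing a biomass of around 10¹² parasites takes about three asexual cycles, which is consistent with the need, discussed below, to maintain effective drug levels over several days.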
Pharmacodynamics: the parasite reductions produced by the different antimalarial drugs in vivo (in an adult patient with 2% parasitemia). Parasite reduction ratios (PRR; fractional reduction per asexual cycle) vary from less than 10 (antibiotics with antimalarial activity, antimalarials for which resistance is high grade) to 10,000 (artemisinin derivatives). In nonimmune patients, antimalarial drugs must remain above the minimum inhibitory concentration (MIC) until the infection is eradicated to ensure cure. Adapted with permission from Trends in Parasitology (60).
Pharmacokinetic properties of the generally available antimalarial drugs. The origin represents the maximum concentration (100%) achieved after a therapeutic dose. A, artemisinins; Q, quinine; P, pyrimethamine; C, chloroquine; M, mefloquine. Adapted with permission from Trends in Parasitology (60).
The dose-response curve in malaria. Increasing drug resistance leads to a rightward shift in the dose-response or concentration-effect relationship. The principal effect in uncomplicated malaria is parasite killing. This shift can be parallel, or the shape of the curve and the maximum effect can change. Adapted with permission from Trends in Parasitology (60).
The genetic events that confer antimalarial drug resistance (while retaining parasite viability) are spontaneous and rare and are thought to be independent of the drug used. They are mutations in, or changes in the copy number of, genes encoding either the drug's parasite target or the influx/efflux pumps that affect intraparasitic drug concentrations (Table 1). A single genetic event may be all that is required, or multiple unlinked events may be necessary (epistasis). As the probability of multigenic resistance arising is the product of the individual component probabilities, it is a correspondingly rarer event. P. falciparum parasites from Southeast Asia have been shown to have an increased propensity to develop drug resistance (12).
Chloroquine resistance in P. falciparum may be multigenic and is initially conferred by mutations in a gene encoding a transporter (PfCRT) (13). In the presence of PfCRT mutations, mutations in a second transporter (PfMDR1) modulate the level of resistance in vitro, but the role of PfMDR1 mutations in determining the therapeutic response following chloroquine treatment remains unclear (13). At least one other as-yet unidentified gene is thought to be involved. Resistance to chloroquine in P. falciparum has arisen spontaneously fewer than ten times in the past fifty years (14). This suggests that the per-parasite probability of developing resistance de novo is on the order of 1 in 10²⁰ parasite multiplications. The single point mutations in the gene encoding cytochrome b (cytB), which confer atovaquone resistance, or in the gene encoding dihydrofolate reductase (dhfr), which confer pyrimethamine resistance, have a per-parasite probability of arising de novo of approximately 1 in 10¹² parasite multiplications (5). To put this in context, an adult with approximately 2% parasitemia harbors about 10¹² parasites. In the laboratory, however, mutation rates much higher than 1 in 10¹² have been recorded (12).
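Because these per-parasite probabilities are assumed to apply independently to each parasite multiplication, the chance that a resistant mutant arises somewhere in an infection scales with the parasite biomass. A minimal sketch of that arithmetic, using the order-of-magnitude figures quoted above:

```python
import math

def prob_resistant_mutant(biomass, per_parasite_prob):
    """Probability that at least one resistant mutant arises among `biomass`
    parasite multiplications, each carrying an independent per-parasite
    probability of the resistance-conferring genetic event: 1 - (1 - p)^N."""
    return 1.0 - math.exp(biomass * math.log1p(-per_parasite_prob))

# Single point mutation (atovaquone- or pyrimethamine-like): ~1 in 10^12
print(prob_resistant_mutant(1e12, 1e-12))  # ~0.63: plausible within one heavy infection
print(prob_resistant_mutant(1e8, 1e-12))   # ~1e-4: unlikely at low biomass

# Multigenic, chloroquine-like resistance: ~1 in 10^20
print(prob_resistant_mutant(1e12, 1e-20))  # ~1e-8: vanishingly rare even at high biomass
```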
Mutations may be associated with fitness disadvantages (i.e., in the absence of the drug, the mutant parasites are less fit and multiply less well than their drug-sensitive counterparts). Another factor that may explain the discrepancy between in vitro and the much lower apparent in vivo rates of spontaneous mutation is host immunity. Even a previously nonimmune individual develops a specific immune response to a malaria infection. This response is systematically evaded by the parasite population through programmed antigenic variation of the main red cell surface–expressed epitopes. In falciparum malaria, P. falciparum erythrocyte membrane protein 1 (PfEMP1), which is encoded by the var multigene family, changes in 2–3% of parasites each asexual cycle (15). The untreated infection is characterized by successive waves of parasites, each comprising largely one antigenically distinct surface phenotype. This specific immune response directed against the immunodominant surface antigens is likely to reduce the probability that the usually single mutant parasite ever multiplies sufficiently to be transmitted: for P. falciparum, there is only a 2–3% chance that the genetic event causing resistance arises in the antigenically variant subpopulation that will expand to reach transmissible densities.
The cause of chloroquine resistance in P. vivax has not been found. Resistance to mefloquine and other structurally related arylaminoalcohols in P. falciparum results from amplifications (i.e., duplications, not mutations) in Pfmdr, which encodes an energy-demanding p-glycoprotein pump (Pgh) (16–19). This is a more common genetic event. It is tempting to speculate that the relatively poor fidelity in mitotic duplication of this sequence has evolved to allow parasite populations to respond to environmental stresses, such as alterations in human diet. But the gene amplifications may well confer a fitness disadvantage to the parasite once the population stress has passed. The consequences of these various genetic events are reduced intracellular concentrations of the antimalarial quinolines (the relative importance of reduced uptake and increased efflux remains unresolved). All these drugs interfere with the parasites’ ability to detoxify heme liberated from hemoglobin.
For P. falciparum and P. vivax, resistance to antifols (pyrimethamine and cycloguanil) results from the sequential acquisition of mutations in dhfr (13). Each mutation confers a stepwise reduction in susceptibility. Resistance to the sulfonamides and sulfones, which are often administered in synergistic combination with antifols, also results from sequential acquisition of mutations in the gene dhps, which encodes the target enzyme dihydropteroate synthase (20). Resistance to atovaquone results from point mutations in the gene cytB, coding for cytochrome b. Atovaquone is deployed only in a fixed combination with proguanil (chloroguanide). In this combination, it is proguanil itself acting on the mitochondrial membrane, rather than the dhfr-inhibiting proguanil metabolite cycloguanil, that appears to be the important actor. Whether and how resistance develops to proguanil’s mitochondrial action are not known (21). Although the target for the artemisinins has recently been identified (PfATPase6) (22), preliminary studies have not so far associated polymorphisms in the gene encoding this enzyme with reduced susceptibility to artemisinins (18).
Assuming an equal distribution of probabilities of spontaneous occurrence throughout the malaria parasites’ life cycle, the genetic event resulting in resistance is likely to take place in only a single parasite at the peak of infection. These genetic events may result in moderate changes in drug susceptibility, such that the drug still remains effective (e.g., the serine-to-asparagine mutation at position 108 in Pfdhfr that confers pyrimethamine resistance), or, less commonly, very large reductions in susceptibility, such that achievable concentrations of the drug are completely ineffective (e.g., the mutations in cytB that confer atovaquone resistance) (16, 21, 23). It had been thought that resistance to some antimalarial compounds (notably pyrimethamine and SP) in human malaria parasites emerged relatively frequently. This suggested that prevention of the emergence of resistance would be very difficult, and that control efforts would be better directed at limiting the subsequent spread of resistance. Recent remarkable molecular epidemiological studies in South America, southern Africa, and Southeast Asia have challenged this view. Examination of the sequence of the regions flanking the Pfdhfr gene has made it apparent that, even for SP, multiple de novo emergence of resistance has not been a frequent event and that, instead, a single parasite lineage (with mutations in Pfdhfr at positions 51, 59, and 108) has in recent years swept across each of these continents (24–26). The ability of these resistant organisms to spread has been phenomenal and may well relate to the apparent stimulation of gametocytogenesis that characterizes poor therapeutic responses to SP (27). Gametocyte carriage is considerably augmented following SP treatment of resistant infections. Studies to date do not suggest reduced infectivity for these gametocytes. There is a sigmoid relationship between gametocyte densities in blood and infectivity, which in volunteer studies was shown to saturate at gametocyte densities above 1,000 per microliter (a relatively high density in field observations). Thus it is the relative transmission advantage conferred by increased gametocyte carriage that drives the spread of resistance (5, 8).
In experimental animal models, drug resistance mutations can be selected for, without mosquito passage (i.e., without meiotic recombination), by exposure of large numbers of malaria parasites (in vitro, in animals, or, in the past, in volunteers) to subtherapeutic antimalarial drug concentrations (28).
In order to assess the factors determining the emergence and spread of resistance, we need to consider the numbers of malaria parasites likely to be exposed to the drugs, both within an individual and in the entire human population. Fortunately, this estimate of parasite numbers is much more precise than for almost any other human pathogen. Malaria parasites are eukaryotes. Meiosis occurs after a female anopheline mosquito has taken viable gametocytes in its blood meal. All the other 10⁸–10¹³ cell divisions in the life cycle are mitotic. Nearly all these divisions take place in the bloodstream of the human host. Usually, fewer than ten sporozoites are inoculated by an infected mosquito to establish malaria infection (29, 30) (Figure 1). These rapidly find their way to the liver. During P. falciparum infection, each infected hepatocyte liberates approximately 30,000 merozoites after 5–6 days of pre-erythrocytic schizogony. Thus approximately 100,000–300,000 merozoites are liberated into the bloodstream to begin the 48-hour asexual reproduction cycle. This is an important number, as it is the number of parasites that would encounter residual drug levels from a previous antimalarial treatment or drug levels during chemoprophylaxis (see below) (8). The density of parasites in the blood at which symptoms and fever occur (the pyrogenic density), and thus the stage at which appropriate antimalarial treatment could be given, vary considerably (31–33). In nonimmune people, nonspecific symptoms often occur a day or two before parasites are detectable on the blood smear (about 50 parasites per microliter of blood). This density corresponds to a total of between 10⁸ and 10⁹ asexual parasites in an adult with a red cell volume of about 2 liters. In areas of moderate- or high-intensity transmission, parasitemias considerably higher than this level may be tolerated without symptoms, although densities over 10,000 per microliter (between 10¹⁰ and 10¹¹ parasites in the body of an adult, and correspondingly fewer in children) are usually symptomatic, even in very high-transmission settings (34). Median or geometric mean parasite counts in malariometric surveys are usually below this value (i.e., most people with detectable parasitemias in these endemic areas are not obviously ill). It is estimated that approximately 300 million people in the world now have malaria parasites in their blood. Using current epidemiological data, we have estimated that there must be fewer than 3 × 10¹⁶ malaria parasites in the world’s asymptomatic carriers (Figure 5) (8).
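The within-host totals quoted above follow from simple arithmetic on parasite density and blood volume. A short sketch, assuming an adult blood volume of roughly 5 liters (the 2-liter figure above refers to red cell volume):

```python
def total_parasites(density_per_uL, blood_volume_liters):
    """Total circulating asexual parasites = density (per microliter of blood)
    multiplied by the blood volume expressed in microliters."""
    return density_per_uL * blood_volume_liters * 1e6

# Microscopy detection threshold (~50 parasites/uL) in an adult
print(f"{total_parasites(50, 5):.1e}")          # ~2.5e8, within the quoted 1e8-1e9 range

# Usually symptomatic densities (~10,000/uL)
print(f"{total_parasites(10_000, 5):.1e}")      # ~5e10, within the quoted 1e10-1e11 range

# ~2% parasitemia (~2% of ~5e6 red cells per microliter infected)
print(f"{total_parasites(0.02 * 5e6, 5):.1e}")  # ~5e11, the order of the ~1e12 quoted earlier
```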
Total numbers of malaria parasites (log scale), from inoculation by an anopheline mosquito, through the development of infection in the human host, to the total estimated in the world today.
Geometric mean or median admission parasitemias in clinical studies of falciparum malaria usually lie between 5,000 and 50,000 per microliter, with the lower figure coming from low-transmission settings and the higher figure from high-transmission settings. Thus, if between one million and ten million people are symptomatic in any 2-day period (i.e., 180 million to 1,800 million symptomatic infections per year), then, based on their age distribution and their blood volumes and parasitemias, these ill people would contain between 5 × 10¹⁶ and 5 × 10¹⁷ malaria parasites (8).
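A sketch of the population-level arithmetic behind these global estimates. The per-person burdens below are hypothetical round numbers chosen to reproduce the quoted ranges; the published estimate additionally weights for age distribution and blood volume:

```python
# Asymptomatic carriers: ~300 million people, each harboring on the order of 1e8 parasites
asymptomatic_total = 3e8 * 1e8
print(f"asymptomatic carriers: ~{asymptomatic_total:.0e} parasites")  # ~3e+16

# Symptomatic patients: 1-10 million people ill in any 2-day period, each assumed
# to harbor ~5e10 parasites (roughly 10,000/uL in an adult-sized blood volume)
for n_ill in (1e6, 1e7):
    print(f"{n_ill:.0e} symptomatic people: ~{n_ill * 5e10:.0e} parasites")  # 5e+16 to 5e+17
```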
Thus, on any day, although the majority of people infected with malaria are asymptomatic, a significant proportion, and probably the majority, of the malaria parasites in the world are in people who are ill. It has been argued that, if the probability of a de novo resistance mutation arising is distributed evenly among all these parasites, then, because of their logarithmic distribution, those patients with high parasitemias who survive their infection to transmit viable gametocytes carry a significant proportion of all the world’s “potentially transmissible” malaria parasites (8). They must therefore be an important potential source of resistance. Although mortality is increased in hyperparasitemic infections (and death stops transmission), if the patient survives, the chance of selection and preferential survival of a drug-resistant mutant from this patient is greater than if a similar mutant arose in a person with a much lower parasitemia. This is because a hyperparasitemic patient has limited immunity, particularly against the “strain” causing the infection (otherwise a high parasitemia would not have developed). The immune response kills parasites irrespective of their sensitivity to antimalarial drugs. A hyperparasitemic patient will receive antimalarial drug treatment, whereas nearly all asymptomatic carriers will not, and the hyperparasitemic patient may also be seriously ill, with vomiting and consequent malabsorption of the antimalarial treatment. High parasite counts are associated with a higher chance of treatment failure than infections with lower parasite numbers (35).
In the emergence and spread of resistance to antimalarial drugs, there are many parallels with antibiotic resistance (36, 37), particularly antituberculous drug resistance, where, as for malaria, transferable resistance genes are not involved in the emergence of resistance. Resistance to one drug may be selected for by another drug for which the mechanism of resistance is similar (cross-resistance). Antimalarial resistance spreads because it confers a survival advantage in the presence of the antimalarial and therefore results in a greater probability of transmission for resistant than for sensitive parasites. Resistant infections are more likely to recrudesce, and eventually, as resistance worsens, infections with resistant parasites respond more slowly to treatment. Both increased rates of recrudescence and slow initial responses to treatment increase the likelihood of generating gametocyte densities sufficient for transmission, compared with drug-sensitive infections. Mathematically, it is this ratio of transmission probabilities in drug-resistant compared with drug-sensitive infections that drives the spread of resistance. The recrudescence and subsequent transmission of an infection that generated resistant malaria parasites de novo are essential for resistance to be propagated (5). If resistance is low grade (i.e., a small shift in the concentration-effect relationship), or if a highly effective combination treatment is given, then resistance may confer only a very small increase in the treatment failure rate, and a correspondingly slow rate of spread. As resistance worsens, failure rates rise, and the rate of spread accelerates. In the rare but important infection in which resistance arises de novo, killing of the transmissible sexual stages (gametocytes) during the primary infection does not affect the spread of resistance, because these gametocytes derive from drug-sensitive parasites. Gametocytes carrying the resistance genes will not reach transmissible densities until the resistant biomass has expanded to a population size close to that necessary to produce illness (>10⁷ parasites) (38). Thus, to prevent the spread of resistance, gametocyte production from the subsequent recrudescent, resistant infection must be prevented.
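A toy model (not taken from the article) of the point that the ratio of transmission probabilities drives spread: if resistant infections are transmitted onward r times more successfully than sensitive ones, the resistant fraction of infections follows a simple replicator-style update each transmission generation.

```python
def next_resistant_fraction(p, r):
    """One transmission generation: resistant infections transmit r times
    more successfully than sensitive ones."""
    return p * r / (p * r + (1.0 - p))

def generations_until(p0, target, r):
    """Transmission generations until the resistant fraction exceeds `target`."""
    p, generations = p0, 0
    while p < target:
        p = next_resistant_fraction(p, r)
        generations += 1
    return generations

p0 = 1e-6  # a rare resistant lineage that has just established itself

# Low-grade resistance or a highly effective combination: tiny transmission advantage
print(generations_until(p0, 0.5, 1.05))  # hundreds of generations: slow spread

# High-grade resistance against a failing monotherapy: large transmission advantage
print(generations_until(p0, 0.5, 2.0))   # ~20 generations: rapid spread
```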
Immunity to malaria is acquired slowly and imperfectly. A state of sterile immunity against all infections is never attained. Malaria parasites, like other successful parasites, have developed sophisticated immune-evasion strategies. In low-transmission areas, where infections are acquired infrequently (e.g., less than three times a year), the majority of malaria infections are symptomatic and selection of resistance therefore takes place in the context of specific antimalarial treatment. Relatively large numbers of parasites in an individual encounter antimalarial drugs. In higher-transmission areas the majority of infections are asymptomatic and these are acquired repeatedly throughout life. Symptomatic and sometimes fatal disease occurs in the first years of life, but thereafter malaria becomes increasingly likely to be asymptomatic. In areas of higher malaria transmission, people still receive antimalarial treatments throughout their lives, as the term malaria is often used to describe any type of fever, and most treatment is given empirically without microscopy or dipstick confirmation. But these inappropriate treatments for other febrile illnesses are largely unrelated to the peaks of parasitemia, which reduces the individual probability of resistance selection (8). Host defense mechanisms contribute a major antiparasitic effect, with which any spontaneously generated drug-resistant mutant malaria parasite must contend. This reduces significantly the survival probability of individual malaria parasites. Even if the resistant mutant does survive the initial drug treatment and multiplies, the chance that this will result in sufficient gametocytes for transmission is reduced as a result of both asexual-stage immunity (which reduces the multiplication rate and lowers the density at which the infection is controlled) and specific antigametocyte (transmission-blocking) immunity. Furthermore, other parasite genotypes are likely to be present, since infections are acquired continuously. These compete with the resistant parasites for red cells and increase the possibility of outbreeding of multigenic-resistance mechanisms or competition in the feeding anopheline mosquito. These factors, which reduce the probability of selecting for and transmitting resistance in high transmission areas, are balanced against the increased frequency of vector biting, and thus the increased probability that a feeding anopheline will encounter the resistance-bearing gametocytes. In some areas of the tropics, new malaria infections are acquired more than once each day. Even if the resistance-bearing parasites do establish themselves in the anopheline mosquito, they must still be transmitted to a susceptible recipient for resistance to spread. In areas where the majority of the population is immune, the individual probability of propagation is reduced, as inoculation in a subsequent mosquito-feeding event often does not result in an infection capable of being transmitted (i.e., an infection generating sufficient gametocytes for onward transmission).
In high-transmission areas, where malaria-associated illness and death are largely confined to young children, the chance of a drug encountering large numbers of parasites in a semi-immune host is confined to the first few years of life. The net result is considerable reduction in the probability of de novo selection and subsequent transmission of a resistant parasite mutant in high-transmission compared with low-transmission areas. Historically, chloroquine resistance emerged in low-transmission areas, and antifol resistance has increased more rapidly in low-transmission than in high-transmission areas.
The control of malaria infections is impaired in pregnancy. In low-transmission settings, P. falciparum infections are more severe, but at all levels of transmission there is an associated reduction in birth weight of infants born to mothers with malaria (both with P. falciparum and with P. vivax infection). For P. falciparum the adverse effects are greatest in primigravidae (39). The placenta is a site of P. falciparum sequestration and appears to be a “privileged” site for parasite multiplication, although exactly how this local immune paresis to malaria parasites operates is unclear. This has implications for the greater emergence and spread of resistance, which have not been evaluated. Responses to antimalarial drug treatment regimes in low-transmission settings are always worse in pregnant women compared with age-matched nonpregnant women from the same location (40). Treatment failures drive the development of resistance. The placenta may contain large numbers of parasites, thereby increasing the selection probability. These parasites are usually of a single surface-antigen phenotype (they bind to chondroitin sulphate A, and hyaluronic acid), suggesting expression of a single conserved var gene (41). After establishment of the infection in a pregnant woman who has not had malaria in pregnancy before (usually a primigravida in an endemic area), the infecting parasites are not apparently selected by the immune response to surface-expressed antigens, and so if a drug-resistant mutation arises it does not need to arise in a variant subpopulation to ensure its survival. Other factors also favor the emergence and spread of resistance. Antimalarial drug pharmacokinetics are usually altered, often with an expanded apparent volume of distribution (quinine, mefloquine, atovaquone, and proguanil), resulting in lower drug levels for any given dose. There are even data suggesting that pregnant women are more attractive to mosquitoes (42). It is widely recommended that pregnant women receive antimalarial prophylaxis, but the only drugs considered safe are chloroquine, which is ineffective against P. falciparum nearly everywhere, and proguanil, to which widespread resistance exists, and which has reduced biotransformation to the active antifol metabolite cycloguanil. Prophylaxis for pregnant women has given way to intermittent presumptive treatment (IPT) with SP, in which a treatment dose is given two or three times during the pregnancy, although SP is falling to resistance. Since in IPT the antimalarial drug is usually administered to healthy women, the biomass of parasites confronted by the drug is less than that present in symptomatic infections, but how much less has not been investigated. Taken together, these observations suggest that pregnant women could be an important contributor to antimalarial drug resistance.
There is now increasing evidence of an interaction between falciparum malaria and HIV infection. In settings of high malaria transmission, malaria is largely a problem of childhood, whereas HIV has higher mortality rates in infants and adults. But with the increasing availability of antiretroviral drugs, HIV-infected patients will live longer, and so the two infections will coincide more often. HIV coinfection in pregnancy is associated with a greater reduction in birth weight than that associated with malaria infection alone (43). In HIV-infected pregnant women, IPT with SP must be given monthly to achieve the same improvements in birth weight that administration every 8–12 weeks achieves in HIV-negative pregnant women. Compared with HIV-negative nonimmune patients, more severe malaria is seen in HIV-infected nonimmune patients, and severely immunocompromised HIV-infected patients in high-transmission settings have higher parasite densities (44, 45). This suggests that the immunosuppression associated with HIV infection can affect the control of malaria-parasite numbers and would therefore compromise the effect of antimalarial immunity in reducing the selection and spread of antimalarial drug resistance. Trimethoprim-sulfamethoxazole is widely given to patients with HIV/AIDS as prophylaxis against opportunistic infections. This antifol-sulfonamide combination is also antimalarial. Whether this promotes the emergence of antifol resistance or delays it (by reducing malaria attacks) is not known. Data on these increasingly important interactions are insufficient.
Antimalarial resistance is selected by drug concentrations sufficient to inhibit multiplication of sensitive, but not of resistant, parasites. The parasites are present in the blood, and therefore it is the concentration of free (unbound) drug in plasma that is most therapeutically relevant. A number of behavioral, pharmaceutical, and pharmacokinetic factors affect the probability of parasites encountering subtherapeutic levels of antimalarial agents. Several antimalarial drugs (notably lumefantrine, halofantrine, atovaquone, and, to a lesser extent, mefloquine) are lipophilic, hydrophobic, and quite variably absorbed (interindividual variation in bioavailability can be as much as 20-fold) (46, 47). There is also large interindividual variability in distribution volumes. Together these result in considerable interindividual variation in blood concentration profiles (48). Since doses are chosen on the basis of the therapeutic ratio (the difference between a therapeutically effective dose and a dose capable of inducing adverse effects), poor oral bioavailability, with its consequently wide range in blood levels, favors the emergence of resistance. Improving oral bioavailability thus reduces the doses required to clear infection (and thus reduces costs) and should reduce the emergence and spread of resistance.
All people living in a high-transmission area have some malaria parasites in their blood all the time, and each person harbors many different parasite genotypes (although many are at densities below the level of PCR detection). Antimalarial treatment for symptomatic malaria exposes not only the parasites causing that infection to the drug, but also any newly acquired infections that emerge from the liver during the drug’s elimination phase. The longer the terminal elimination half-life of the drug, the greater is the probability that any newly acquired parasite will encounter a partially effective (i.e., selective) drug concentration (49–51). The length of the terminal elimination half-life is therefore an important determinant of the propensity for an antimalarial drug to become ineffective because of the development of resistance, provided that the concentrations in the terminal phase traverse the steep part of the sigmoid concentration-effect relationship for the prevalent malaria parasites. This caveat is important since there has been a tendency to concentrate on the terminal-phase half-life as the main determinant of the rate of spread of resistance, but this is true only if this phase is “selective.” For example, as chloroquine resistance increases, the chloroquine terminal elimination phase increasingly encompasses ineffective drug concentrations; it therefore no longer selects for higher levels of resistance (Figure 6). Some antimalarial drugs, e.g., the artemisinin derivatives, are never presented to infecting malaria parasites at intermediate selective drug concentrations, because they are eliminated completely within the 2-day life cycle of the asexual parasite. Other drugs (e.g., mefloquine, piperaquine, and chloroquine) have elimination half-lives of weeks or months. The prolonged presence of these drugs in the host’s blood provides a lengthy exposure time in which resistant parasites may be selected. The probability of achieving a selective drug concentration in the plasma, and thus preferential survival of a resistant parasite during the elimination phase, depends on the degree of rightward shift in the concentration-effect relationship curve, its slope, and the duration of the elimination phase of the drug (5). The probability of subsequent transmission depends on the level of immunity; subsequent drug exposure; parasite multiplication capacity, which must take into account any fitness disadvantage conferred by the resistance mechanism; the reduction in antimalarial susceptibility, i.e., degree of resistance, conferred by the resistance mechanism; and intrahost competition from coexistent drug-sensitive parasites.
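The dependence of this selective window on the terminal half-life can be sketched with simple mono-exponential elimination. The peak concentration and MIC values below are arbitrary placeholders, and real pharmacokinetics are multi-compartmental, so this is only an order-of-magnitude illustration:

```python
import math

def selective_window_days(c0, mic_resistant, mic_sensitive, half_life_days):
    """Days spent between the resistant and sensitive parasites' MICs,
    assuming first-order (mono-exponential) elimination from peak c0."""
    assert c0 > mic_resistant > mic_sensitive > 0
    time_to = lambda c: half_life_days * math.log2(c0 / c)
    return time_to(mic_sensitive) - time_to(mic_resistant)

# Placeholder values: peak 100 (arbitrary units), resistant MIC 10, sensitive MIC 1
print(selective_window_days(100, 10, 1, half_life_days=0.05))  # artemisinin-like: a few hours
print(selective_window_days(100, 10, 1, half_life_days=14))    # mefloquine-like: ~46 days
print(selective_window_days(100, 10, 1, half_life_days=45))    # chloroquine-like: ~150 days
```

The same arithmetic shows why the window closes once the prevalent parasites' MIC rises above the residual concentrations, as described for chloroquine above (Figure 6).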
A slowly eliminated antimalarial such as chloroquine or piperaquine presents a lengthy opportunity for the selection of resistance among sensitive parasites (MIC A), but once resistance has become established (MIC B), the terminal elimination phase is no longer selective, because the blood concentrations are no longer inhibitory.
It has been suggested that the repeated exposure of parasite populations to residual drug concentrations of slowly eliminated drugs in areas of frequent infection is an important source of resistance (52). But this argument did not take into account the numbers of parasites involved (Figure 7). Selection of de novo resistance from an infection that emerges from the liver during the elimination phase of antimalarial treatment given to treat a previous infection (or during prophylaxis) would usually occur in the first generation of blood-stage malaria parasites, a total of approximately 10⁵ parasites. This is because, for a resistant mutant to arise, and survive, from a larger number of parasites in generations subsequent to the first following hepatic schizogony, the antimalarial blood concentrations must have fallen below the minimum inhibitory concentration (MIC; the concentration associated with a multiplication rate of 1) for the sensitive parasites; otherwise their numbers would not have increased. If antimalarial concentrations exceed the sensitive parasites’ MIC, then total parasite numbers will fall, and the chance of resistance selection in subsequent generations will fall in parallel. Assuming an equal probability of mutations arising among blood-stage parasites, the probability of resistance arising during the first asexual cycle following emergence from the liver (10⁵ parasites) is therefore between 1,000 and 10⁷ times lower than in a symptomatic infection. To put this in context, if an individual acquired 20 symptomatic and potentially transmissible infections per year for fifty years, the de novo selection probability from residual drug exposure to newly acquired infections over that half-century would still be less than 1% of that in a single symptomatic infection of 10¹² parasites. Taken together, the balance of evidence strongly favors acute symptomatic infection as the source of de novo antimalarial resistance. But the long elimination phase of some antimalarial drugs does provide a very efficient selective filter for resistant infections acquired from elsewhere, as it allows resistant infections to develop and then spread while suppressing sensitive infections. This selectively amplifies resistance. Thus, although it is a very unlikely source of de novo resistance, the duration of an antimalarial drug’s elimination phase is an important determinant of the spread of antimalarial drug resistance (49–51). These calculations also suggest that antimalarial prophylaxis regimes, when adhered to, and mass treatment with effective drugs would not be major contributors to resistance (53), unlike mass continuous administration of subtherapeutic doses, as in medicated table salt (the Pinotti method), which was disastrous (54).
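Under the stated assumption that the chance of a de novo resistance event is proportional to the number of parasites exposed, the comparison reduces to a ratio of parasite numbers. A sketch of that arithmetic with the figures quoted above:

```python
liver_emergent = 1e5      # first blood-stage generation emerging from the liver
symptomatic_low = 1e8     # lower end of symptomatic infections
symptomatic_high = 1e12   # heavy symptomatic infection

# Per-infection comparison: residual-drug exposure vs. a treated symptomatic infection
print(symptomatic_low / liver_emergent)   # 1e3: 1,000 times lower
print(symptomatic_high / liver_emergent)  # 1e7: 10 million times lower

# Cumulative lifetime exposure: 20 newly acquired infections per year for 50 years,
# each exposing ~1e5 parasites to residual drug, vs. one heavy symptomatic infection
cumulative = 20 * 50 * liver_emergent
print(cumulative / symptomatic_high)      # ~1e-4, i.e., well under 1% of the risk
```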
Opportunities for the de novo selection of antimalarial drug resistance in an area of high transmission (entomological inoculation rate 50 per year; each inoculation is depicted as a green arrow) in a young child treated for acute falciparum malaria with a slowly eliminated drug such as mefloquine (red dotted line). The initial infection (infection 1) is eliminated. The next infection acquired (infection 2) is also eliminated. Infections 3 and 4 are suppressed temporarily but eventually reach detectable densities. Infections 5 and 6 are under no selection pressure and also reach detectable densities. The inset shows the pharmacodynamic events, i.e., the relationship between concentration (C) and effect (E). When mefloquine levels fall below the minimum parasiticidal concentration (MPC) that gives maximum parasite killing (Emax), the rate of decline in parasitemia (PRR) falls, reaching 1 when concentrations fall to the MIC of mefloquine; this occurs in infection 3. Thereafter, parasitemia rises again and becomes detectable nearly 6 weeks after the initial treatment.
The theory underlying combination drug treatment of tuberculosis, leprosy, and HIV infection is well known and is now generally accepted for malaria (5, 8, 55–58). If two drugs with different modes of action, and therefore different resistance mechanisms, are used together, then the per-parasite probability of developing resistance to both drugs is the product of their individual per-parasite probabilities. This is particularly powerful in malaria, because there are only about 10¹⁷ malaria parasites in the entire world. For example, if the per-parasite probabilities of developing resistance to drug A and drug B are both 1 in 10¹², then a simultaneously resistant mutant will arise spontaneously once in every 10²⁴ parasites. As there is a cumulative total of fewer than 10²⁰ malaria parasites in existence in one year, such a simultaneously resistant parasite would arise spontaneously roughly once every 10,000 years, provided the drugs always confronted the parasites in combination. Thus the lower the de novo per-parasite probability of developing resistance, the greater the delay in the emergence of resistance.
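The combination argument is simply the multiplication of independent per-parasite probabilities; a sketch of the arithmetic quoted above:

```python
p_drug_a = 1e-12   # per-parasite probability of de novo resistance to drug A
p_drug_b = 1e-12   # per-parasite probability of de novo resistance to drug B

p_both = p_drug_a * p_drug_b          # 1e-24, assuming independent mechanisms

parasites_per_year = 1e20             # upper bound on cumulative parasites per year
expected_events_per_year = parasites_per_year * p_both

print(p_both)                         # 1e-24
print(1 / expected_events_per_year)   # ~10,000 years between dual-resistance events
```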
Stable, therapeutically significant resistance to the artemisinin derivatives has not yet been identified and cannot yet be induced in the laboratory, which suggests that it may be a very rare event. But it would be foolish to bank on its not happening, and should it arise, it would be a global disaster. For mutual protection against the emergence of drug resistance, these drugs should be used only in combination with other antimalarials.
Artemisinin derivatives are particularly effective in combinations because of their very high killing rates (parasite reduction ratios ∼10,000-fold per cycle), lack of adverse effects, and absence of significant resistance (11). The ideal pharmacokinetic properties for an antimalarial drug have been greatly debated. From a resistance-prevention perspective, the combination partners should have similar pharmacokinetic properties. Rapid elimination ensures that the residual concentrations do not provide a selective filter for resistant parasites, but rapidly eliminated drugs (if used alone) must be given for 7 days, and adherence to 7-day regimens is poor. Even 7-day regimens of artemisinin derivatives are associated with approximately 10% failure rates. In order to be highly effective in a 3-day regimen, the terminal elimination half-life of at least one drug component needs to exceed 24 hours. An artemisinin derivative (which is eliminated very rapidly) given for 3 days in combination with a slowly eliminated drug such as mefloquine (artemisinin combination treatment) provides complete protection of the artemisinin derivative from selection of a de novo resistant mutant if adherence is good (i.e., no parasite is exposed to artemisinin during one asexual cycle without mefloquine being present). But this does leave the slowly eliminated “tail” of mefloquine unprotected by the artemisinin derivative. The residual number of parasites exposed to mefloquine alone, following two asexual cycles, is a tiny fraction (less than 0.00001%) of those present at the peak of the acute symptomatic infection. Furthermore, these residual parasites are exposed to relatively high levels of mefloquine, and, even if their susceptibility were reduced, these levels are usually sufficient to eradicate the infection. The long “tail” of the mefloquine elimination phase does, however, provide a selective filter for resistant parasites acquired from elsewhere and, as described earlier, contributes to the spread of resistance once it has developed. Yet on the northwestern border of Thailand, an area of low transmission where mefloquine resistance had already developed, systematic deployment of artesunate-mefloquine combination therapy was dramatically effective, both in halting the progression of resistance and in reducing the incidence of malaria (3, 59). This strategy would also be expected to prevent the de novo emergence of resistance at higher levels of transmission, where high-biomass infections still constitute the major source of de novo resistance.
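The size of this unprotected "tail" can be made concrete with the killing rates quoted above (an illustrative calculation with round numbers, not a measurement):

```python
peak_biomass = 1e12      # heavy acute symptomatic infection
prr_artemisinin = 1e4    # ~10,000-fold reduction per 48-hour asexual cycle

# A 3-day artemisinin course covers roughly two asexual cycles
residual = peak_biomass / prr_artemisinin ** 2
print(f"{residual:.0e} parasites remain")                    # ~1e+04
print(f"{residual / peak_biomass:.0e} of the peak biomass")  # 1e-08, i.e., <0.00001%

# Only this residuum meets the slowly eliminated partner drug alone; with a
# per-parasite resistance probability of ~1e-12 (single point mutation), the chance
# of de novo resistance to the partner arising in it is ~1e-8 per treated patient.
print(residual * 1e-12)                                      # ~1e-08
```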
The main obstacles to the success of combination treatment in preventing the emergence of resistance will be incomplete coverage, or inadequate treatment, and, as for antituberculous drugs, use of one of the combination partners alone. Drugs of poor quality are common in tropical areas of the world, adherence to antimalarial treatment regimens is often incomplete, and antimalarials are available widely in the market place. Resistance to the artemisinins may not have happened yet. If it does, it will most likely arise in a hyperparasitemic patient who received an inadequate dose of a single antimalarial drug, not in combination with another suitable antimalarial agent. Irrespective of the epidemiological setting, ensuring that patients with high parasitemias receive a full course of adequate doses of artemisinin combination treatment would be an effective method of slowing the emergence of antimalarial drug resistance.
Nicholas J. White is a Wellcome Trust Principal Fellow.
Nonstandard abbreviations used: intermittent presumptive treatment (IPT); minimum inhibitory concentration (MIC); parasite multiplication rate (PMR); sulfadoxine-pyrimethamine (SP).
Conflict of interest: The author has declared that no conflict of interest exists.