Rewriting Life
Winning the War Against Malaria
It’s been a losing battle for half a century. We have many of the weapons we need to save the lives of millions of children, but to turn the tide we will have to marshal all of them.
Every 12 seconds, a child dies of malaria. New initiatives to prevent and treat malaria could save the lives of one-fourth of these children by the turn of the century.
When Americans think of childhood diseases, we rarely think of malaria. Yet it is a leading cause of death among the world’s children. More than 2.5 million die of malaria each year, most of them in Africa. And those who survive chronic infection suffer a combination of anemia and immune suppression that leaves them vulnerable to other fatal illnesses.
Among adults living in areas of high transmission, malaria is best thought of as a chronic, debilitating illness that robs its victims of years of productivity. A single mosquito bite can transmit one of the four parasites that cause malaria, setting in motion bouts of fever, chills, and nausea that can recur for weeks. And in some areas, people receive as many as 300 infective bites per year. According to a 1993 World Bank report, malaria represents a global public health burden second only to tuberculosis among infectious diseases. In sub-Saharan Africa, where most cases of malaria and nearly all malaria-related deaths occur, more years of life are lost to malaria than to any other disease.
Despite massive efforts to eradicate the disease in the 1950s and early 1960s, there is more human malaria in the world today than at any other time in history. More than 500 million people are infected with malaria worldwide; one fourth of the world’s population is at risk for infection. And the risk is rising as environmental changes and large-scale migration bring people and mosquitoes together and as parasites develop resistance to successive generations of drugs.
Malaria has confounded some of the best minds of this century. A hundred years after the discovery that mosquitoes transmit malaria, we still do not know enough about the disease to defeat it permanently. But we do have the tools to limit its spread and dramatically reduce the rate at which children are dying. Our goals should be to reduce childhood mortality from malaria by at least one-fourth before the turn of the century, by half during the first decade of the new century, and by more than 90 percent during its second decade. By reexamining both the successes and the failures of the past, we can develop a more effective, comprehensive public health strategy to contain and control this lethal opponent.
Learning from Failure
Renewed interest in malaria at home and abroad makes this a politically opportune time for new initiatives. Articles in both the popular press and scientific journals have called attention to the looming crisis posed by the disease. In January 1997, malaria experts from 35 countries and representatives from the major agencies that fund malaria research convened at an international conference to address the spread of the disease in Africa. And the World Health Assembly, the governing body of the World Health Organization (WHO), passed a resolution calling on member states to renew their political commitment to malaria control and to guarantee sufficient funding, staff, and other resources to sustain this effort.
This is not the first time that public and private agencies have geared up to assault the disease. In the 1950s, WHO, the United Nations International Children’s Emergency Fund (UNICEF), and the U.N. Food and Agriculture Organization enthusiastically declared that the time was right to eliminate malaria as a public health problem throughout the world. The malaria eradication programs they sponsored relied on a combination of prevention – spraying with the insecticide DDT – and early identification and treatment of infected individuals, deploying an arsenal of new antimalarial drugs, the best known of which was chloroquine. (Although WHO described this effort as a global eradication campaign, sub-Saharan Africa was not included in the early phases, probably because high transmission rates, the lack of administrative and financial resources, and the logistical problems of reaching rural areas were so daunting. Presumably, the plan was to include Africa after success had been demonstrated elsewhere.)
Despite its initial promise, the DDT campaign backfired. Programs in many malaria-endemic countries were unable to sustain the level of thoroughness and efficiency required to make residual insecticide spraying effective. The result was inadequate or erratic coverage. Mosquitoes that survived low doses of insecticide reproduced, creating populations of insecticide-resistant, malaria-carrying pests. In response to erratic spraying, mosquitoes simply changed their behavior – for instance, they stopped settling on the walls of houses that had been sprayed and moved to nearby vegetation that hadn’t.
Where the malaria eradication program worked, it soon became a victim of its own success. As the incidence of malaria became negligible in these areas, international organizations downgraded the disease as a priority health issue; at the national level, politicians and government agencies withdrew their support. The result was a dramatic resurgence of infection. In Sri Lanka, for instance, the incidence of malaria reached its lowest point in 1963, when 17 cases were reported. But by 1969, the number of registered cases had shot back up to more than half a million. Today Sri Lanka, like most other malarious countries, is still struggling to control the disease and has abandoned the goal of eradication. Overall, the eradication campaign showed little result outside the United States, Europe, and some parts of northern Africa.
Compounding the political failure of these early efforts was the emergence of drug resistance. As early as 1960, chloroquine-resistant strains of Plasmodium falciparum, the parasite that causes the most deadly form of malaria in humans, began to spread in Southeast Asia and South America. In Southeast Asia, resistance to second-generation drugs such as Fansidar emerged rapidly after their introduction for treatment of chloroquine-resistant infections. Resistance to both chloroquine and Fansidar has since spread to Africa, and infections with multidrug-resistant P. falciparum are now common in many areas where the parasite is endemic.
To the extent that the parasite represents a moving target for drugs and insecticides, no single chemical compound is likely to defeat it. In the 1950s, we had the best weapons against malaria that we’ve ever had, yet we failed to control the disease. The failure of these early eradication campaigns teaches that multiple strategies – and a sustained commitment of significant resources – are required to solve the problem.
Over the past two decades, however, we have failed to apply that lesson. Indeed, past interventions have done more to eradicate funding for malaria research than they have to eliminate the disease. In the initial years of the eradication effort, DDT appeared so promising that international agencies saw little need to study the disease further. Only in 1965, 15 years after the eradication program began, did WHO finally begin to encourage malaria research. The de-emphasis on science, combined with a decline in the number of malariologists, left countries ill-prepared to deal with the crisis we face today.
Research on malaria remains severely underfunded. According to a 1996 report released by the Unit for Policy Research in Science and Medicine (PRISM) of the Wellcome Trust, a private charity that is one of the major sponsors of biomedical research, expenditures for malaria research equal about $42 per death, while expenditures for research on diseases like AIDS, cancer, or asthma are 100 to 1,000 times higher. According to the PRISM report, in 1993 only $84 million was spent on malaria research worldwide. The largest share – more than one-quarter – was devoted solely to vaccine development. Indeed, for decades, the hope of defeating malaria has rested largely on the belief that a vaccine for the disease is just around the corner.
The Puzzle of Partial Immunity
For 40 years, scientists have labored to create a vaccine against malaria, inspired by the success of this approach against smallpox, yellow fever, and polio. However, unlike these other types of infections, malaria induces only partial immunity in those who contract the disease. After multiple bouts of the disease, a person develops enough immunity to prevent severe infection and mortality, but can still become ill. Moreover, even this limited immunity can be maintained only by frequent reinfection. Thus the ideal vaccine – one that could provide full and permanent protection from illness – would have to perform better than natural immunity, an ambitious goal for which there is no precedent. If researchers were to succeed in developing a vaccine that mimics naturally occurring immunity, it could save millions of children’s lives. It would not, however, eradicate malaria the way vaccines have eradicated smallpox.
No one knows the mechanism by which people eventually become partially immune to malaria. It may be related to the complexity of the parasite. Recent work has demonstrated that the malaria parasite expresses a repertoire of thousands of surface molecules, or antigens, which change constantly in the course of a single infection. The parasite therefore presents a moving target for the host immune system: by the time the host forms antibodies in response to one antigen, the parasite has already switched to a new one.
Despite these difficulties, scientists have come very close to developing vaccines that work. Their efforts have focused on the three major phases of the parasite's life cycle. The first type of vaccine targets the sporozoite, the form in which the parasite enters the host's body, in order to prevent it from establishing infection. A second type, exemplified by the SPf66 vaccine, seeks to destroy the parasite only after it has invaded the host's red blood cells – an approach that could establish partial but not full immunity. And the third type targets the oocyst, a stage in the life cycle of the parasite that occurs only in the mosquito. The aim of this so-called altruistic vaccine is to block transmission from human to human via the mosquito. When a mosquito feeds on the blood of a vaccinated human, she ingests not only the parasite but also the antibodies specific to the target antigens. These antibodies will prevent the parasite from developing and multiplying in the mosquito and subsequently being passed on to other humans. This kind of vaccine would not protect the vaccinated person from contracting malaria, but vaccinating enough people in a given area could substantially reduce the number of infective bites residents receive.
All of these vaccines have shown promising results in animals, but none have worked well in humans. No one knows why the parasite operates differently in humans than in animals, and it is not clear when or how we will bridge this gap in our knowledge. A 1991 report on malaria prevention and control by the Institute of Medicine (IOM) summarized the results of the malaria vaccine program with cautious optimism. What we have learned from these initial studies may help in designing the next generation of vaccines, but despite the talents, hopes, and funds devoted to this solution, an effective vaccine for malaria remains exactly where it has been for the past 40 years: tantalizingly out of reach.
Given that an effective vaccine is years if not decades away, and that the best vaccine is likely to have only limited impact in preventing illness, we need to step back and reemphasize both prevention and control. One very promising way to prevent malaria is to use mosquito nets treated with a safe, biodegradable pyrethroid insecticide to protect sleeping children. In the early 1990s, four large-scale, randomized, controlled community trials were conducted in four African countries representing different malaria risks – Burkina Faso, Ghana, Kenya, and the Gambia – to determine the impact of using treated nets on mortality rates of children younger than five. The three-year trials, conducted under the auspices of the U.N. Development Programme (UNDP), WHO, and the World Bank, involved nearly half a million people and 20 research institutes and donors. The results were dramatic: children’s deaths from all causes dropped 15 percent in Burkina Faso, 17 percent in Ghana, 33 percent in Kenya, and 25 percent in the Gambia. (An earlier trial in the Gambia cut child mortality 63 percent, but that study was based on 100 percent compliance. The larger studies evaluated the bed nets under real-life conditions, in which compliance ranged from 20 percent to 90 percent.) The magnitude of the reduction in mortality indicates either that malaria is the most important cause of death in the age groups included in the trials or that preventing malaria in young children somehow helps reduce their likelihood of dying from other diseases. These results suggest that if bed nets were made widely available, they could save the lives of up to 500,000 African children each year.
Bed nets are not yet widely available in Africa, and those that are cost between $25 and $30 each – well beyond the reach of the average family. However, locally manufactured nets could cost as little as $5; a year’s supply of insecticide costs between 50 cents and $1. Although this sum is still high relative to annual cash income (between $300 and $400 in some regions), there is reason to believe that most African families could afford it. According to WHO, African families spend up to $65 (or one-fifth of their income) each year on antimalarial drugs, mosquito coils, and insect repellents to protect themselves from malaria, with limited effect. If these expenditures could be redirected to reasonably priced insecticide-impregnated bed nets, overall family expenditures would actually decline.
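A rough back-of-the-envelope comparison, using only the figures cited above, illustrates why redirected spending could leave families ahead. The sketch below (in Python, purely illustrative) takes the most pessimistic case, assuming a family buys a new locally made net every year; actual prices and incomes vary widely by country and year.

```python
# Back-of-the-envelope comparison using the figures cited in the text
# (illustrative only; prices and incomes vary widely by country and year).
locally_made_net = 5.00        # locally manufactured bed net, USD
insecticide_per_year = 1.00    # upper end of the 50-cent-to-$1 range
current_spending = 65.00       # WHO estimate: drugs, coils, repellents per year

# Worst case: assume the family buys a new net every single year.
annual_net_cost = locally_made_net + insecticide_per_year

print(f"Treated net (replaced yearly): ${annual_net_cost:.2f} per year")   # $6.00
print(f"Current antimalarial spending: ${current_spending:.2f} per year")  # $65.00
print(f"Potential saving:              ${current_spending - annual_net_cost:.2f} per year")
```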
WHO is studying ways to promote and distribute insecticide-impregnated bed nets. Like insecticide spraying, distributing bed nets is a long-term commitment, so the key is to ensure that this intervention is sustainable. Rather than simply donating nets, international agencies should support local initiatives or otherwise work with local manufacturers to ensure that bed nets are available to all children at a cost their families can afford. WHO and other international agencies should also fund the research and development of new insecticides that can safely be used on bed nets, since sooner or later vector mosquitoes are likely to become resistant to the pyrethroid insecticides.
From Headache to Coma
Although improvements in prevention are promising, diagnosis and drug treatment remain the mainstay of malaria control worldwide. The keys to effective disease management are rapid diagnosis and proper treatment. A patient with malaria who comes to a clinic in the morning complaining of a headache may, if untreated, fall into a coma by midafternoon. The longer an episode of malaria goes untreated or ineffectively treated, the higher the mortality rate.
In much of the developing world, however, primary health care systems are unable to provide early diagnosis and treatment. Diagnosing malaria is difficult because the symptoms of infection, particularly early on, are nonspecific – fever, chills, headache. In rural settings, both diagnosis and treatment are often delayed because of the long distances patients must travel to reach a health center. In addition, effective antimalarial drugs may not be available in areas of drug resistance.
Workers at primary health care clinics, where most children with malaria are likely to seek treatment, must be trained to differentiate among illnesses with overlapping symptoms, including pneumonia, measles, malaria, diarrhea, and malnutrition, all of which are leading causes of death in children under five in developing countries. Right now, health workers receive training through a variety of nationally administered programs, each of which focuses on the diagnosis and treatment of a single disease. Health workers are left to develop their own methods for differentiating among diseases and setting priorities for treatment. Focusing on the most apparent problem may cause health workers to overlook an associated, potentially life-threatening condition. A health worker might prescribe an antibiotic for a child with a high fever and rapid breathing in the belief that the child has pneumonia, for instance, without realizing that the child is severely ill with malaria, which shares the same complex of symptoms.
To address this problem, WHO and UNICEF have developed a new approach to diagnosis and treatment called Integrated Management of Childhood Illnesses, which they are implementing on an experimental basis in clinics and health posts in selected districts in Uganda, Tanzania, the Philippines, Vietnam, and Indonesia. The goal is to shift resources and responsibility for training primary health care workers from the national disease-specific programs to the district level and to help health care providers accurately assess the overall needs of the sick child. Health workers are also trained to communicate key information to mothers, thus helping them ensure the health of their children. So far, it appears to be working: preliminary evaluation shows that clinic staff trained under the new approach make more accurate diagnoses and more appropriate referrals for sick children.
Beyond improving diagnosis and referral, we need to ensure the availability of effective treatment. Antimalarial drugs are among the most commonly prescribed drugs in the world, and not only because the disease is so widespread; health workers in endemic areas often overprescribe antimalarials as a result of improper diagnoses. What’s more, the few drugs we have are closely related to one another, increasing problems with drug resistance.
Today, resistance is emerging and spreading faster than new drugs can be developed. The newest antimalarials – Malarone, developed by Glaxo Wellcome, and drugs based on the ancient Chinese herbal remedy artemisinin – are the only ones that remain effective in areas most plagued by drug resistance, such as Thailand. Parasites resistant to these compounds have already emerged in the laboratory, and health agencies are closely monitoring their emergence in the field.
Given the speed with which parasites are becoming resistant and the length of time required to develop new drugs (even accelerated development takes 5 to 10 years from discovery to clinic), we face a looming crisis: multidrug-resistant malaria with no safe, effective alternatives for treatment. This problem exists today in Southeast Asia and will occur in most other malaria-endemic areas within the next decade.
Despite the obvious urgency of the situation, pharmaceutical companies are not developing new drugs. Over the past decade, the few major pharmaceutical companies that had antimalarial drug discovery and development programs have discontinued or downsized these programs, which were both costly and unprofitable. Today, only a few academic centers and government agencies are working on the discovery of antimalarial drugs; only a few new drugs in the late stages of clinical development remain in the pipeline. (Malarone still awaits final approval in some countries, and new formulations of artemisinin are still being tested.)
There is an urgent need to develop novel compounds, particularly compounds aimed at novel pathways – processes essential to the growth and development of the parasite that are not targets of current antimalarial drugs. Moreover, researchers must address the problem of drug resistance from the earliest stages of drug development. One strategy to prevent resistant organisms from emerging is to use multiple drugs targeted at different pathways, or at different steps in a single pathway. Another is to identify ways to interfere with the mechanisms that spur mutation or regulate gene expression. A third important target is a protein in the membrane of the parasite that allows the organism to recognize and pump out drugs. Once the protein is expressed, the parasite can resist multiple, unrelated drugs, even those to which it has not previously been exposed. Preventing the expression of this protein or blocking its pumping action is another way to prevent or even reverse resistance.
While previous efforts to develop new drugs and vaccines have been based on empirical observations, new drug development techniques are likely to provide the basis for the next leaps in our knowledge. Techniques such as combinatorial chemistry (creating new compounds by systematically combining sets of chemical groups) and computer-aided drug design (using x-ray crystallography and computer modeling to analyze protein-drug interactions) have accelerated the process of finding new drugs and vaccines for other diseases. We need to apply these tools to malaria before we face the widespread threat of untreatable disease. Since expertise in drug development is concentrated in the private sector, the participation of the pharmaceutical industry – perhaps supported by government funding – is essential to the success of this effort.
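The appeal of combinatorial chemistry is largely one of scale: a handful of building blocks, combined systematically, yields a much larger library of candidate structures to screen. The toy sketch below is purely illustrative; the fragment names are hypothetical placeholders, not real chemical groups, and real combinatorial libraries involve synthesis and screening steps this does not model.

```python
from itertools import product

# Purely illustrative: three hypothetical sets of chemical "building blocks".
# Real combinatorial libraries combine actual reactive fragments and then
# screen the resulting compounds against a biological target.
scaffolds = ["core-A", "core-B", "core-C"]
linkers = ["L1", "L2"]
side_chains = ["R1", "R2", "R3", "R4", "R5"]

# Systematically combining the sets enumerates every candidate structure.
library = [f"{s}+{l}+{r}" for s, l, r in product(scaffolds, linkers, side_chains)]

print(len(library))   # 3 * 2 * 5 = 30 candidates from only 10 fragments
print(library[:3])    # ['core-A+L1+R1', 'core-A+L1+R2', 'core-A+L1+R3']
```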
Back to Basics
Unfortunately, knowledge of the fundamental biology of the malaria parasite has lagged behind that of many other organisms, largely because of a lack of funding for basic research in this area. Computer-aided drug design or combinatorial chemistry cannot be applied in a vacuum: researchers need to identify the biological processes or specific enzymes that can serve as targets for new drugs or vaccines. To take advantage of these techniques, we need to learn more about the genetic makeup of the organism, its growth and development, and its relationships with the mosquitoes that carry it, the human host, and the environment. For example, we now know that the malaria parasite digests hemoglobin as its source of amino acids. Studying ways to prevent the digestion of hemoglobin could lead to new drugs.
Genetics research promises new insight into mechanisms by which the malaria parasite operates and might be defeated. For instance, techniques for producing parasites that contain a specific modification in a single gene will allow scientists to determine whether specific genes are associated with virulence, drug resistance, or enhanced transmission. This promises to usher in a period of rapid growth in our knowledge of the fundamental processes in the parasite.