KEMRI-Wellcome Trust researchers have unveiled the largest repository of malaria survey data in Africa: more than 50,000 surveys spanning 115 years, beginning in 1900. Each survey is documented by date, geolocation, and the number of people examined, and together they cover 7.8 million blood samples from more than 30,000 locations in 43 countries, each recording the proportion positive for Plasmodium falciparum infection. The paper is published in Nature.
The researchers analysed the data to estimate malaria infection prevalence for each of 520 administrative units across sub-Saharan African countries and Madagascar, over 16 time periods from 1900 through 2010-2015.
Lead Author Professor Bob Snow says:
“People often focus on recent history in tracking malaria in Africa, to inform donors and control programmes on recent actions. The longer history of malaria in Africa allows us to put into context the recent decline.”
Co-Author Abdisalan Noor adds:
“Shown in context, the cycles and trend over the past 115 years are inconsistent with explanations in terms of climate or deliberate intervention alone. The role of socio-economic development, for example, remains poorly understood.”
The study reveals that the biggest historical drops in malaria followed the Second World War, with the discovery of DDT and chloroquine, and then after 2005, with the roll-out of insecticide-treated bed nets and new drugs to treat malaria.
Malaria prevalence was low during the late 1960s, through the 1970s and into the early 1980s. This was a period when, despite the international community having abandoned investment in malaria control in Africa, chloroquine use was widespread, with repeated dosing available to the general population. Together with drought across the Sahel, this produced a perfect lull in malaria transmission.
Chloroquine resistance expanded across Africa in the 1980s, and in the late 1990s unprecedented rainfall led to flooding and major malaria epidemics. Ministries of Health across the continent woke up to a perfect storm without any significant mosquito vector control in place. Malaria prevalence returned to the levels seen before the Second World War.
It took a further five years for the international community to provide free insecticide-treated bed nets and effective malaria treatments. The financial response by the Global Fund and the technical revisions to policy by the World Health Organization after 2005 led to one of the largest drops in malaria infection prevalence witnessed since 1900.
The findings urge caution in projecting the future of malaria in Africa. The current prevalence of infection, 24%, is the lowest in 115 years, but gains have stalled since 2010, and 240 million infected individuals remain a substantial burden. Little has changed in the high-transmission belt across West and Central Africa. Emerging insecticide and drug resistance and growing international ambivalence towards funding control further increase the risk.
“The history of malaria risk in Africa is complex. There have been perfect lulls, when drugs worked and droughts prevented mosquitoes from transmitting infection; there have been perfect storms, when drugs stopped working and flooding affected large parts of Africa. It has been a history of long-term cycles, and predicting the future of malaria in Africa based on climate or intervention coverage alone is difficult,” says Snow.
The researchers call for new tools for the poor, high-malaria-burden areas of Africa alongside the focus on eliminating malaria in the low-burden margins of Southern Africa and on small islands across the world; otherwise there is a risk that high-burden countries in Africa will be ignored and left behind. The 115-year history shows that malaria in Africa is complex, and predicting its future based on climate or economic development alone would be foolhardy.
The research was led by Professor Bob Snow and Abdisalan Noor of the KEMRI-Wellcome Trust Research Programme and the University of Oxford.
Snow, Sartorius et al. The prevalence of Plasmodium falciparum in sub-Saharan Africa since 1900. Nature. 2017 Oct 11. doi:10.1038/nature24059. [Epub ahead of print]