Transfusion-Transmitted Malaria in Sub-Saharan Africa
Screening for Transfusion-Transmitted Malaria
The ability to screen blood donations, as well as donors, can significantly decrease the risk of TTM. Laboratory screening remains the main option for reducing transfusion malaria. There are four specific targets for donation screening: intracellular parasites, plasmodial antibodies, plasmodial antigens, and plasmodial DNA.
In routine practice, the "gold-standard" technique, optical microscopy of thick blood smears, is the one most often used for Plasmodium detection in malaria-endemic areas. It is considered the most effective and inexpensive method for the diagnosis of malaria. Its sensitivity varies with the expertise of the microscopist: in experienced hands, detection limits of 5-50 parasites/μL can be achieved, but in routine practice most laboratories achieve only around 500 parasites/μL. Further, a single parasite identified by microscopic evaluation of a thick blood film (prepared from approximately 4 μL of blood) corresponds to roughly 10^5 parasites in a 450 mL unit of blood. Despite their continued application as key diagnostic tests, microscopy techniques have major limitations that render them inappropriate for universal or targeted donor screening. They lack the sensitivity and specificity required to detect all infected units, particularly at low parasite densities, and thus leave a real transfusion risk for the recipient. In addition, they are time-consuming (generally requiring an hour or more for preparation and detailed examination), are unsuited to examining large numbers of samples, and require considerable expertise, as well as specialized equipment when fluorescent methods are used. This hinders rapid evaluation, particularly in a blood transfusion service. Finally, post-transfusion malaria cases have been reported in recipients of blood that had tested negative by microscopy. Microscopic detection of malaria parasites is therefore likely to significantly underestimate the prevalence of parasitaemia in blood donations, and is not sensitive enough to be recommended as the screening test for transfusion services in SSA malaria-endemic settings.
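The scale of this problem is easy to see with some simple arithmetic. The short Python sketch below (illustrative only, using the detection-limit figures quoted above) converts per-microlitre parasite densities into total parasites per 450 mL donated unit:

```python
# Illustrative arithmetic only: how per-microlitre detection limits translate
# into parasite numbers in a whole 450 mL donated unit.
UNIT_VOLUME_UL = 450 * 1000  # 450 mL expressed in microlitres

def parasites_per_unit(parasites_per_ul: float) -> float:
    """Total parasites in a 450 mL unit at a given parasite density."""
    return parasites_per_ul * UNIT_VOLUME_UL

# Expert-level thick-film detection limit: 5 parasites/uL
print(parasites_per_unit(5))        # 2,250,000 parasites per unit
# Routine-laboratory detection limit: 500 parasites/uL
print(parasites_per_unit(500))      # 225,000,000 parasites per unit
# Density said to be capable of causing TTM: ~0.00004 parasites/uL
print(parasites_per_unit(0.00004))  # ~18 parasites per unit
```

A unit can therefore transmit malaria while carrying several orders of magnitude fewer parasites than even an expert microscopist can detect.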
Alternative methods have been developed for malaria screening, for use both in areas where malaria is endemic and in areas where it is not, detecting specific Plasmodium antigens or antibodies directed against the parasite. The detection of malarial antigen with rapid diagnostic tests (RDTs) was originally intended as a more rapid and objective alternative to direct microscopy. RDTs detect Plasmodium-specific parasite proteins, such as pan-malarial lactate dehydrogenase (pLDH) and P. falciparum-specific histidine-rich protein 2 (HRP2). Most of these assays come in a 'dipstick' format that can be used with minimal training, is field applicable, and provides a result within 10-20 minutes. However, RDT methods have not offered improved sensitivity over microscopy, and their sensitivity decreases as parasitaemia falls below 100 parasites/μL. False positives are also observed, especially after treatment, as the parasite antigens detected can remain in the circulation following parasite clearance; this is especially the case for HRP2 antigen-based tests. Finally, current RDTs are either specific to P. falciparum or cannot distinguish between the parasite species present. Heutmekers et al., using the CareStart pLDH RDT, showed for instance that overall sensitivity was good for P. falciparum and P. vivax but poor for P. ovale and P. malariae. To increase the likelihood of detecting all Plasmodium species, a combined HRP2/pLDH-based RDT has been proposed. Heutmekers et al. also showed that false-negative results mainly occurred at parasite densities below 100/μL. Atchade et al. claimed that a pLDH-based RDT can achieve a detection threshold of 1 parasite/μL, lower than that of the other methods except PCR, and that, unlike with HRP2-based tests, false positives are exceptional with pLDH-based RDTs.
Based on these findings, pLDH antigen detection for Plasmodium species could be an interesting tool for blood donation qualification to ensure blood safety in malaria-endemic areas. It has nonetheless been noted that season influences pLDH prevalence. Moreover, when donors had self-treated before donation, with drugs or herbal teas, malaria infection was masked and pLDH detection failed.
Following infection with Plasmodium species, the immune response results in the formation of specific antibodies that are not necessarily protective, nor necessarily an indication that the person is still harbouring malaria parasites. Antibody detection assays demonstrate high antibody levels and good sensitivity in semi-immune individuals, the very donors who are potentially at high risk of acting as a source of TTM by being asymptomatic but parasitaemic. Comparing the IFAT assay with the DiaMed ELISA malaria antibody test, Elghouzzi et al. showed that the latter was more sensitive and specific than the former and could be automated, thereby fulfilling the criteria of a satisfactory and reliable malaria screening test. However, a negative malarial antibody test cannot guarantee that a donor is not infected, as antibodies may not be detectable in the first few days of malarial illness, and infection with P. ovale and P. malariae may not be detected by assays based on P. falciparum and P. vivax antigens. Given that malaria parasites can persist in certain patients for years, it is worth noting that in individuals who have suffered repeated attacks of malaria, anti-malarial immunoglobulins may remain detectable for several years. Even though the persistence of antibodies long after cure would lead some individuals who are no longer parasitaemic to be excluded as potential blood donors, excluding malaria-antibody-positive candidates does provide a useful margin of safety. Malaria antibody tests are therefore useful in non-endemic areas, where a positive result leads to rejection of the donation, but they are of no use in malaria-endemic areas, where antibody prevalence is very high: 87% in Benin and 65.3% in Senegal.
This would lead to a high rate of blood donor deferral as most of the populations in endemic zones harbour anti-malarial immunoglobulins.
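A small, purely illustrative calculation using the seroprevalence figures cited above shows how severely antibody-based exclusion would shrink the donor pool:

```python
# Illustrative only: projected donor retention if all antibody-positive
# candidates were deferred, at the seroprevalences cited in the text.
def donors_retained(candidates: int, antibody_prevalence: float) -> int:
    """Candidate donors remaining after deferring all antibody-positives."""
    return round(candidates * (1 - antibody_prevalence))

print(donors_retained(1000, 0.87))    # Benin: only 130 of 1000 candidates kept
print(donors_retained(1000, 0.6533))  # Senegal: about 347 of 1000 kept
```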
Methods based on molecular biology have been used to detect the different Plasmodium species by PCR, notably nested PCR. This technique is based on amplification of a fragment of the parasite's small-subunit ribosomal RNA gene and has been used for the diagnosis of malaria for research purposes and in reference laboratories. PCR can detect parasites below the threshold levels of microscopy: when performed under optimal conditions, it can detect parasitaemia as low as 0.004 to 1 parasite/μL of blood. However, the result depends directly on the quality of the parasite DNA obtained during extraction and amplification, and on the quality of the reagents. Furthermore, the test is very expensive and requires extensive training and a long analysis time, thereby restricting its usage as a routine diagnostic test for malaria in SSA laboratories or blood banks. Real-time PCR is currently considered the best molecular-biology technique for the diagnosis of malaria. It avoids ambiguous results because it does not require agarose gels, minimizes manual work, reduces pipetting errors, performs well at high throughput, and provides quantitative results of parasite density. Batista-dos-Santos et al. described real-time PCR as a necessary, appropriate and relatively inexpensive method, with higher sensitivity and specificity than the methods described above, which could be adopted as part of laboratory screening in haemotherapy centres, especially in malaria-endemic areas. Nevertheless, owing to its prohibitive cost and to the scarcity of the necessary infrastructure in SSA malaria-endemic zones, almost all of which are resource-limited settings, PCR is not currently, nor in the foreseeable future, a viable option for the screening of blood donations.
PCR is probably best used in a stepwise fashion, when other testing modalities are non-diagnostic but the index of suspicion for malaria is high.
A more recently available technique, based on the detection of haemozoin pigment in white blood cells by automated haematology cell counters, has been described as a convenient, less costly and objective malaria screening method. According to the review by Campuzano-Zuluaga et al., the accuracy of malaria diagnosis by automated haematology analysis may vary with species, parasite load, immunity and the clinical context in which the method is applied. Its overall sensitivity ranges from 48.6% to 100%, and its specificity from 25.3% to 100%. Sensitivity has been shown to fall to 50% at parasitaemias below 0.1%. A further factor tempering the use of automated haematology analysis is that laboratory staff must receive appropriate and continuous training to recognize malaria-related changes when validating complete blood count results.
Apart from laboratory screening, donor questioning has also been proposed as a tool for malaria screening to lessen the risk of TTM. It aims at deferring all potential blood donors who have experienced a febrile episode within the three months preceding donation. But this strategy cannot eliminate asymptomatic but parasitaemic blood donors. To be useful, medical selection through a donor questionnaire must be integrated into an algorithm that includes other screening tools.
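As a sketch only, and not a validated protocol, such an algorithm combining the questionnaire with a laboratory test might be structured as follows; the `Donor` fields and the choice of a pLDH-based RDT as the laboratory step are assumptions for illustration:

```python
# Hypothetical sketch of a combined screening algorithm (not a validated
# protocol): the questionnaire defers recently febrile donors, and a
# laboratory test then catches asymptomatic but parasitaemic donors.
from dataclasses import dataclass

@dataclass
class Donor:
    fever_in_last_3_months: bool   # answer from the donor questionnaire
    pldh_rdt_positive: bool        # assumed pLDH-based RDT result

def screen(donor: Donor) -> str:
    # Step 1: questionnaire-based deferral.
    if donor.fever_in_last_3_months:
        return "defer: recent febrile episode"
    # Step 2: laboratory test, which the questionnaire alone cannot replace.
    if donor.pldh_rdt_positive:
        return "defer: pLDH antigen detected"
    return "accept"

# An asymptomatic but parasitaemic donor is missed by step 1 but caught by step 2.
print(screen(Donor(fever_in_last_3_months=False, pldh_rdt_positive=True)))
```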
Overall, a number of factors need to be considered in selecting the most appropriate assays. In general, a balance has to be found between screening needs and the resources available, including finances, staff and their level of expertise, equipment, consumables and disposables. Each screening system has advantages and limitations that should be taken into account when selecting assays. Limitations include: (i) the length of time following infection before the screening test becomes reactive (the window period); (ii) the rate of biological false positives, which may result in wastage of donations and unnecessary deferral of donors; and (iii) the complexity of some systems, which require automation.
According to WHO guidelines, the evaluated sensitivity and specificity of all assays used for blood screening should be as high as possible, and preferably not less than 99.5%. However, as discussed above, the sensitivity of the methods currently most used for malaria detection in SSA blood units falls far short of what is needed to detect levels of parasitaemia capable of causing TTM (approximately 0.00004 parasites/μL, or 1-10 parasites per unit of blood). Indeed, Owusu-Ofori et al., after performing thick blood films, RDTs, enzyme immunoassays and real-time PCR for malaria screening of blood donor units, concluded that none of these four tests would be ideal for the prevention of TTM in African blood banks, as they were either insufficiently sensitive or too sensitive for the detection of malaria parasites in blood donor units.
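The mismatch can be quantified with a simple, purely illustrative calculation comparing the detection limits quoted in this review with the TTM-relevant parasitaemia threshold:

```python
# Illustrative arithmetic: how far each method's detection limit sits above
# the parasitaemia said to be capable of causing TTM. Figures are those
# quoted in the text, not new measurements.
TTM_THRESHOLD = 0.00004  # parasites/uL

detection_limits = {
    "thick film (expert)": 5.0,
    "thick film (routine)": 500.0,
    "RDT (reliable range)": 100.0,
    "PCR (optimal)": 0.004,
}

for method, limit in detection_limits.items():
    fold = limit / TTM_THRESHOLD
    print(f"{method}: {fold:,.0f}x above the TTM-relevant threshold")
```

Even PCR under optimal conditions sits two orders of magnitude above the threshold, which is consistent with the conclusion of Owusu-Ofori et al. that no current test is ideal for this purpose.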