Efficacy of Blood Donor Selection: Comparing Sero-Prevalence of Transfusion-Transmissible Infections Among Eligible and High-Risk Behavior Deferred Donors in Iran

2020 ◽  
Vol 20 (11) ◽  
Author(s):  
Sara Riyahi ◽  
Sedigheh Amini-Kafiabad ◽  
Daryoush Minai Tehrani ◽  
Mahtab Maghsudlu ◽  
Seyed Moayed Alavian

Background: Eliminating high-risk individuals plays a special role in ensuring blood safety. Owing to epidemiological, demographic, and even cultural changes in each country, this process should be continuously evaluated and, if necessary, revised. Objectives: This study aimed to evaluate the impact of the current donor selection procedure on blood safety in Iran. Methods: A total of 2,525 high-risk deferred donors referred between 2018 and 2019 were evaluated for hepatitis B surface antigen, hepatitis C virus antibody, and human immunodeficiency virus antigen and antibody. All repeatedly reactive samples were evaluated by confirmatory tests. Demographic parameters, donor status, and TTI marker rates of the participants were compared with those of 1,315,871 eligible donors from the same period, obtained from the national database on blood donors. Data were analyzed using SPSS version 24.0. Results: The prevalence of HBV, HCV, and HIV per 100,000 deferred donors was 1148, 515, and 119, respectively; these rates were 26, 28, and 33 times higher, respectively, than among eligible donors. With the exception of HBV, prevalence among males in the deferred group was almost twice that among females. In the eligible group, females had a higher prevalence of HBV and HCV than males. HCV and HBV prevalence among deferred first-time donors was significantly higher (6.7- and 4.3-fold, respectively) than among eligible first-time donors (P-value < 0.001). Notably, the higher the educational level, the lower the prevalence of infection in both groups. Conclusions: The current deferral criteria and donor selection procedure in Iran provide an effective means of eliminating high-risk individuals from the blood donation pool.
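The per-100,000 prevalence figures and fold-differences above follow from simple arithmetic; the sketch below illustrates the calculation. The case count (29) and the eligible-donor rate (44 per 100,000) are back-calculated assumptions for illustration, not figures from the study's raw data.

```python
# Illustrative sketch only (not the study's code): prevalence per
# 100,000 donors and the deferred-vs-eligible prevalence ratio.
# The inputs below are assumed round numbers, not raw study data.

def prevalence_per_100k(cases: int, donors: int) -> float:
    """Prevalence expressed per 100,000 donors."""
    return cases / donors * 100_000

def prevalence_ratio(deferred_rate: float, eligible_rate: float) -> float:
    """How many times higher the deferred-donor rate is."""
    return deferred_rate / eligible_rate

deferred_hbv = prevalence_per_100k(29, 2_525)   # ~1148.5 per 100,000
ratio = prevalence_ratio(deferred_hbv, 44.0)    # ~26-fold difference
print(f"{deferred_hbv:.0f} per 100,000; {ratio:.0f}x the eligible rate")
```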

Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 1124-1124
Author(s):  
Amanda Skulte ◽  
Arindam Mitra ◽  
Simon Thomas ◽  
Matthew Cobb ◽  
Alka Stansfield ◽  
...  

Abstract Objective Investigator-led Phase I/II studies have shown that adoptive transfer of donor virus-specific cytotoxic T lymphocytes (CTL) can reconstitute immunity on a long-lasting basis in patients at risk for viral reactivation following allogeneic hematopoietic stem cell transplantation (allo HSCT). In an effort to provide a consistently high-purity product, it is critical to detect patterns in donor material that may predict final product quality and, consequently, CMV protection. Methods We performed two prospective studies of cytomegalovirus (CMV)-specific CTL in the UK, using a proprietary technology for direct selection of CTL based on MHC class I-multimer (Streptamer®) binding (STAGE Cell Therapeutics GmbH). Eligible donors were pre-screened for the presence of selectable CMV-specific cells by immunophenotyping with flow cytometry. Eligible HLA alleles for this selection procedure are A*0101, A*0201 (two epitopes, IE-1 and PP65), A*2402, B*0702 and B*0801. In our experience, a minimum threshold of 0.1% CD8+Streptamer+ cells of total CD3+ T cells ensures a successful selection. The table summarizes the manufacturing data from 39 products, showing that despite a large variety of donor material, these cells can be significantly enriched to produce a viable, CMV-specific T cell product.

Table (n=39):
Parameter                                           Mean       Range
% CMV+ cells in donor starting material (of CD8+)   1.63%      0.12 – 13.3%
% CMV+ cells in final product (of CD3+)             78.59%     4.48 – 99.56%
% CD3+ cells in final product                       62.16%     10.20 – 98.51%
% viability in final product                        94.41%     76.86 – 98.88%
T cell dose/kg infused                              2.45x10^4  7.23x10^2 – 5.35x10^4
Total number of T cells infused                     1.79x10^6  5.63x10^4 – 4.20x10^6

Results Pre-screen results have been analyzed from 56 eligible donors. Of these donors, 40 were screened for more than one eligible HLA type. In the presence of more than one eligible allele, selection was based on the most abundant population of CD8+Streptamer+ cells.
The majority of donors screened were confirmed as eligible HLA type A02 (33/56), closely followed by A01 and B07 (both 22/56). However, the alleles found to be the most dominant for Streptamer binding (giving the highest percentage of positive cells in these pre-screens) were A01 and B07 (both 11/20). Thus, while A02 is a more frequent allele in the studied donor population, it is often subdominant in the presence of another Streptamer-eligible allele. Where A01 is present in combination with one other allele, A01 is more frequently dominant. We also investigated whether the allele chosen for the selection procedure was predictive of the purity (CD3+Streptamer+) of the final CMV CTL product. We examined 39 selections, the majority of which (22/39) were performed on A02-PP65. The selections produced a variable product with a purity range of 4.48% - 99.56% (SD 30.9%). Our data suggest that A01 and B07 are more indicative of a high-purity CMV selection; however, this difference did not reach statistical significance when compared with A02-PP65 (A01, p-value=0.34; B07, p-value=0.15). Other factors were also observed to influence product purity: the percentage of CD3+ cells and the total number of Streptamer-positive cells in the starting material were key variables for predicting the purity of the selection. Pre-screen results were not found to have a direct correlation with product purity. We investigated whether a successful pre-screen can be predicted by the presence of a single HLA type, regardless of other alleles present. A donor is least likely to have a selectable population for A24 or A02-IE-1. A01 (median pre-screen 0.61%) and B07 (median pre-screen 0.53%) donors are more likely to have a positive pre-screen compared with the other alleles. Conclusion Cell Medica is developing a model for centralized production and delivery of CMV-specific CTL for transplant centers across Europe.
Interrogation of historical data demonstrates important patterns for donor selection and effective pre-screen methods to ensure a consistent, high-quality cell product. Further investigation is required to determine the impact of donor selection and product purity on CMV protection in patients. In addition, functional analysis of the specific cells may also be an important factor to consider in the donor and selectable-epitope screening process. Disclosures Skulte: Cell Medica: Employment. Mitra: Cell Medica: Employment. Thomas: Cell Medica Ltd: Employment. Cobb: Cell Medica Ltd: Employment. Stansfield: Cell Medica: Employment. Newton: Cell Medica Ltd: Employment.


Blood ◽  
2005 ◽  
Vol 106 (11) ◽  
pp. 871-871 ◽  
Author(s):  
Carmelo Rizzari ◽  
Maria Grazia Valsecchi ◽  
Paola De Lorenzo ◽  
Maurizio Aricò ◽  
Giuseppe Basso ◽  
...  

Abstract Introduction: Cure rates of ALL in children aged less than one year (i.e., infants) at diagnosis are in the range of 35–40%. Encouraging results have recently been reported in infants using intensified treatment, including high-dose chemotherapy, with or without allogeneic hematopoietic stem cell transplantation (HSCT) in first complete remission (CR). Aim: To evaluate the impact of the two treatment strategies adopted in the AIEOP ALL 91 and 95 studies on the outcome of ALL in infants. Patients and Methods: Fifty-two infants with ALL were enrolled between 1991 and 1999 in two consecutive studies, named AIEOP ALL 91 and ALL 95. Infants with an identified t(4;11) translocation had to be included in the high-risk (HR) groups, whilst those without this genetic abnormality could be treated in the intermediate-risk (IR) or HR groups according to presenting features and treatment response. Patients belonging to the IR groups received a traditional BFM backbone-based treatment (protocols I, M and II), while those classified in the HR groups underwent an intensified treatment including induction (BFM protocol IA only, in study AIEOP ALL 91, and IA+IB in study ALL 95) and consolidation with either 9 blocks of non-cross-resistant drugs (ALL 91) or 3 blocks followed by the 8-drug reinduction regimen - BFM protocol II - repeated twice (ALL 95). All patients were given a continuation phase (reinforced in HR patients of study ALL 95 by vincristine/prednisone pulses). Overall treatment duration was 2 years in both studies. Results: Infants in studies ALL 91 (n=21) and ALL 95 (n=31) had similar biological and clinical characteristics. The overall event-free survival (EFS) at 5 years was 45.0% (SE 7.0%). The EFS, after censoring for HSCT in 1st CR, was 38.1% (SE 11.4%) in ALL 91 and 51.6% (SE 9.9%) in ALL 95 (p-value=0.29). Patients treated in the IR arm of the two studies had a similar outcome.
Better results were obtained in patients treated in the HR arm of ALL 95 study, where 9/17 chemotherapy-only patients and 3/4 HSCT patients are alive in CCR as compared to 1/7 and 0/2, respectively, in patients treated in the ALL 91 study. Discussion: These data show that full traditional BFM therapy intensified by 3 post-induction chemotherapy blocks and double protocol II (adopted in study ALL 95), is associated with a better outcome in infants with HR ALL.


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 3079-3079 ◽  
Author(s):  
Constantine S. Tam ◽  
Michael J. Keating ◽  
Apostolia M. Tsimberidou ◽  
Susan O’Brien ◽  
Alessandra Tsimberidou ◽  
...  

Abstract In order to develop integrated models utilizing commonly available prognostic factors, we studied the clinical significance of IGVH mutation, CD38 and ZAP-70 in 477 CLL patients (pts) with low-risk (non-11q, non-17p) FISH findings. All pts were untreated at the time of FISH assessment, and were collected prospectively in the MD Anderson CLL database. Two hundred and fifteen pts (45%) had mono- (n=160) or bi-allelic (n=55) deletion of 13q {DEL13Q}, 162 pts (34%) had a negative FISH panel {NEG}, and 100 pts (21%) had trisomy 12 as the sole FISH abnormality (n=78) or in association with deletion 13q (n=22) {T12}. Compared to other FISH groups, DEL13Q pts had lower B2m (median 2.2 v 2.6mg/L, p=0.01) and were less likely to be IGVH unmutated (33% v 48%, p=0.001). In contrast, T12 pts were more likely to present with advanced-stage disease (Rai≥2 36% v 23%, p=0.01), be CD38 positive (44% v 13%, p<0.001), and have karyotypic abnormalities (48% v 7%, p<0.001). One hundred and twenty-three pts had active disease requiring immediate therapy and 354 pts had stable disease, of whom 291 were evaluable for disease progression. At a median follow-up of 20 months, 73 pts had developed active disease with NCI-WG indication(s) for treatment. The actuarial 2-year time to treatment (TTT) was 26%, with no significant difference between DEL13Q, NEG and T12 pts (p=0.27). TTT was associated with elevated B2m (≥1.5ULN), IGVH mutation status and ZAP-70 in DEL13Q and NEG pts, but not in T12 patients (Table). For DEL13Q/NEG pts, a simple model using IGVH mutation and B2m separated high-risk pts (unmutated or high B2m, 2yr TTT 43%) from standard-risk pts (mutated and low B2m, 2yr TTT 11%, p<0.0001). For T12 pts, a model based on CD38 positivity and karyotypic abnormalities separated high-risk pts (2 factors, 2yr TTT 75%) from standard-risk pts (0 or 1 factor, 2yr TTT 15%, p=0.008).
These results show that the impact of prognostic factors on TTT is dependent on the underlying FISH karyotype, and underscore the need for future studies of CLL prognostic factors to take into account the complete risk profile of the pt.

Table. Association of prognostic factors with TTT by FISH group (p-value / hazard ratio):
Factor               NEGATIVE FISH    DELETION 13Q    TRISOMY 12
IGVH Mutation        <0.001 / 8.0     0.003 / 2.9     0.97 / 0.98
B2m ≥1.5ULN          <0.001 / 4.5     0.07 / 2.2      0.54 / 0.68
CD38 Positivity      0.05 / 2.5       0.05 / 2.4      0.06 / 7.4
Abn Cytogenetics     <0.001 / 11.0    0.27 / 2.2      0.09 / 2.8
ZAP-70               0.02 / 2.9       0.007 / 3.1     0.70 / 1.3
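The two FISH-specific risk models described above (IGVH/B2m for DEL13Q and NEG patients; CD38/karyotype for trisomy 12) can be sketched as a small classifier. This is an illustration only; the argument names are assumptions, not the study's actual data dictionary.

```python
# Illustrative sketch of the two FISH-specific CLL risk models
# described above; field names are assumed for this example.

def cll_risk_group(fish: str, ighv_unmutated: bool, high_b2m: bool,
                   cd38_positive: bool, abnormal_karyotype: bool) -> str:
    """Return 'high' or 'standard' risk for low-risk-FISH CLL patients."""
    if fish in ("DEL13Q", "NEG"):
        # Unmutated IGVH or B2m >= 1.5x ULN -> high risk (2yr TTT 43% vs 11%)
        return "high" if (ighv_unmutated or high_b2m) else "standard"
    if fish == "T12":
        # Both CD38+ and abnormal karyotype -> high risk (2yr TTT 75% vs 15%)
        return "high" if (cd38_positive and abnormal_karyotype) else "standard"
    raise ValueError("models cover the DEL13Q, NEG, and T12 groups only")

print(cll_risk_group("NEG", ighv_unmutated=False, high_b2m=True,
                     cd38_positive=False, abnormal_karyotype=False))  # high
```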


Blood ◽  
2017 ◽  
Vol 130 (Suppl_1) ◽  
pp. 848-848
Author(s):  
Bronwen E. Shaw ◽  
Brent R. Logan ◽  
Stephen R. Spellman ◽  
Steven GE Marsh ◽  
James Robinson ◽  
...  

Abstract Background. There is no hierarchical algorithm that weights the characteristics of individual donors against each other in a quantitative manner to facilitate donor selection when multiple potential equally HLA-matched unrelated donors (URD) are available. Donor factors, such as age, sex, CMV status, ABO type, and matching of secondary HLA loci (DQB1, DPB1), have been associated with recipient survival in URD hematopoietic cell transplantation (HCT), although the impact of specific factors has varied among studies. The goal of this study was to develop and validate a donor selection score that prioritizes donor characteristics associated with better survival in 8/8 HLA-matched URD transplantation. Methods. Two large CIBMTR patient datasets were studied: HCT from 1999-2011 (n=5952) and 2012-2014 (n=4510). Patients were adults (>18), transplanted for acute myelogenous leukemia (AML), acute lymphocytic leukemia (ALL), chronic myelogenous leukemia (CML), or myelodysplastic syndrome (MDS). Each dataset was randomly split for the analysis. Cohort 1 (c1): 2/3 (n=3969) for modeling/score development (training) and 1/3 (n=1983) for testing, and similarly for cohort 2 (c2): 2/3 (n=3051) and 1/3 (n=1459). Thus, two independent models were built and tested, adjusting for significant patient characteristics associated with survival. Interactions between donor characteristics, and between donor and recipient characteristics, were tested. The following donor characteristics were considered for the donor score: HLA-DQB1 matching, HLA-DPB1 matching (using the T-cell epitope matching categorization), age, sex matching, parity, CMV matching, ABO matching and race matching. Results.
In the final survival model (training set from 1999-2011, c1) we found significant negative associations with survival for three donor risk factors: non-permissive DPB1 matching (HR 1.13; 95% CI 1.01, 1.26; p-value=0.032), older donor age (as a linear effect, HR 1.07 per decade increase in age; 95% CI 1.02, 1.12, p-value=0.004), and CMV mismatching for CMV+ recipients (HR 1.14; 95% CI 1.02, 1.27; p-value=0.022). For CMV- recipients, a CMV+ donor was not significantly associated with an increase in mortality (HR=1.03; 95% CI 0.89-1.20; p-value=0.68), so this was not included in the score. ABO mismatching (any type: major, minor or bidirectional) was associated with mortality in initial modelling, but the effect was not present in more recent transplants (HR for ABO mismatch among patients transplanted since 2007: 1.04; 95% CI 0.91-1.19; p-value=0.638), so it was not included in the final model and donor score. Based on these results, a donor risk score was constructed; however, this score was not validated in the testing set (c1), nor were any of the individual component donor factors significantly associated with worse overall survival. In the second cohort (c2), only donor age was significantly associated with worse survival, and it was validated in the independent test set from c2. Since donor age was significant in 3 of the 4 cohorts, we quantified the impact of donor age in the validation set of the most recent cohort, c2. We found that choosing a donor 2, 5, 10 or 20 years older was associated with a 1%, 2%, 3% or 7% decrease in 2-year OS, adjusted for patient characteristics. Conclusion. Despite data on over 10,000 URD transplants, we were unable to develop a valid donor selection score. The only donor characteristic associated with better survival was younger age, with 2-year survival being 3% better when a donor is 10 years younger. We did not test other endpoints; it is possible that separate scores could be generated to predict the risk of other outcomes (e.g.
graft failure, graft-versus-host disease); however, unless the adverse donor characteristics are identical for these outcomes, centers will still have to prioritize the various donor characteristics to select from a pool of potential donors. This large dataset shows that none of the other easily available donor clinical and genetic factors tested were reproducibly associated with survival; hence, flexibility in selecting URD based on these characteristics is justified. These data support a simplified URD selection process and have significant implications for URD registries. Disclosures Porter: Incyte: Honoraria; Genentech/Roche: Employment, Other: Family member employment, stock ownership - family member; Servier: Honoraria, Other: Travel reimbursement; Novartis: Honoraria, Patents & Royalties, Research Funding; Immunovative Therapies: Other: Member DSMB. Lee: Amgen: Other: One-time advisory board member; Bristol-Myers Squibb: Other: One-time advisory board member; Mallinckrodt: Honoraria; Kadmon: Other: One-time advisory board member.
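The per-decade hazard ratio reported above can be translated into an approximate absolute survival difference under the proportional hazards assumption, S(t | HR) = S0(t)^HR. A minimal sketch: only the HR of 1.07 per decade comes from the abstract; the 50% baseline 2-year OS is an assumed reference value for illustration.

```python
# Hedged illustration: converting a per-decade hazard ratio into an
# absolute 2-year overall-survival difference under proportional
# hazards. The baseline survival (0.50) is an assumption, not a
# figure from the study.

def survival_under_hr(baseline_survival: float, hazard_ratio: float) -> float:
    """Proportional hazards: S(t | HR) = S0(t) ** HR."""
    return baseline_survival ** hazard_ratio

s0 = 0.50                                   # assumed 2-year OS, reference donor
s_decade_older = survival_under_hr(s0, 1.07)  # donor 10 years older
print(f"absolute 2-year OS difference: {(s0 - s_decade_older) * 100:.1f}%")
```

With these assumed inputs the difference comes out near the ~3% per decade the study reports, but the exact value depends on the baseline survival chosen.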


Author(s):  
Vijayalakshmi Kuttath ◽  
Harikumaran Nair ◽  
Muraleedharan Nair

Introduction: A crucial component of the effort to meet the growing demand for blood is the recruitment and retention of young novice blood donors. Reducing postdonation syncopal reactions could have a beneficial impact on donor convenience, safety, and desire to donate again. Aim: To evaluate the effectiveness of predonation hydration over standard blood donation in preventing or decreasing the severity of postdonation Vasovagal Reactions (VVR) in hydrated blood donors in comparison with a non-hydrated group. Materials and Methods: This randomised controlled trial was conducted on 953 first-time voluntary blood donors. Donors in the intervention arm drank 250 mL of water 30 minutes before blood donation, while those in the control group did not receive any intervention. Blood was collected by standard protocol. The outcome, VVR, if present, was graded as mild, moderate, or severe. Analysis of results was done using Statistical Package for the Social Sciences (SPSS) version 16.0. A sensitivity analysis was also done to account for dropouts from the study. Results: A total of 900 participants were included in the study, of whom 443 were controls and 457 were cases. An effect size of 6.1%, a Relative Risk (RR) of 0.54 (95% Confidence Interval (CI)=0.36-0.81) and a risk reduction of 45% were arrived at, pointing to a protective role for predonation hydration in preventing VVR. There was a significant reduction in the severity of VVR in the predonation hydration group compared to the standard blood donation group (p-value=0.002). The protective effect of hydration on decreasing the occurrence and severity of VVR had statistical support in males in the moderate and severe grades (p-value=0.017). A similar statistical significance was not established in females (p-value=0.173). Sensitivity analysis did not reveal a difference in the statistical significance of variables between the compared groups.
Conclusion: Predonation hydration was found to be effective in preventing and decreasing the severity of VVR in novice blood donors.
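The relative risk and its confidence interval reported above are computed from a 2x2 table; a sketch using the standard log-RR normal (Wald) approximation follows. The event counts here are hypothetical round numbers chosen for illustration; the trial itself reported RR 0.54 (95% CI 0.36-0.81).

```python
# Sketch (not the trial's analysis code) of relative risk with a 95% CI
# via the log-RR normal approximation. Counts below are hypothetical.
import math

def relative_risk_ci(events_exp, n_exp, events_ctl, n_ctl):
    """RR of exposed vs control with a Wald 95% CI on the log scale."""
    rr = (events_exp / n_exp) / (events_ctl / n_ctl)
    se = math.sqrt(1 / events_exp - 1 / n_exp + 1 / events_ctl - 1 / n_ctl)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Hypothetical: 33 VVR among 457 hydrated donors, 61 among 443 controls
rr, lo, hi = relative_risk_ci(33, 457, 61, 443)
print(f"RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), risk reduction {1 - rr:.0%}")
```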


2012 ◽  
Vol 47 (1) ◽  
pp. 13-16 ◽  
Author(s):  
Farhad Razjou ◽  
Mahtab Maghsudlu ◽  
Soheila Nasizadeh ◽  
Maryam Zadsar

2021 ◽  
pp. 1-8
Author(s):  
Klara Greffin ◽  
Holger Muehlan ◽  
Samuel Tomczyk ◽  
Ariane Suemnig ◽  
Silke Schmidt ◽  
...  

Introduction: To maintain a sufficient donor pool, deferred first-time donors (FTD) should be motivated to return for blood donation. This pilot study investigates how deferral affects momentary mood, satisfaction with the donation process, and subsequent return behavior to examine their potential for motivating (deferred) FTD. Methods: All of the subjects (n = 96) completed a first questionnaire (A1) before the pre-donation assessment. Deferred FTD (n = 22) were asked to complete a second questionnaire (A2) immediately after deferral, while non-deferred FTD (n = 74) filled in the second questionnaire (A3) after blood donation. The impact of deferral, momentary mood, and satisfaction with the donation process on return behavior within 12 months was tested by calculating two path analyses, controlling for sex and age. Results: Mood (p < 0.001) and satisfaction with social aspects of the donation process (p = 0.01) were decreased after deferral. Deferred FTD were less likely than non-deferred FTD to return to the blood donation center within 12 months (60.8 vs. 36.4%; p = 0.043). However, path analyses revealed that deferral effects on mood and satisfaction were not connected to return behavior. Instead, age had a significant influence on return behavior (p < 0.05) such that, overall, non-returning FTD were older than returning FTD, regardless of their deferral status. Conclusion: Our findings suggest that mood and satisfaction with the donation process are directly affected by deferral but not clearly responsible for low return rates. It seems promising to embed these variables in established health behavior models in further studies to increase the return rates of deferred FTD.


2020 ◽  
Author(s):  
Toshinori Chiba ◽  
Taiki Oka ◽  
Toshitaka Hamamura ◽  
Nao Kobayashi ◽  
Masaru Honjo ◽  
...  

Summary
Background: Rising rates of suicide, the most dreadful consequence of the mental health effects elicited by the coronavirus pandemic (COVID-19), are cause for grave concern. However, the exact association between mental health problems and suicide remains largely unknown in relation to COVID-19.
Methods: To determine the impact of COVID-19 on the suicide trajectory, we used an interrupted time-series design to analyze monthly suicide rates extracted from Japan's national database. We next used mixed-effects regression models to investigate the relationship between the nationwide suicide increase in August 2020 and the psychiatric states of 4,348 individuals from an online survey performed immediately before (December 2019) and during (August 2020) the pandemic. Psychiatric states included depression, anxiety, and COVID-19-related PTSD, a form of severe event-related stress.
Findings: In Japan, suicides had gradually decreased before COVID-19 (β = −0.7×10^−3, t(57) = −14.2, p = 8.6×10^−46), but increased drastically after a state of emergency was declared in April 2020 (β = 0.9×10^−2, t(57) = 17.3, p = 2.3×10^−67). We found that PTSD symptoms reliably predict COVID-19's impact on suicide rates (β = 6.3×10^−4, t(3936) = 5.96, p = 2.7×10^−9). In contrast, depression scores are a reliable indicator of stress vulnerability (i.e., future suicide increases; β = 0.001, t(3936) = 6.6, p = 4.5×10^−11). Simulations revealed that a one-point reduction in PTSD score could decrease suicides by up to 3.1 per ten million people per month in Japan.
Interpretation: PTSD symptoms may help to identify high-risk groups so as to increase the efficacy of prevention policies.
Funding: KDDI collaborative research contract, the Innovative Science and Technology Initiative for Security (JPJ004596), ATLA and AMED (JP20dm0307008).
Research in context
Evidence before this study: We searched PubMed on December 2, 2020, for "COVID" and "suicid*" in the titles or abstracts of published articles and obtained 269 hits. No language restrictions were applied to the search. Nearly all previous articles on suicide and COVID-19 have reported simulation studies of suicide counts and rates in case studies, editorials, letters, and commentaries. To date, no study has analyzed the association between psychiatric states and suicide increases in the context of the COVID-19 pandemic.
Added value of this study: To the best of our knowledge, this is the first study reporting a concrete approach to predicting suicide rate increases from psychiatric states during the COVID-19 pandemic. Our findings indicate that PTSD symptoms are a reliable surrogate endpoint of pandemic-related suicide increases.
Implications of all the available evidence: This work provides a new perspective on preparing guidelines for suicide prevention. Efforts should focus on reducing PTSD severity in individuals and populations to reduce the overall suicide risk.
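An interrupted time-series design of the kind described above is typically fit as a segmented regression: a pre-intervention trend plus level-change and slope-change terms at the break point. The sketch below uses synthetic monthly data; none of its coefficients are the study's estimates.

```python
# Minimal segmented-regression (interrupted time series) sketch on
# synthetic monthly data, not Japan's national suicide database.
import numpy as np

rng = np.random.default_rng(0)
n_months, break_t = 60, 40                  # intervention after month 40
t = np.arange(n_months)
post = (t >= break_t).astype(float)         # level-change indicator
t_post = post * (t - break_t)               # post-break slope term

# Simulated series: gentle pre-break decline, then a post-break rise
y = 10.0 - 0.02 * t + 0.15 * t_post + rng.normal(0.0, 0.1, n_months)

# Ordinary least squares for y ~ 1 + t + post + t_post
X = np.column_stack([np.ones(n_months), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"pre-slope {beta[1]:+.3f}/month, slope change {beta[3]:+.3f}/month")
```

The coefficient on `t` recovers the pre-intervention trend and the coefficient on `t_post` the change in slope after the break, mirroring the pre- and post-declaration β estimates in the abstract.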


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 8549-8549
Author(s):  
Sonali Sethi ◽  
Scott Oh ◽  
Alexander Chen ◽  
Christina Bellinger ◽  
Lori Lofaro ◽  
...  

8549 Background: Current guidelines recommend that patients who have lung nodules with a high risk of malignancy (ROM) (>65%) should undergo surgical and other ablative therapies. However, prior studies have shown that clinicians may opt for more conservative management in these high-risk patients. The Percepta Genomic Sequencing Classifier (GSC), an RNA-seq-based classifier derived from bronchial epithelial cells to assess risk of lung cancer, was designed to risk-stratify lung nodules both by down-classifying ROM as a "rule-out" test with high sensitivity and by up-classifying ROM as a "rule-in" test with high specificity for malignancy. This study assesses the impact of up-classification of high ROM to very high risk (ROM >90%) by Percepta GSC in increasing the number of ablative therapies recommended for high-risk lung nodules. Methods: This prospective randomized decision impact survey included 37 patients from the AEGIS I/II cohorts and the Percepta Registry who were undergoing workup of a lung nodule and had a high ROM that was up-classified to very high ROM by Percepta GSC. 97 physicians assessed 10 randomly assigned patient cases. They then responded to a survey designed to test the hypothesis that including a Percepta GSC result would increase the recommendation for surgical or other ablative therapy in very high-risk patients, as well as their level of confidence in this recommendation. Physicians were first presented with the patient's clinical information without Percepta GSC and then with Percepta GSC. Results: 97 physicians provided a total of 682 evaluations of 37 patients. In this study, the recommendation for surgical or other ablative therapy increased from 19/341 (5.6%) prior to the Percepta GSC result to 157/341 (46%) after the Percepta GSC result (odds ratio of 4.76, p-value < 0.001). The number of extremely confident recommendations increased from 72/341 (21%) without Percepta GSC to 106/341 (31%) with Percepta GSC.
Significantly more physicians had increased confidence in their recommended next step post-Percepta GSC when collapsing the confidence-level responses into increased confidence (n = 93) and decreased confidence (n = 44) (p-value = 0.002). Conclusions: Percepta GSC had a quantifiable impact on clinical decision making. It increased the number of surgical and other ablative therapies recommended when patients were re-classified from high to very high risk of lung cancer, with higher confidence in the recommended next step. By up-classifying nodules from high to very high ROM, Percepta GSC will improve the likelihood and timeliness of appropriate therapies and assist clinicians in more effectively managing patients to improve patient outcomes.


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 2910-2910
Author(s):  
Mark N. Polizzotto ◽  
Erica M. Wood ◽  
Helen Ingham ◽  
Anthony J. Keller

Abstract Selection of voluntary donors who are at low risk of transfusion-transmissible viral infection (TTVI) is central to maintaining the safety of the blood supply. Evaluation of the effectiveness of donor selection and the dynamics of the process may offer opportunities to further improve transfusion safety. We analysed the impact of donor selection on the relative prevalence of TTVIs in all allogeneic donations in Australia between July 2000 and June 2006. We further explored the donor selection process where donors were found to have a TTVI despite pre-donation screening. Donors repeat reactive for a TTVI were offered counselling and confidential interview, where potential infective risk exposures were reassessed and disclosure of risk exposures at initial screening was re-evaluated. 6,274,144 donations were received during the study period and tested for HCV, HBV, HIV, and HTLVI/II; of these, 1449 (0.02%) were repeat reactive for at least one TTVI and were discarded. Twenty-nine (2.5%) positive donors were not contactable or declined interview, giving an interview participation rate of 98.5%; all 1449 positive donors are included in the prevalence analysis. These comprised 605 (42%) positive for Hepatitis B; 818 (56%) positive for Hepatitis C; 18 (1%) positive for HIV; and 20 (1%) positive for HTLVI/II. The prevalence of HBV in accepted donors was at least 50 times lower than that in the Australian population; for HCV, 75 times lower; and for HIV, 350 times lower. In new donors the prevalence was at least 6 times lower for HBV, 12 times lower for HCV and 140 times lower for HIV. In 1158 of 1420 donors interviewed (80%) an infective risk was identified; 509 donors (44%) had more than one risk. The most common risks identified were country of birth and parental ethnicity (N=682, 26% of reported risks); tattoos/piercings (N=448, 18%); and intravenous drug use (N=302, 12%).
Other common risks included surgery or endoscopy (N=201, 8%); receipt of blood products (N=144, 6%); and other blood contact, such as following sporting injuries (N=232, 10%). High-risk sexual contacts were uncommon risk exposures, but disproportionately significant in donors with HIV. Many of the identified risk exposures were temporally remote. The relative importance of risks varied significantly between TTVIs. In 302 cases (21%) disclosure of the identified risk exposures at pre-donation screening would have resulted in donor deferral. The proportion of positive donations that would not have been accepted had exposures been reported accurately was 3% for HBV; 35% for HCV; 39% for HIV; and 5% for HTLVI/II. Factors influencing non-disclosure included the temporal remoteness or isolated nature of the exposure, the belief that the behaviour was not high-risk (e.g., that needles were not shared during drug use), and the perception that laboratory testing rendered disclosure unnecessary. Concerns about the privacy or confidentiality of personal information were uncommon. These findings affirm the effectiveness of current stringent donor selection criteria in reducing the residual risk of TTVI. Ongoing donor education regarding the importance of risk disclosure is required. The development of screening criteria for use with emerging infections also offers continued opportunity for further improvements in transfusion safety.

