Options for possible changes to the blood donation service: health economics modelling

2018 ◽  
Vol 6 (40) ◽  
pp. 1-162 ◽  
Author(s):  
Richard Grieve ◽  
Sarah Willis ◽  
Kaat De Corte ◽  
M Zia Sadique ◽  
Neil Hawkins ◽  
...  

Background: Evidence is required on the cost-effectiveness of alternative changes to the blood collection service. Objectives: (1) To estimate the cost-effectiveness of alternative minimum interdonation intervals between whole-blood donations. (2) To investigate donors' frequency of whole-blood donation according to alternative changes to the blood collection service. (3) To estimate the cost-effectiveness of alternative strategies for maintaining the supply of whole blood. Methods: We undertook a within-trial cost-effectiveness analysis (CEA) of the INTERVAL trial, stated preference (SP) surveys to elicit donor preferences, and a CEA of different strategies for blood collection. The strategies considered were reduced minimum intervals between whole-blood donations, introduction of a donor health report, and changes to appointment availability and opening times at blood collection venues. The within-trial CEA included 44,863 donors, with men randomly assigned to 12- versus 10- versus 8-week interdonation intervals, and women to 16- versus 14- versus 12-week interdonation intervals. We undertook an SP survey of non-INTERVAL donors (100,000 invitees), asking donors to state the frequency with which they would be willing to donate blood according to each service attribute and level. The CEA compared changes to the blood service with current practice by combining the survey estimates with information from the NHS Blood and Transplant database (PULSE) and cost data. The target population was existing whole-blood donors in England, of whom approximately 85% currently donate whole blood at mobile (temporary) blood collection venues, with the remainder donating at static (permanent) blood collection centres. We reported the effects of the alternative strategies on the number of whole-blood donations, costs, and cost-effectiveness. Results: The reduced donation interval strategies had higher deferral rates caused by low haemoglobin (Hb) but increased the frequency of successful donation.
For men in the 8- versus 12-week arm of the INTERVAL trial [Di Angelantonio E, Thompson SG, Kaptoge S, Moore C, Walker M, Armitage J, et al. Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors. Lancet 2017;390:2360–71], the Hb-related deferral rate was 5.7% per session versus 2.6% per session, but the average number of donations over 2 years increased by 1.71 (95% confidence interval 1.60 to 1.80). A total of 25,187 (25%) donors responded to the SP survey. For static donor centres, extending appointment availability to weekday evenings or weekends, or reducing the intervals between blood donations, increased stated donation frequency by, on average, 0.5 donations per year. The CEA found that reducing the minimum interval, extending opening times to weekday evenings, and extending opening times to weekends in all static donor centres would provide additional whole blood at a cost per additional unit of £10, £23 and £29, respectively, with similar results for donors with high-demand blood types. Limitations: The study did not consider the long-term rates at which donors will leave the donation register, for example following higher rates of Hb-related deferral. Conclusions: Extending opening hours for blood donation to weekday evenings or weekends at all static donor centres is a cost-effective way of increasing the supply of high-demand blood types. Future work: To monitor the effects of new strategies on long-term donation frequency. Funding: The National Institute for Health Research Health Services and Delivery Research programme.
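The cost-per-additional-unit figures above follow the standard incremental form: the extra cost of a strategy relative to current practice divided by the extra whole-blood units it yields. A minimal sketch of that comparison; the strategy names come from the abstract, but the cost and unit totals are hypothetical placeholders chosen only to reproduce the shape of the reported ratios:

```python
# Illustrative comparison of blood-collection strategies by cost per additional
# whole-blood unit (incremental cost / incremental units vs current practice).
# The (extra cost GBP, extra units) pairs below are hypothetical placeholders.

strategies = {
    "reduced minimum interval": (1_000, 100),
    "weekday evening opening": (2_300, 100),
    "weekend opening": (2_900, 100),
}

for name, (extra_cost, extra_units) in strategies.items():
    ratio = extra_cost / extra_units
    print(f"{name}: £{ratio:.0f} per additional unit")
```

With these placeholders the loop prints ratios of £10, £23 and £29, matching the ordering reported in the abstract.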

Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 4739-4739
Author(s):  
Komal Arora ◽  
Fernando Martinez ◽  
Blanca N Keneson ◽  
Benjamin Lichtiger

Abstract Introduction: Preoperative autologous donation prior to bone marrow harvest is commonly performed at our hospital. We usually collect 2 units of autologous blood from bone marrow donors prior to the harvest procedure. To save the donor an additional trip to the donor center, we sometimes collect a double red cell unit by apheresis. The decision to perform double red cell collection is based on the donor's hemoglobin and willingness to undergo the procedure. We reviewed all autologous collections performed at our hospital-based blood bank over a period of 2½ years, evaluating the utilization rate of autologous units, cost-effectiveness, and donor adverse events. Methods: A computerized search was performed for all autologous donations between January 2013 and July 2015. Only the charts of donors undergoing bone marrow harvest were reviewed; all other indications for preoperative autologous blood donation were excluded. The pre-donation hemoglobin (HB), time from donation to day of surgery, and any adverse events during autologous donation were recorded. The preoperative HB values (usually 1-3 days before surgery) and the amount of marrow harvested were also noted. Utilization of the collected units was evaluated by reviewing all units that were transfused and those that expired on the shelf. The cost-effectiveness of all autologous collections was evaluated. Results: A total of 262 autologous units were collected from 137 donors aged 9-72 years (M:F = 1.3:1). Double red cell collection by apheresis was performed on 49 of 137 (36%) donors in one visit. The remaining 88 donors underwent autologous collection of 1 to 2 units of whole blood. Five donors donated more than 2 autologous units during this time period. Among the 49 double red cell donors, only 3 were female and the remaining 46 were male. Among the whole blood donors, 56 were female and 32 were male.
The mean baseline HB prior to autologous donation was 12.9 g/dL (range 11 to 15) in female donors and 15.2 g/dL (range 12.5-17.8) in male donors. The average amount of marrow harvested was 1232 cc (range 480-2000 cc). The average fall in HB (measured as the difference between the baseline HB and the HB prior to surgery) was 2.07 g/dL (range 0 to 4.6). The HB loss resulting from the harvest procedure itself was not available, since most donors undergoing marrow harvest were transfused in the operating room at the end of the procedure. Of the 102 units collected by double red cell apheresis, 39 units (38%) expired on the shelf. Among the 160 whole blood autologous units, 25 units (16%) expired on the shelf. The overall utilization rate of all autologous units was 75.5%. The marrow harvest was either cancelled or rescheduled in 15 donors. Three donors received allogeneic blood postoperatively in addition to the autologous unit. The pre-harvest HB was 10 g/dL or above in 93% of the donors. All donors tolerated the marrow harvest procedure well, and no severe adverse events were noted during any of the autologous donations. Conclusion: The institutional trigger for transfusing blood is 9.0 g/dL, and 93% of our donors had a preoperative HB above 10 g/dL. Hence we conclude that these healthy donors can easily undergo the marrow harvest procedure without requiring any transfusion. Preoperative autologous donations are unnecessary, time- and resource-consuming, costly for the donor, and can be detrimental to the donor's health by exposing them to the risks of transfusion. Disclosures: No relevant conflicts of interest to declare.
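The expiry and utilization percentages above follow directly from the unit counts. A small sketch reproducing the arithmetic (the counts are taken from the abstract; the code and formatting are my own):

```python
# Expiry and utilization rates for the 262 autologous units described above:
# 102 collected by double red cell apheresis (39 expired) and 160 collected
# as whole blood (25 expired).
apheresis_collected, apheresis_expired = 102, 39
whole_blood_collected, whole_blood_expired = 160, 25

total_collected = apheresis_collected + whole_blood_collected  # 262
total_expired = apheresis_expired + whole_blood_expired        # 64
utilization = (total_collected - total_expired) / total_collected

print(f"apheresis expiry: {apheresis_expired / apheresis_collected:.0%}")        # 39/102 ≈ 38%
print(f"whole blood expiry: {whole_blood_expired / whole_blood_collected:.0%}")  # 25/160 ≈ 16%
print(f"overall utilization: {utilization:.1%}")  # 198/262 ≈ 75.6% (reported as 75.5%)
```

The overall figure works out to about 75.6%, agreeing with the reported 75.5% up to rounding.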


Transfusion ◽  
2003 ◽  
Vol 43 (6) ◽  
pp. 721-729 ◽  
Author(s):  
B.R. Jackson ◽  
M.P. Busch ◽  
S.L. Stramer ◽  
J.P. AuBuchon

2009 ◽  
Vol 29 (6) ◽  
pp. 678-689 ◽  
Author(s):  
Matt D. Stevenson ◽  
Jeremy E. Oakley ◽  
Myfawny Lloyd Jones ◽  
Alan Brennan ◽  
Juliet E. Compston ◽  
...  

Purpose. Five years of bisphosphonate treatment has proven efficacy in reducing fractures. Concerns exist that long-term bisphosphonate treatment may actually result in an increased number of fractures. This study evaluates, in the context of England and Wales, whether it is cost-effective to conduct a randomized controlled trial (RCT), and what sample size may be optimal, to estimate the efficacy of bisphosphonates in fracture prevention beyond 5 years. Method. An osteoporosis model was constructed to evaluate the cost-effectiveness of extending bisphosphonate treatment from 5 years to 10 years. Two scenarios were run: the first uses long-term efficacy data from the published literature, and the second uses distributions elicited from clinical experts. Results of a proposed RCT were simulated. The expected value of sample information technique was applied to calculate the expected net benefit of sampling from conducting such an RCT at varying numbers of participants per arm, and to compare this with the proposed trial costs. Results. Without further information, the better duration of bisphosphonate treatment was estimated to be 5 years using the published data but 10 years using the elicited expert opinions, although in both cases uncertainty was substantial. The net benefit of sampling was consistently high when between 2000 and 5000 participants per arm were recruited. Conclusions. An RCT to evaluate the long-term efficacy of bisphosphonates in fracture prevention appears to be cost-effective for informing decision making in England and Wales.
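The expected-net-benefit-of-sampling comparison described above has a simple shape: the population-scaled expected value of sample information (EVSI) minus the cost of running the trial at each candidate sample size. A stylized sketch follows; the saturating EVSI curve and every parameter value are illustrative assumptions, not quantities from the study:

```python
# Stylized expected net benefit of sampling (ENBS): the value of the information
# a trial would generate, scaled to the affected population, minus the cost of
# running the trial. All parameters below are illustrative assumptions.

def evsi_per_person(n_per_arm, evsi_max=3.0, half_sat=800):
    # Assumed saturating curve: larger trials are more informative, with
    # diminishing returns (a modelling assumption, not the study's EVSI).
    return evsi_max * n_per_arm / (n_per_arm + half_sat)

def enbs(n_per_arm, population=1_000_000, fixed_cost=500_000,
         per_participant=100, arms=2):
    trial_cost = fixed_cost + per_participant * n_per_arm * arms
    return evsi_per_person(n_per_arm) * population - trial_cost

# A trial size is worthwhile where ENBS > 0; compare candidate sizes per arm.
for n in (500, 2000, 5000):
    print(n, round(enbs(n)))
```

With these assumptions ENBS stays positive across the range and peaks at intermediate sample sizes, the qualitative pattern the abstract reports for 2000-5000 participants per arm.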


2019 ◽  
Vol 55 (5) ◽  
pp. 292-305
Author(s):  
Shazia Jamshed ◽  
Akshaya Srikanth Bhagavathula ◽  
Sheikh Muhammad Zeeshan Qadar ◽  
Umaira Alauddin ◽  
Sana Shamim ◽  
...  

Background: Gastroesophageal reflux disease (GERD) is a common gastrointestinal disorder that results from regurgitation of acid from the stomach into the esophagus. Treatments available for GERD include lifestyle changes, antacids, histamine-2 receptor antagonists (H2RAs), proton pump inhibitors (PPIs), and anti-reflux surgery. Aim: The aim of this review is to assess the cost-effectiveness of PPIs in the long-term management of patients with GERD. Method: We searched PubMed to identify original articles, applying inclusion and exclusion criteria to select studies for this narrative review. The first section compares the cost-effectiveness of PPIs with H2RAs in long-term heartburn management. The remaining sections discuss the cost-effectiveness of PPIs across 5 strategies, namely continuous (step-up, step-down, and maintenance), on-demand, and intermittent therapies. Results: Of 55 articles published, 10 studies published from 2000 to 2015 were included. Overall, PPIs are more effective than ranitidine in relieving heartburn. Using PPIs to manage heartburn in long-term users of nonsteroidal anti-inflammatory drugs (NSAIDs) has a higher cost compared with H2RAs. However, if the decision-maker is willing to pay more than US$174,788.60 per extra quality-adjusted life year (QALY), then the optimal strategy is traditional NSAID (tNSAID) and PPIs. The probability of being cost-effective was also highest for NSAID and PPI co-therapy users. The on-demand PPI treatment strategy was dominant, with an incremental cost-effectiveness ratio of US$2197 per QALY gained, and was the most effective and cost-saving strategy compared with all other treatments. The average cost-effectiveness ratio was lower for rabeprazole therapy than for ranitidine therapy. Conclusion: Our review revealed that long-term treatment with PPIs is effective but costly.
To achieve a cost-effective long-term approach, we recommend on-demand treatment of heartburn symptoms; if symptoms persist, continuous step-down therapy should be applied.


Obesity Facts ◽  
2020 ◽  
Vol 13 (5) ◽  
pp. 487-498
Author(s):  
Ewa Bandurska ◽  
Michał Brzeziński ◽  
Paulina Metelska ◽  
Marzena Zarzeczna-Baran

Background: Obesity and overweight, including childhood obesity and overweight, pose a public health challenge worldwide. According to the available research findings, long-term interventions focusing on dietary behavior, physical activity, and psychological support are the most effective in reducing obesity in children aged 6-18 years. There are limited studies showing the financial effectiveness of such interventions. Objective: The objective of the present study was to evaluate the cost-effectiveness of the 6-10-14 for Health weight management program using pharmacoeconomic indicators, i.e., a cost-effectiveness analysis using the incremental cost-effectiveness ratio. Methods: We used anthropometric data of 3,081 children included in a 1-year-long intervention with a full financial cost assessment. Results: The cost of removing a child from the overweight group (BMI >85th percentile) was PLN 27,758 (EUR 6,463), and the cost of removing a child from the obese group (BMI >95th percentile) was slightly lower, i.e., PLN 23,601 (EUR 5,495). Given the obesity-related medical costs calculated over a lifelong perspective, these results can be considered encouraging. At the same time, comparing the total cost per participant with the costs of other interventions shows that it is similar to the cost of school programs containing more than 1 type of intervention. Conclusions: The 6-10-14 for Health program can be considered cost-effective. As a result of committing financial resources of approximately EUR 1,790 per child, around half of the children participating in the program improved their weight indicators.
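The PLN/EUR figures above are cost-effectiveness ratios of the form total programme cost divided by the number of children who left the risk group. A sketch of the shape of that calculation; the per-child cost (EUR 1,790) and cohort size (3,081) come from the abstract, while the number of removals below is a purely illustrative placeholder, not a reported figure:

```python
# Cost-effectiveness ratio for a weight management programme: total cost
# divided by the number of children removed from a BMI risk group.
# Per-child cost and cohort size are from the abstract; the removal count
# used in the example is a hypothetical placeholder.

per_child_cost_eur = 1790
n_children = 3081
total_cost = per_child_cost_eur * n_children  # EUR 5,514,990

def cost_per_removal(total_cost, n_removed):
    """Programme cost per child removed from a risk group."""
    return total_cost / n_removed

# e.g. if 850 children moved below the 85th BMI percentile (illustrative only):
print(round(cost_per_removal(total_cost, 850)))  # 6488
```

The illustrative ratio lands in the same range as the reported EUR 6,463 per removal from the overweight group.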


2019 ◽  
Vol 40 (1) ◽  
pp. 323
Author(s):  
Marcos Aurelio Lopes ◽  
Flavio De Moraes ◽  
Francisval Melo Carvalho ◽  
Fabio Raphael Pascotti Bruhn ◽  
Andre Luis Ribeiro Lima ◽  
...  

This study aimed to analyze the effect of workforce type on the cost-effectiveness of 20 dairy farms participating in the “Full Bucket” program, from January to December 2011, in the State of Rio de Janeiro. A stepwise multiple linear regression was used to identify the production cost components that most affected net margin, profitability, and cost-effectiveness. Workforce type influenced profitability and cost-effectiveness, as well as total production cost. Economic analysis showed that farms with a hired workforce had the lowest total unit costs and a positive overall result; such farms can sustain production in the long term and accumulate capital. Farms that adopted a mixed or family workforce had a positive net margin but a negative overall result, giving them the conditions to produce only in the medium term. The most representative items of effective operating cost in the family workforce stratum were, in descending order, food, miscellaneous expenses, and energy; in the mixed and hired workforce strata they were food, workforce, and miscellaneous expenses.


Author(s):  
Marijke Keus Van De Poll ◽  
Gunnar Bergström ◽  
Irene Jensen ◽  
Lotta Nybergh ◽  
Lydia Kwak ◽  
...  

The cost-benefit and cost-effectiveness of a work-directed intervention implemented by the occupational health service (OHS) for employees with common mental disorders (CMD) or stress-related problems at work were investigated. The economic evaluation was conducted in a two-armed clustered RCT. Employees received either a problem-solving based intervention (PSI; n = 41) or care as usual (CAU; n = 59); both were work-directed interventions. Data on sickness absence and production loss at work were gathered during a one-year follow-up. Bootstrap techniques were used to conduct a cost-benefit analysis (CBA) and a cost-effectiveness analysis (CEA) from both an employer and a societal perspective. Intervention costs were lower for PSI than for CAU. Costs for long-term sickness absence were higher for CAU, whereas costs for short-term sickness absence and production loss at work were higher for PSI. Mainly because of these costs, PSI was not cost-effective from the employer's perspective. However, PSI was cost-beneficial from a societal perspective. The CEA showed that a one-day reduction of long-term sickness absence cost on average €101 for PSI, a cost primarily borne by the employer. PSI reduced the socio-economic burden compared with CAU and could be recommended to policy makers. However, reduced long-term sickness absence, i.e., increased work attendance, was accompanied by employees perceiving higher levels of production loss at work, which increased the cost for employers. This partly explains why an effective intervention was not cost-effective from the employer's perspective. Hence, additional adjustments and/or support at the workplace might be needed to reduce the loss of production at work.
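The €101-per-day figure above is a cost-effectiveness ratio: the incremental cost of PSI over CAU divided by the days of long-term sickness absence averted. A minimal sketch with hypothetical inputs (the abstract reports only the resulting ratio, not its numerator and denominator):

```python
# Cost-effectiveness ratio for a work-directed intervention: incremental cost
# per day of long-term sickness absence averted. Inputs are hypothetical.

def cost_per_day_averted(incremental_cost_eur, days_averted):
    """Incremental cost of the intervention divided by absence days averted."""
    if days_averted <= 0:
        raise ValueError("intervention must avert some absence days")
    return incremental_cost_eur / days_averted

# e.g. an extra EUR 10,100 that averts 100 days gives EUR 101 per day
print(cost_per_day_averted(10_100, 100))  # 101.0
```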

