Higher Tenofovir Concentrations in Hair Are Associated with Decreases in Viral Load and Not Self-Reported Adherence in HIV-Infected Adolescents with Second-Line Virological Treatment Failure

Author(s):  
Tariro Chawana ◽  
Charles Nhachi ◽  
Kusum Nathoo ◽  
Bernard Ngara ◽  
Hideaki Okochi ◽  
...  
2020 ◽  
Author(s):  
Phionah Kibalama Ssemambo ◽  
Mary Gorrethy Nalubega-Mboowa ◽  
Arthur H. Owora ◽  
Robert Serunjogi ◽  
Susan Kizito Kironde ◽  
...  

Abstract Background: Many HIV-infected African children gained access to antiretroviral treatment (ART) through the expansion of PEPFAR programs since 2004 and the introduction of the WHO "Test and Treat" guidelines in 2015. As ART access increases and children transition from adolescence to adulthood, treatment failure is inevitable. Viral load (VL) monitoring was introduced in Uganda in 2016, replacing clinical monitoring. However, there are limited data on the comparative effectiveness of these two strategies among HIV-infected children in resource-limited settings (RLS). Methods: HIV-infected Ugandan children aged 1-12 years with >1 year of first-line ART, from HIV-care programs using only immunologic and clinical criteria to monitor response to treatment, were screened in 2010. Eligible children were stratified by VL ≤400 or >400 copies/ml and randomized to clinical and immunological monitoring (control) versus clinical, immunological and VL monitoring to determine treatment failure, with follow-up at 12, 24, 36, and 48 weeks. Plasma VL was analyzed retrospectively for controls. Mixed-effects logistic regression models were used to compare the prevalence of viral suppression between study arms and to identify factors associated with viral suppression. Results: At baseline, all children (n=142) were on NNRTI-based ART (75% nevirapine, 25% efavirenz). One third of ART-experienced children had detectable VL at baseline despite high CD4%. Median age was 6 years (interquartile range [IQR]: 5-9) and 43% were female. Overall, the odds of viral suppression did not differ between study arms (arm-by-week interaction, p=0.63; adjusted odds ratio [aOR]: 1.07, 95% CI: 0.53-2.17, p=0.57) and did not change over time (0 vs 24 weeks: aOR 1.15, 95% CI: 0.91-1.46, p=0.24; 0 vs 48 weeks: aOR 1.26, 95% CI: 0.92-1.74, p=0.15). Longer duration of a child's ART exposure was associated with lower odds of viral suppression (aOR: 0.61, 95% CI: 0.42-0.87, p<0.01).
Only 13% (9/71) of children with virologic failure were switched to second-line ART, in spite of access to real-time VL. Conclusion: With increasing ART exposure, viral load monitoring is critical for early detection of treatment failure in RLS. Clinicians need to make timely, informed decisions to switch failing children to second-line ART. Trial Registration: ClinicalTrials.gov NCT04489953, 28 Jul 2020. Retrospectively registered.
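The trial's suppression endpoint and the reported odds ratios reduce to a threshold rule plus 2×2-table arithmetic. Below is a minimal stdlib sketch with hypothetical counts (not the trial's data; the paper's aOR of 1.07 comes from a mixed-effects model, which this does not reproduce):

```python
import math

SUPPRESSION_THRESHOLD = 400  # copies/ml, the trial's stratification cut-off

def suppressed(viral_load):
    """A child counts as virally suppressed at VL <= 400 copies/ml."""
    return viral_load <= SUPPRESSION_THRESHOLD

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table:
    a = VL-arm suppressed, b = VL-arm unsuppressed,
    c = control suppressed, d = control unsuppressed."""
    return (a * d) / (b * c)

def wald_ci(a, b, c, d, z=1.96):
    """Large-sample 95% Wald interval, computed on the log-OR scale."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Hypothetical counts (NOT the trial's data): 40/71 suppressed in the
# VL-monitoring arm versus 38/71 in the control arm.
or_hat = odds_ratio(40, 31, 38, 33)
lo, hi = wald_ci(40, 31, 38, 33)
```

The Wald interval here is the textbook large-sample approximation; the trial's confidence intervals come from the fitted regression model.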


2019 ◽  
Vol 16 (1) ◽  
Author(s):  
B. Castelnuovo ◽  
F. Mubiru ◽  
I. Kalule ◽  
A. Kiragga

Abstract Background During the initial scale-up of ART in sub-Saharan Africa, prescribed regimens included drugs with high potential for toxicity (particularly stavudine). More recently, a growing number of patients require second-line treatment due to treatment failure, especially following the expansion of viral load testing. We aim to determine the reasons and risk factors for modification of first-line ART across the years. Methods We included patients started on standard first-line ART (2 NRTI + 1 NNRTI) between 2005 and 2016 at the Infectious Diseases Institute, Kampala, Uganda. We described the reasons for treatment modification, categorized as (1) toxicity, (2) treatment failure, or (3) other reasons (new TB treatment, new pregnancy). We used Cox proportional hazards models to identify factors associated with treatment modification due to toxicity. Results We included 14,261 patients; 9114 (63.9%) were female, the median age was 34 years (IQR: 29–40), and 60.8% were in WHO stage 3 or 4. The median BMI and CD4 count were 21.9 (IQR: 19.6–24.8) and 188 cells/µL (IQR: 65–353), respectively; 27.5% were started on stavudine, 46% on zidovudine, and 26.5% on a tenofovir-containing regimen. We observed 6248 ART modifications in 4868/14,261 patients (34.1%); 1615 were due to toxicity, 1077 to treatment failure, 1330 to contraindications, and 1860 followed the WHO recommendation of phasing out stavudine and substituting another NRTI. Modification for drug toxicity declined rapidly after the phase-out of stavudine (2008), while switches to second-line regimens increased after the implementation of viral load monitoring (2015). Patients with normal BMI compared to underweight (HR: 0.79, CI 0.69–0.91), with CD4 counts 200–350 cells/µL compared to < 200 cells/µL (HR: 0.81, CI 0.71–0.93), and started on zidovudine (HR: 0.51, CI 0.44–0.59) or tenofovir (HR: 0.16, CI 0.14–0.22) compared to stavudine were less likely to have ART modification due to toxicity.
Older patients (HR: 1.14 per 5-year increase, CI 1.11–1.18) and those in WHO stage 3 or 4 (HR: 1.19, CI 1.06–1.34) were more likely to have ART modification due to toxicity. Conclusions Toxicity as a reason for drug substitution decreased over time, mirroring the phase-out of stavudine, while viral load expansion identified more patients in need of second-line treatment.
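Cox-model hazard ratios compound multiplicatively on the log scale, so an HR reported per 5-year age increment, like the 1.14 above, can be rescaled to other increments. A small sketch of that proportional-hazards arithmetic (an illustration, not the authors' analysis code):

```python
import math

def rescale_hr(hr_per_unit, units):
    """Rescale a hazard ratio reported per one covariate increment to
    `units` increments: HR_k = exp(k * log HR_1) = HR_1 ** k,
    assuming a Cox model with a linear covariate effect."""
    return math.exp(units * math.log(hr_per_unit))

# Reported: HR 1.14 per 5-year age increase.
hr_10y = rescale_hr(1.14, 2)      # two 5-year increments = 10 years
hr_1y = rescale_hr(1.14, 1 / 5)   # a single year
```

So a 10-year age difference corresponds to roughly a 30% higher hazard of toxicity-driven modification under the fitted model.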



2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Phionah Kibalama Ssemambo ◽  
Mary Gorrethy Nalubega-Mboowa ◽  
Arthur Owora ◽  
Robert Serunjogi ◽  
Susan Kironde ◽  
...  

Abstract Background Many HIV-infected African children gained access to antiretroviral treatment (ART) through the expansion of PEPFAR programs since 2004 and the introduction of the WHO "Test and Treat" guidelines in 2015. As ART access increases and children transition from adolescence to adulthood, treatment failure is inevitable. Viral load (VL) monitoring was introduced in Uganda in 2016, replacing clinical monitoring. However, there are limited data on the comparative effectiveness of these two strategies among HIV-infected children in resource-limited settings (RLS). Methods HIV-infected Ugandan children aged 1–12 years with > 1 year of first-line ART, from HIV-care programs using only immunologic and clinical criteria to monitor response to treatment, were screened in 2010. Eligible children were stratified by VL ≤ 400 or > 400 copies/ml and randomized to clinical and immunological monitoring (control) versus clinical, immunological and VL monitoring to determine treatment failure, with follow-up at 12, 24, 36, and 48 weeks. Plasma VL was analyzed retrospectively for controls. Mixed-effects logistic regression models were used to compare the prevalence of viral suppression between study arms and to identify factors associated with viral suppression. Results At baseline, all children (n = 142) were on NNRTI-based ART (75% nevirapine, 25% efavirenz). One third of ART-experienced children had detectable VL at baseline despite high CD4%. Median age was 6 years (interquartile range [IQR]: 5–9) and 43% were female. Overall, the odds of viral suppression did not differ between study arms (arm-by-week interaction, p = 0.63; adjusted odds ratio [aOR]: 1.07, 95% CI: 0.53–2.17, p = 0.57) and did not change over time (0 vs 24 weeks: aOR 1.15, 95% CI: 0.91–1.46, p = 0.24; 0 vs 48 weeks: aOR 1.26, 95% CI: 0.92–1.74, p = 0.15). Longer duration of a child's ART exposure was associated with lower odds of viral suppression (aOR: 0.61, 95% CI: 0.42–0.87, p < 0.01).
Only 13% (9/71) of children with virologic failure were switched to second-line ART, in spite of access to real-time VL. Conclusion With increasing ART exposure, viral load monitoring is critical for early detection of treatment failure in RLS. Clinicians need to make timely, informed decisions to switch failing children to second-line ART. Trial registration ClinicalTrials.gov NCT04489953, 28 Jul 2020. Retrospectively registered (https://register.clinicaltrials.gov).


2018 ◽  
Vol 25 (6) ◽  
pp. 1374-1380 ◽  
Author(s):  
M Alexandra Schickli ◽  
Michael J Berger ◽  
Maryam Lustberg ◽  
Marilly Palettas ◽  
Craig A Vargo

Purpose The management of endocrine therapy resistance is one of the most challenging facets of advanced breast cancer treatment. Palbociclib is an inhibitor of cyclin-dependent kinases 4 and 6 (CDK4/6) approved for the treatment of hormone receptor-positive, human epidermal growth factor receptor 2-negative advanced or metastatic breast cancer in combination with fulvestrant in postmenopausal women with disease progression following endocrine therapy. However, the responsiveness of tumors to palbociclib after multiple lines of endocrine therapy is not clearly established. The purpose of this study was to determine the efficacy of palbociclib and letrozole in patients pretreated with one or more lines of endocrine therapy. Methods This was a single-center, retrospective cohort study of all postmenopausal patients with hormone receptor-positive, human epidermal growth factor receptor 2-negative metastatic breast cancer who received palbociclib and letrozole as second-line endocrine therapy or beyond (and no prior CDK4/6 inhibitor therapy) between February 1, 2015, and July 31, 2016. The primary objective was to evaluate time to treatment failure of palbociclib in combination with letrozole as second-line therapy or beyond. Results Fifty-three patients meeting eligibility criteria were included in the analysis. For the primary outcome, the median time to treatment failure of palbociclib and letrozole was 6.3 months (95% CI 3.1–7.4 months). Progression-free survival on palbociclib and letrozole therapy was 6.4 months (95% CI 4.9–8.3 months). Conclusions Palbociclib and letrozole therapy is a viable, effective treatment option for patients with metastatic breast cancer who were not exposed to CDK4/6 inhibitors as first-line endocrine therapy. The benefits of palbociclib and letrozole therapy were seen without excessive toxicity; although neutropenia was common, it may be managed with dose reduction.
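Median time-to-event figures like the 6.3 months above are typically read off a Kaplan-Meier curve, which accounts for censored patients. A self-contained sketch of the product-limit estimator on hypothetical follow-up data (not the study's patient data):

```python
def km_survival(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up in months; events: 1 = treatment failure, 0 = censored.
    Returns [(t, S(t))] at each distinct time where a failure occurred."""
    at_risk = len(times)
    data = sorted(zip(times, events))
    curve, s, i = [], 1.0, 0
    while i < len(data):
        t = data[i][0]
        d = n_t = 0
        # Group all subjects sharing this follow-up time.
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            n_t += 1
            i += 1
        if d:  # survival only drops at failure times
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= n_t
    return curve

def km_median(times, events):
    """First time at which estimated survival falls to <= 0.5."""
    for t, s in km_survival(times, events):
        if s <= 0.5:
            return t
    return None  # median not reached during follow-up

# Hypothetical cohort of 8 patients: months on therapy, failure indicator.
t = [2.0, 3.1, 4.5, 6.3, 6.3, 7.4, 9.0, 12.0]
e = [1,   1,   0,   1,   1,   1,   0,   0]
```

With these made-up data the estimated median happens to be 6.3 months; the study's median and its confidence interval of course come from the actual 53-patient cohort.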


2019 ◽  
Author(s):  
Joseph B. Babigumira ◽  
Solomon J. Lubinga ◽  
Mindy M. Cheng ◽  
James K. Karichu ◽  
Louis P. Garrison

Abstract Background HIV viral load (VL) monitoring informs antiretroviral therapy failure and helps to guide regimen changes. Typically, VL monitoring is performed using dried blood spot (DBS) samples transported and tested in a centralized laboratory. Novel sample collection technologies based on dried plasma stored on a plasma separation card (PSC) have become available. The cost-effectiveness of these different testing approaches to monitor VL is uncertain, especially in resource-limited settings. The objective of this study is to evaluate the potential cost-effectiveness of HIV VL testing approaches with PSC samples compared to DBS samples in Malawi. Methods We developed a decision-tree model to evaluate the cost-effectiveness of two different sample collection and testing methods—DBS and PSC samples transported and tested at central laboratories. The analysis used data from the published literature and was performed from the Malawi Ministry of Health perspective. We estimated costs of sample collection, transportation, and testing. The primary clinical outcome was test accuracy (proportion of patients correctly classified with or without treatment failure). Sensitivity analysis was performed to assess the robustness of results. Results The estimated test accuracy for a DBS testing approach was 87.5% compared to 97.4% for an approach with PSC. The estimated total cost per patient of a DBS testing approach was $19.39 compared to $17.73 for a PSC approach. Based on this, a PSC-based testing approach “dominates” a DBS-based testing approach (i.e., lower cost and higher accuracy). Conclusion The base-case analysis shows that a testing approach using PSC sample is less costly and more accurate (correctly classifies more patients with or without treatment failure) than with a DBS approach. Our study suggests that a PSC testing approach is likely an optimal strategy for routine HIV VL monitoring in Malawi. 
However, given the limited data regarding sample viability, additional real-world data are needed to validate the results.
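"Dominance" in cost-effectiveness analysis has a precise meaning: one strategy costs no more and performs no worse than the other, and is strictly better on at least one axis. A small sketch of that check using the per-patient figures quoted in the abstract:

```python
def dominates(cost_a, eff_a, cost_b, eff_b):
    """True when strategy A dominates strategy B: A is no more costly and
    no less effective, and strictly better on at least one dimension."""
    return (cost_a <= cost_b and eff_a >= eff_b
            and (cost_a < cost_b or eff_a > eff_b))

# Per-patient cost (USD) and test accuracy reported in the abstract.
PSC = {"cost": 17.73, "accuracy": 0.974}
DBS = {"cost": 19.39, "accuracy": 0.875}

psc_dominates = dominates(PSC["cost"], PSC["accuracy"],
                          DBS["cost"], DBS["accuracy"])
```

Because PSC is both cheaper and more accurate here, no incremental cost-effectiveness ratio needs to be computed; a trade-off (cheaper but less accurate) would require one.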


2020 ◽  
Vol 5 (3) ◽  
pp. 140
Author(s):  
Sai Soe Thu Ya ◽  
Anthony D. Harries ◽  
Khin Thet Wai ◽  
Nang Thu Thu Kyaw ◽  
Thet Ko Aung ◽  
...  

Myanmar has introduced routine viral load (VL) testing for people living with HIV (PLHIV) starting first-line antiretroviral therapy (ART). The first VL test was initially scheduled at 12 months; one year later this changed to 6 months. Using routinely collected secondary data, we assessed program performance of routine VL testing at 12 months and 6 months in PLHIV starting ART in the Integrated HIV-Care Program, Myanmar, from January 2016 to December 2017. There were 7153 PLHIV scheduled for VL testing at 12 months and 1976 scheduled for VL testing at 6 months. Among those eligible for testing, the first VL test was performed in 3476 (51%) of the 12-month cohort and 952 (50%) of the 6-month cohort. In the 12-month cohort, 10% had VL > 1000 copies/mL, 79% had repeat VL tests, 42% had repeat VL > 1000 copies/mL (virologic failure) and 85% were switched to second-line ART. In the 6-month cohort, 11% had VL > 1000 copies/mL, 83% had repeat VL tests, 26% had repeat VL > 1000 copies/mL (virologic failure) and 39% were switched to second-line ART. In conclusion, half of the PLHIV initiated on ART had VL testing as scheduled at 12 or 6 months, but fewer PLHIV in the 6-month cohort were diagnosed with virologic failure and switched to second-line ART. Programmatic implications are discussed.
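Each percentage above is a conversion rate relative to the preceding cascade step, so the end-to-end fraction of a cohort reaching a regimen switch is their product. A sketch using the quoted step rates (an approximation that ignores the eligibility adjustment at the first step):

```python
def overall_conversion(step_rates):
    """Multiply per-step conversion rates to get the end-to-end fraction
    of the starting cohort that reaches the final cascade step."""
    out = 1.0
    for r in step_rates:
        out *= r
    return out

# Step rates quoted in the text: first test done, VL > 1000, repeat test,
# repeat VL > 1000 (virologic failure), switched to second-line ART.
twelve_month = overall_conversion([0.51, 0.10, 0.79, 0.42, 0.85])
six_month = overall_conversion([0.50, 0.11, 0.83, 0.26, 0.39])
```

Compounding the steps makes the headline finding concrete: well under 2% of either starting cohort ends up switched, and the 6-month cohort's weaker late-cascade steps leave it with roughly a third of the 12-month cohort's end-to-end switch fraction.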

