Predialytic versus Intradialytic Nutrition: A Study to Assess Effects on Intradialytic Blood Pressure, Dialysis Adequacy, and Urea Removal

2021 ◽  
pp. 1-6
Author(s):  
Namrata S. Rao ◽  
Abhilash Chandra ◽  
Sai Saran ◽  
Manish Raj Kulshreshta ◽  
Prabhakar Mishra ◽  
...  

Background: Provision of oral protein during hemodialysis (HD) is desirable because it improves compliance with protein requirements and nutritional status, but the risks of hypotension and underdialysis need to be considered. This study compared 2 different timings for administering oral nutritional supplements (ONS), predialysis and mid-dialysis, with respect to hemodynamics, dialysis adequacy, urea removal, and tolerability. Methods: This single-center, prospective crossover study analyzed 72 stable patients with ESRD on twice-weekly maintenance HD, with a mean age of 38.7 (±11.2) years and a dialysis vintage of 28.2 (±13.1) months. In the first week, all patients received ONS (450 kcal energy, 20 g protein) 1 h before the start of dialysis (group 1); in the next week, the supplement was administered 2 h after the start of dialysis (group 2), with a predialysis fasting period of at least 3 h in both groups. Blood pressures, serum samples, and spent dialysate samples were collected, and nausea occurrence was graded by severity. Results: Predialytic intake (group 1) was associated with higher predialysis and 1st-hour blood urea, dialysis adequacy, and urea removal than group 2. Both groups achieved a mean Kt/V > 1.2, and the occurrence of symptomatic hypotensive episodes and nausea did not differ significantly between the groups. On repeated-measures ANOVA, changes in blood urea over time showed a significant group effect. Conclusions: Predialytic supplementation was associated with better dialysis adequacy and urea removal than intradialytic supplementation. However, both timings were equally well tolerated, and neither was associated with underdialysis.
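The Kt/V > 1.2 adequacy target mentioned above is commonly estimated from pre- and post-dialysis blood urea nitrogen with the second-generation Daugirdas formula. A minimal sketch, with purely illustrative input values (none taken from the study):

```python
import math

def sp_ktv(pre_bun, post_bun, hours, uf_litres, post_weight_kg):
    """Single-pool Kt/V by the second-generation Daugirdas formula:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W, where R = post/pre BUN,
    t is session length (h), UF is ultrafiltration volume (L), W is weight (kg)."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

# Illustrative numbers: pre-BUN 120 mg/dL, post-BUN 40 mg/dL,
# 4 h session, 2 L ultrafiltration, 60 kg post-dialysis weight.
print(round(sp_ktv(120, 40, 4, 2.0, 60), 2))  # 1.29, above the 1.2 adequacy target
```

A lower urea reduction (higher R) or smaller ultrafiltration volume would pull the result below the 1.2 threshold, which is what the underdialysis concern in the abstract refers to.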

2015 ◽  
Vol 35 (suppl_1) ◽  
Author(s):  
Qiwei Wang ◽  
Zhenjie Liu ◽  
Jun Ren ◽  
Stephanie Morgan ◽  
Carmel Assa ◽  
...  

Objective: Abdominal aortic aneurysm (AAA) is a common vascular disease with a progressive nature. Currently, no pharmacological treatment is approved to effectively slow aneurysm growth or prevent rupture. We have recently demonstrated that receptor-interacting protein kinase 3 (RIP3), a critical mediator of necroptosis, contributes to the smooth muscle depletion and vascular inflammation associated with AAA. In this study, we tested the hypothesis that inhibition of necroptosis may mitigate aneurysm progression, using Necrostatin-1 (Nec-1) or an optimized form of Nec-1 called Nec-1s (7-Cl-O-Nec-1), known inhibitors of another necroptosis mediator, RIP1. Approach and Results: Using the elastase perfusion model, we first demonstrated that Nec-1 attenuated aneurysm formation when administered daily by intraperitoneal (IP) injection starting 30 min before aneurysm induction. Nec-1 also profoundly reduced elastin fragmentation, macrophage infiltration, and SMC necrosis after elastase perfusion. To test whether RIP1 inhibitors can inhibit AAA progression, we randomly divided mice into four groups 7 days after elastase perfusion, when aortic dilatation is small but significant. Group 1 was sacrificed to obtain a baseline aortic dilatation, while Groups 2, 3, and 4 received daily IP injections of DMSO, Nec-1 (3.2 mg/kg/day), or Nec-1s (1.6 mg/kg/day), respectively. Fourteen days after perfusion, mice in Group 2 displayed larger aneurysmal expansion than Group 1 (P < 0.05), a reflection of aneurysm growth. In contrast, mice in Groups 3 and 4 showed aortic dilatations similar to those of mice in Group 1 (P > 0.05), indicating insignificant aneurysmal growth. Furthermore, real-time PCR and histological analyses demonstrated that RIP1 inhibition significantly reduced aortic accumulation of proinflammatory cytokines and inflammatory cell infiltration. Conclusions: Taken together, our study suggests that necroptosis may serve as a therapeutic target for AAAs. Pharmacological inhibition of RIP1 kinase activity prevented aneurysm formation and stabilized pre-existing aneurysms in mice.


2017 ◽  
Vol 11 (02) ◽  
pp. 201-205 ◽  
Author(s):  
Nitika Bajaj ◽  
Prashant Monga ◽  
Pardeep Mahajan

ABSTRACT Objectives: To compare the dimensions of gutta-percha (GP) cones of ProTaper Next (25/0.06) and WaveOne (25/0.08) with those of their corresponding instruments of the same nominal dimension. Materials and Methods: Two groups of 25 GP cones each were made. Group 1 consisted of 25 GP cones #25/0.06 (ProTaper Next). Group 2 consisted of 25 GP cones #25/0.08 (WaveOne). Measurements were taken at D1 (1 mm short of the tip), D3 (3 mm short of the tip), and D11 (11 mm short of the tip) for the GP cones of both groups and were compared with their corresponding instruments. Results: Group 1 (ProTaper Next) 25/0.06 GP points showed greater diameters than the corresponding instrument, and the difference was statistically significant. Group 2 (WaveOne) 25/0.08 GP points also showed greater diameters than the corresponding instrument; the difference was statistically significant except at level D1, where it was nonsignificant. Conclusion: The diameters of both ProTaper Next and WaveOne GP cones were greater than those of their corresponding instruments. Hence, there is a risk of underobturation with both systems.
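The 25/0.06 and 25/0.08 designations encode a tip diameter (ISO size 25 = 0.25 mm) and a taper (increase in diameter per millimetre from the tip), so the nominal diameter at each measurement level follows by simple arithmetic. A sketch of that arithmetic; note this assumes a constant taper along the measured length, which is only an approximation for instruments whose taper varies coronally:

```python
def nominal_diameter(tip_mm, taper, mm_from_tip):
    """Nominal diameter of a constant-taper cone or file at a point
    a given distance short of the tip."""
    return tip_mm + taper * mm_from_tip

# 25/0.06 (tip 0.25 mm, 6% taper) at the measurement levels used in the study:
for level in (1, 3, 11):
    print(f"D{level}: {nominal_diameter(0.25, 0.06, level):.2f} mm")
# D1: 0.31 mm, D3: 0.43 mm, D11: 0.91 mm
```

The study's point is that the measured GP cone diameters exceeded these instrument-matched values, so a cone may bind short of the prepared length.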


2009 ◽  
Vol 4 (1) ◽  
pp. 40-46 ◽  
Author(s):  
Gad Bar-Joseph ◽  
Yoav Guilburd ◽  
Ada Tamir ◽  
Joseph N. Guilburd

Object Deepening of sedation is often needed in patients with intracranial hypertension. All widely used sedative and anesthetic agents (opioids, benzodiazepines, propofol, and barbiturates) decrease blood pressure and may therefore decrease cerebral perfusion pressure (CPP). Ketamine is a potent, safe, rapid-onset anesthetic agent that does not decrease blood pressure. However, ketamine is avoided in patients with traumatic brain injury and intracranial hypertension because it is widely stated to increase intracranial pressure (ICP). Based on anecdotal clinical experience, the authors hypothesized that ketamine does not increase—but may rather decrease—ICP. Methods The authors conducted a prospective, controlled clinical trial in the pediatric intensive care unit of a regional trauma center. All patients were sedated and mechanically ventilated prior to inclusion in the study. Children with sustained elevated ICP (> 18 mm Hg) resistant to first-tier therapies received a single ketamine dose (1–1.5 mg/kg), either to prevent a further ICP increase during a potentially distressing intervention (Group 1) or as an additional measure to lower ICP (Group 2). Hemodynamic, ICP, and CPP values were recorded before ketamine administration, and repeated-measures analysis of variance was used to compare these values with those recorded every minute for 10 minutes after ketamine administration. Results The results of 82 ketamine administrations in 30 patients were analyzed. Overall, following ketamine administration, ICP decreased by 30% (from 25.8 ± 8.4 to 18.0 ± 8.5 mm Hg) (p < 0.001) and CPP increased from 54.4 ± 11.7 to 58.3 ± 13.4 mm Hg (p < 0.005). In Group 1, ICP decreased significantly following ketamine administration and increased by > 2 mm Hg during the distressing intervention in only 1 of 17 events. In Group 2, when ketamine was administered to lower persistent intracranial hypertension, ICP decreased by 33% (from 26.0 ± 9.1 to 17.5 ± 9.1 mm Hg) (p < 0.0001) following ketamine administration. Conclusions In mechanically ventilated patients with intracranial hypertension, ketamine effectively decreased ICP and prevented untoward ICP elevations during potentially distressing interventions, without lowering blood pressure and CPP. These results refute the notion that ketamine increases ICP. Ketamine is a safe and effective drug for patients with traumatic brain injury and intracranial hypertension, and it can possibly be used safely in trauma emergency situations.
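The reported group means can be sanity-checked with simple percent-change arithmetic; the figures below use only the means quoted in the abstract:

```python
def pct_change(before, after):
    """Percent change from a baseline value (negative = decrease)."""
    return (after - before) / before * 100

# Group means from the abstract: ICP 25.8 -> 18.0 mm Hg, CPP 54.4 -> 58.3 mm Hg.
print(round(pct_change(25.8, 18.0)))  # -30 (%), matching the reported 30% ICP drop
print(round(pct_change(54.4, 58.3)))  # 7 (%), the accompanying CPP rise
```

The direction of the two changes together is the clinically relevant point: ICP fell while CPP (which depends on ICP as CPP = MAP - ICP) rose.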


1979 ◽  
Vol 48 (1) ◽  
pp. 116-118
Author(s):  
Carl P. Gabbard ◽  
Charles H. Shea

Three groups of 4-yr.-old children were asked to complete a form perception assessment instrument prior to, 1 hr. after, and 1 wk. following a treatment. Group 1 participated in a movement-based form perception program, while Group 2 was instructed using a traditional classroom method. A third group, which acted as control, participated in unrelated movement activities. A repeated-measures analysis of variance showed a main effect of tests and a groups × tests interaction. Group 2 displayed significantly higher performance on the posttest than Group 1; however, after 7 wk. the performance of Group 2 had decreased to a level below that of Groups 1 and 3, which remained stable.


2016 ◽  
Vol 2 (2) ◽  
Author(s):  
Shefali Walia ◽  
Majumi M. Noohu

Impaired balance has been associated with an increased risk of falls and a resulting increase in mortality among older people. Thus, balance-training interventions have an important place in fall prevention. This study was designed to identify an appropriate balance-training program for community-dwelling elderly adults with an active lifestyle. A sample of 70 elderly adults was randomly allocated into two groups: group 1 (n=35) received general balance and mobility exercise; group 2 (n=35) received specific balance-strategy training. The intervention consisted of 5 sessions/week for 4 weeks. The outcome measures were the Timed Up and Go test (TUGT) and the Berg balance scale (BBS). Inter-group (2-way mixed-model analysis of covariance) and intra-group (repeated-measures) analyses were done to find the change in balance scores. After the intervention, the TUGT scores in group 1 were mean=10.38 s, standard deviation (SD)=1.59 s, and in group 2 were mean=9.27 s, SD=1.13 s. Post-training, BBS scores for group 1 were mean=54.69, SD=1.13, and for group 2 were mean=55.57, SD=0.56. There was a significant group × time effect for the TUGT and BBS scores. All subjects showed significant changes in balance scores after the balance-training interventions. The subjects who participated in the specific balance-strategy training improved their functional mobility, as shown by the TUGT, significantly more than the general training group.


2020 ◽  

Background: Cholecystectomy is a common abdominal procedure. An 8-hour fasting period before this relatively rapid surgery negatively affects patient comfort. Objectives: The current study aimed to evaluate the effects of presurgical carbohydrate intake on patient comfort. Materials and Methods: This prospective study was carried out on 42 cholecystectomy patients (American Society of Anesthesiologists grade I-II) divided into two groups. The patients in group 1 underwent laparoscopic cholecystectomy after an 8-hour fasting period. The subjects in group 2 received a carbohydrate-rich solution of 12.5% dextrose before the surgery (125 g of sugar dissolved in 1 L of water; 800 mL and 200 mL given 8 h and 2 h before the surgery, respectively). Thirst, hunger, and nausea at the 9th preoperative hour and 30 min before the surgery, in addition to nausea and vomiting at the 2nd, 8th, and 24th postoperative hours, were assessed in both groups. Results: The mean age and body mass index (BMI) of the patients were 48.38±12.68 years and 29.85±5.20 kg/m², respectively. The mean operative time was 36.5 min (range: 26-114 min). No difference was observed between the two groups in terms of age, BMI, or operative time. The assessment 30 min before cholecystectomy revealed that the rates of hungry and thirsty patients were higher in group 1 than in group 2 (P=0.003 and P=0.032). Nevertheless, at the 2nd and 8th postoperative hours, the rate of patients complaining of nausea was higher in group 2 than in group 1 (P=0.048 and P=0.014). Conclusions: The intake of carbohydrate-rich fluids up to 2 h before surgery decreased presurgical hunger and thirst, in line with the findings of previous studies, and may thereby improve patient comfort. Nevertheless, a potential rise in postoperative nausea among these patients must be taken into account.
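The dosing described (a 12.5% w/v solution, i.e. 125 g/L, split into 800 mL and 200 mL doses) implies carbohydrate loads that can be checked directly. A minimal sketch of that arithmetic:

```python
def carb_grams(concentration_g_per_l, volume_ml):
    """Grams of carbohydrate contained in a dose of solution."""
    return concentration_g_per_l * volume_ml / 1000

CONC = 125  # g/L, the 12.5% w/v dextrose solution described in the study
print(carb_grams(CONC, 800))  # 100.0 g, taken 8 h before surgery
print(carb_grams(CONC, 200))  # 25.0 g, taken 2 h before surgery
```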


2017 ◽  
Vol 3 (4) ◽  
pp. 142-148
Author(s):  
Menelik M H Lee ◽  
Chao Ngan Chan ◽  
Betty Y T Lau ◽  
Teresa W L Ma

Introduction: Current evidence suggests that annual training in the management of shoulder dystocia is adequate. The aim of this trial was to test our hypothesis that skills start to decline at 6 months after training and decline further at 12 months. Methods: In this randomised, single-blinded study, 13 obstetricians and 51 midwives were randomly assigned to attend a 1-hour mixed lecture and simulation session on shoulder dystocia management. Training was conducted for group 2 at month 0 and for group 1 at month 6. Knowledge scores (primary outcome) were assessed before training (pre-training), immediately after training (at-training), and again at month 12 (post-training). Results: Two-way repeated-measures analysis of variance showed a statistically significant effect of the testing time frame (pre-training, at-training, and post-training) on the score (p<0.001), but no significant effect of group on the score (p=0.458). Compared to pre-training, the score increased after the simulation training (at-training) in both group 1 (8.69 vs 14.34, p<0.001) and group 2 (9.53 vs 14.66, p<0.001), but decreased at 6 months post-training in group 1 (14.34 vs 11.71, p<0.001) and at 12 months post-training in group 2 (14.66 vs 11.96, p<0.001). However, the scores remained better than before training. There was no significant difference in the post-training scores between the two groups (11.71 vs 11.96, p=0.684). Conclusions: Our study demonstrated that simulation training produces both short-term and long-term improvement in shoulder dystocia management; however, knowledge degrades over time. Ongoing training is suggested at intervals of no more than 12 months for all members of the obstetric team, including midwives and doctors.


2011 ◽  
Vol 29 (4_suppl) ◽  
pp. 112-112
Author(s):  
V. Siripurapu ◽  
J. C. Watson ◽  
J. P. Hoffman

112 Background: Gastric cancer (GC) remains a major cause of cancer-related morbidity and mortality in Western countries, with five-year survival rates between 30% and 40%. Preoperative therapy has been championed by groups extrapolating data from the Intergroup 0116 and MAGIC trials, with a view to enhancing completion of therapy and improving survival in locally advanced tumors. Methods: Patients who received preoperative treatment for GC were identified from our tumor registry. Stages were assigned by the AJCC 7th edition. The ECF regimen and non-ECF chemoradiation regimens were compared with respect to pathologic complete response (pCR), recurrence, toxicity, and overall survival. Results: Forty-two patients were identified and stratified into two groups: Group 1, the ECF treatment arm (n = 16), was compared with Group 2, the non-ECF chemoradiation arm (n = 26). No statistical difference was noted in age, ethnicity, or stage stratification. All of Group 1 received their chemotherapy regimen after 2005; in contrast, 60% of Group 2 patients received their treatment before 2005. Only 56% of the ECF group completed their treatment course (19% received other postoperative therapy). Seventy percent of Group 2 received adjuvant chemotherapy. A grade 2 or higher toxicity was noted in 16% of Group 1 compared with 60% of Group 2 (p = 0.035). Seven complications were noted in Group 1 compared with 10 in Group 2 (p = NS). Tumor differentiation did not differ between groups (p = 0.97). Length of stay differed significantly (Group 1: 9 days; Group 2: 12 days; p = 0.02). More nodes were retrieved in Group 1 than in Group 2 (20.2 versus 15.2, p = 0.03). Group 1 had 3 recurrences (19%), while Group 2 had 11 recurrences (42%; p = 0.94). In both groups, 80% of recurrences were distant. Group 1 had a 19% pCR rate versus 23% in Group 2 (p = 0.79). Two-year survival was 70% in both groups, with a median survival of 51 months for Group 2; median survival was not reached for Group 1. Conclusions: No difference was noted in pCR, recurrence, or survival between these two regimens. If this can be confirmed in larger, prospective, randomized trials, the use of radiation and its potential morbidity may be avoided. No significant financial relationships to disclose.


2012 ◽  
Vol 30 (15_suppl) ◽  
pp. 6092-6092
Author(s):  
David Cella ◽  
Mellar P. Davis ◽  
Andrew G. Bushmakin ◽  
Joseph C Cappelleri ◽  
Elizabeth A Hahn ◽  
...  

6092 Background: Fatigue is common in cancer pts and is associated with the use of tyrosine kinase inhibitors (TKIs) such as SU. Limited data exist on the time pattern of fatigue with TKI therapy. Methods: Data from treatment-naïve mRCC pts in the SU arms of two clinical trials were analyzed retrospectively. In Study 1, 375 pts were randomized to SU 50 mg/d on a 4-weeks-on/2-weeks-off schedule (Schedule 4/2) for up to 30 cycles. In Study 2, pts were randomized to SU 50 mg/d on Schedule 4/2 (Group 1; n=146) or 37.5 mg/d continuous daily dosing (CDD; Group 2; n=146). In both trials, fatigue was measured by asking pts to rate the statement "I feel fatigued" over the past week (5-point rating scale, "not at all" to "very much") and with the provider-rated Common Terminology Criteria for Adverse Events (CTCAE). In addition to descriptive profiles, Study 1 used two modeling approaches: a repeated-measures model (M1), with time as a categorical predictor, and a random intercept-slope model (M2), with time as a continuous predictor. Study 2 calculated mean absolute values of the within-cycle rate of change (from one assessment to the next) through the first 6 treatment cycles. Results: In Study 1, representing fatigue across cycles, M1 showed that the initial increase in patient-reported fatigue was worst during Cycle 1; mean values at all subsequent cycles were numerically better. For CTCAE fatigue, M1 showed that all but one of the pairwise comparisons of the cycle means were not significantly different. M2 showed that the overall trend for patient-reported fatigue and CTCAE fatigue was not statistically different from zero. In Study 2, the mean absolute rate of change in fatigue during the first 6 treatment cycles was greater for Group 1 (4/2) than for Group 2 (CDD): 0.042 vs. 0.032, respectively; P=0.003, t-test. Conclusions: In Study 1, pts reported notable fatigue in Cycle 1, which improved or stabilized thereafter. In Study 2, Schedule 4/2 was associated with more within-cycle fluctuation in fatigue. These findings illustrate that SU-associated fatigue occurs early in therapy and continues with more within-cycle fluctuation under 4/2 dosing. This may help patient-clinician communication and interventions that support maintaining effective therapy.
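The abstract does not give the exact formula behind the "mean absolute value of the within-cycle rate of change." One plausible reading, sketched below with hypothetical scores (the function name, dates, and values are illustrative assumptions, not from the trial):

```python
def mean_abs_rate_of_change(assessments):
    """Mean absolute change in score per day between consecutive assessments
    within one cycle. `assessments` is a list of (day, score) pairs."""
    rates = [abs(s2 - s1) / (d2 - d1)
             for (d1, s1), (d2, s2) in zip(assessments, assessments[1:])]
    return sum(rates) / len(rates)

# Hypothetical within-cycle fatigue ratings (1-5 scale) on days 1, 14, 28, 42:
cycle = [(1, 2.0), (14, 2.6), (28, 2.2), (42, 2.1)]
print(round(mean_abs_rate_of_change(cycle), 3))  # 0.027
```

Under this reading, a schedule with on/off drug periods (4/2) would naturally produce larger swings between consecutive assessments than continuous daily dosing, consistent with the reported 0.042 vs. 0.032.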


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Avetis Azizyan ◽  
Paula Eboli ◽  
Doniel Drazin ◽  
James Mirocha ◽  
Marcel M. Maya ◽  
...  

Objective. To determine whether angiomatous and microcystic meningiomas, which mimic high-grade meningiomas in their extent of peritumoral edema, can be reliably differentiated as low-grade tumors using normalized apparent diffusion coefficient (ADC) values. Methods. Preoperative magnetic resonance imaging (MRI) of seventy patients with meningiomas was reviewed. Morphologically, the tumors were divided into 3 groups. Group 1 contained 12 pure microcystic, 3 pure angiomatous, and 7 mixed angiomatous and microcystic tumors. Group 2 included World Health Organization (WHO) grade II and WHO grade III tumors, of which 28 were atypical and 9 were anaplastic meningiomas. Group 3 included WHO grade I tumors of morphology other than angiomatous and microcystic. Peritumoral edema, normalized ADC, and cerebral blood volume (CBV) were obtained for all meningiomas. Results. The edema index of tumors in groups 1 and 2 was significantly higher than in group 3. The normalized ADC value in group 1 was higher than in group 2 but did not differ significantly between groups 1 and 3. CBV values showed no significant group differences. Conclusion. Combining the peritumoral edema index with the normalized ADC value is a novel approach to the preoperative differentiation between truly aggressive meningiomas and mimickers such as angiomatous and microcystic meningiomas.
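The abstract does not define its edema index or the ADC normalization. A common convention in the meningioma imaging literature, and an assumption here, is an edema index of (tumor + edema volume) / tumor volume and an ADC normalized to contralateral normal-appearing white matter; a sketch under those assumptions, with hypothetical values:

```python
def edema_index(tumor_vol_ml, edema_vol_ml):
    """Assumed convention: (tumor + peritumoral edema volume) / tumor volume.
    An index of 1.0 means no peritumoral edema."""
    return (tumor_vol_ml + edema_vol_ml) / tumor_vol_ml

def normalized_adc(tumor_adc, normal_wm_adc):
    """Assumed convention: tumor ADC divided by the ADC of contralateral
    normal-appearing white matter."""
    return tumor_adc / normal_wm_adc

# Hypothetical values: a 10 mL tumor with 25 mL of surrounding edema,
# tumor ADC 1.2e-3 vs normal white matter 0.8e-3 mm^2/s.
print(edema_index(10.0, 25.0))                   # 3.5
print(round(normalized_adc(1.2e-3, 0.8e-3), 2))  # 1.5
```

The study's logic maps onto these two numbers: a high edema index alone cannot separate group 1 from group 2, but a high normalized ADC points toward the low-grade mimickers.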

