SPECIFICATIONS OF DIFFERENT IOP METHODS

2019 ◽  
Vol 34 (4) ◽  
pp. 1009-1013
Author(s):  
Strahil Gazepov ◽  
Alen Georgiev

IOP is an important risk factor for glaucoma, and lowering IOP, even when it lies within the epidemiologically defined normal range, remains the only proven-effective treatment of the disease. However, our knowledge of the true nature of IOP in humans, and of how it affects ocular tissues, is limited in part by the lack of continuous IOP-monitoring technology for patients. Although clinical IOP reduction remains the only proven method for preventing the occurrence and progression of glaucoma, the role of IOP in the development and progression of the disease is not well understood. This is largely due to the clinical observation that a significant number of patients with normal IOP develop glaucoma, while other individuals with elevated IOP show no signs of disease. This may mean that IOP (or some IOP-driven factor) is the primary triggering factor in glaucoma and that vulnerability to IOP varies among individuals. Another possibility is that the clinical characterization of IOP from infrequent snapshot measurements fails to capture exposure to harmful IOP fluctuations that partially drive the disease in these normotensive patients with glaucoma, which would obscure the IOP–glaucoma relationship. Recent data suggest that IOP varies by about 5 mmHg on daily and hourly time scales, and by 15 to 40 mmHg in second-to-second transients when measured continuously in awake patients. Very little is known about IOP fluctuations in humans and how the eye responds to them, but IOP at any time point has the potential to cause disturbances in the neuroretinal layer. Intraocular pressure (IOP) assessment is a key phase of the routine eye examination, especially for patients with glaucoma. 
Indeed, in these cases, elevated IOP is the only risk factor that physicians can modify. This is why the IOP value is important to the patient: it is a key element in the diagnosis and management of glaucoma. IOP depends on the rate of production and the rate of drainage of the aqueous humor, i.e., on the flow resistance of the drainage channels and on the episcleral venous pressure. Glaucoma is a slowly progressive optic neuropathy with changes in the optic nerve, the retinal nerve fiber layer (RNFL), and the visual field.
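The production/drainage relationship described here is conventionally summarized by the Goldmann equation, IOP = F/C + Pv. The sketch below is a minimal numeric illustration of that standard relationship, not a calculation from the article; the parameter values are textbook-typical assumptions.

```python
def goldmann_iop(aqueous_flow, outflow_facility, episcleral_pressure):
    """Goldmann equation: IOP = F / C + Pv.

    aqueous_flow        F  -- aqueous humor production rate (uL/min)
    outflow_facility    C  -- ease of drainage (uL/min per mmHg);
                              its reciprocal is the outflow resistance
    episcleral_pressure Pv -- episcleral venous pressure (mmHg)
    """
    return aqueous_flow / outflow_facility + episcleral_pressure

# Assumed textbook-typical values: F ~ 2.5 uL/min, C ~ 0.3 uL/min/mmHg, Pv ~ 9 mmHg
print(round(goldmann_iop(2.5, 0.3, 9.0), 1))  # -> 17.3 (mmHg), within the normal range
```

Reducing production (lower F), improving drainage (higher C), or lowering venous pressure (lower Pv) each lowers the computed IOP, which mirrors the targets of the clinical IOP-lowering treatments mentioned above.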

2018 ◽  
Vol 2018 ◽  
pp. 1-6 ◽  
Author(s):  
Stefano Gambardella ◽  
Rosangela Ferese ◽  
Simona Scala ◽  
Stefania Carboni ◽  
Francesca Biagioni ◽  
...  

Deletion at 22q11.2 responsible for Di George syndrome (DGs) is a risk factor for early-onset Parkinson’s disease (EOPD). To date, all patients reported with 22q11.2 deletions and parkinsonian features are negative for a family history of PD, and possible mutations in PD-related genes were not properly evaluated. The goal of this paper was to identify variants in PD genes that could contribute, together with 22q11.2 del, to the onset of parkinsonian features in patients affected by Di George syndrome. To this aim, sequencing analysis of 4800 genes including 17 PD-related genes was performed in a patient affected by DGs and EOPD. The analysis identified mutation p.Gly399Ser in OMI/HTRA2 (PARK13). To date, the mechanism that links DGs with parkinsonian features is poorly understood. The identification of a mutation in a PARK gene suggests that variants in PD-related genes, or in genes still not associated with PD, could contribute, together with deletion at 22q11.2, to the EOPD in patients affected by DGs. Further genetic analyses in a large number of patients are strongly required to understand this mechanism and to establish the pathogenetic role of p.Gly399Ser in OMI/HTRA2.


2015 ◽  
Vol 29 (1) ◽  
pp. 3-28 ◽  
Author(s):  
Daron Acemoglu ◽  
James A. Robinson

Thomas Piketty's (2013) book, Capital in the 21st Century, follows in the tradition of the great classical economists, like Marx and Ricardo, in formulating general laws of capitalism to diagnose and predict the dynamics of inequality. We argue that general economic laws are unhelpful as a guide to understanding the past or predicting the future because they ignore the central role of political and economic institutions, as well as the endogenous evolution of technology, in shaping the distribution of resources in society. We use regression evidence to show that the main economic force emphasized in Piketty's book, the gap between the interest rate and the growth rate, does not appear to explain historical patterns of inequality (especially, the share of income accruing to the upper tail of the distribution). We then use the histories of inequality of South Africa and Sweden to illustrate that inequality dynamics cannot be understood without embedding economic factors in the context of economic and political institutions, and also that the focus on the share of top incomes can give a misleading characterization of the true nature of inequality.
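The "gap between the interest rate and the growth rate" refers to the r &gt; g mechanism: if returns to capital r are fully reinvested while income grows at g, the capital-to-income ratio is multiplied by (1 + r)/(1 + g) each year. A minimal sketch of that arithmetic (an illustration of the mechanism under stylized assumptions, not of the authors' regression evidence):

```python
def capital_income_ratio(r, g, years, initial_ratio=1.0):
    """Stylized r > g dynamics: with returns fully reinvested and income
    growing at rate g, the capital-to-income ratio compounds by
    (1 + r) / (1 + g) per year."""
    return initial_ratio * ((1 + r) / (1 + g)) ** years

# With r = 5% and g = 2%, the ratio roughly doubles over about 24 years;
# with r = g it stays flat.
print(capital_income_ratio(0.05, 0.02, 24))
print(capital_income_ratio(0.02, 0.02, 24))  # -> 1.0
```

The abstract's point is precisely that this mechanical compounding, taken alone, fails to track historical top-income shares once institutions and technology are accounted for.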


Author(s):  
L. T. Germinario

Understanding the role of metal cluster composition in determining catalytic selectivity and activity is of major interest in heterogeneous catalysis. The electron microscope is well established as a powerful tool for ultrastructural and compositional characterization of support and catalyst. Because the spatial resolution of x-ray microanalysis is defined by the smallest beam diameter into which the required number of electrons can be focused, the dedicated STEM with FEG is the instrument of choice. The main sources of errors in energy dispersive x-ray analysis (EDS) are: (1) beam-induced changes in specimen composition, (2) specimen drift, (3) instrumental factors which produce background radiation, and (4) basic statistical limitations which result in the detection of a finite number of x-ray photons. Digital beam techniques have been described for supported single-element metal clusters with spatial resolutions of about 10 nm. However, the detection of spurious characteristic x-rays away from catalyst particles produced images requiring several image processing steps.
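Error source (4) reflects Poisson counting statistics: when N characteristic x-ray photons are detected, the standard deviation is √N, so the relative error scales as 1/√N. A minimal illustration of that scaling (not part of the abstract):

```python
from math import sqrt

def relative_counting_error(n_counts):
    """Poisson counting statistics: for N detected photons the standard
    deviation is sqrt(N), so the relative error is sqrt(N)/N = 1/sqrt(N)."""
    return 1.0 / sqrt(n_counts)

# Quadrupling the dwell time (hence counts) only halves the relative error:
print(relative_counting_error(10_000))  # -> 0.01  (1% at 10,000 counts)
print(relative_counting_error(100))     # -> 0.1   (10% at 100 counts)
```

This is why small metal clusters, which yield few counts before beam damage or drift intervene (error sources 1 and 2), set a hard statistical floor on EDS detection limits.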


2012 ◽  
Vol 15 (2) ◽  
pp. 84 ◽  
Author(s):  
Canturk Cakalagaoglu ◽  
Cengiz Koksal ◽  
Ayse Baysal ◽  
Gokhan Alici ◽  
Birol Ozkan ◽  
...  

<p><b>Aim:</b> The goal was to determine the effectiveness of the posterior pericardiotomy technique in preventing the development of early and late pericardial effusions (PEs) and to determine the role of anxiety level for the detection of late pericardial tamponade (PT).</p><p><b>Materials and Methods:</b> We divided 100 patients randomly into 2 groups, the posterior pericardiotomy group (n = 50) and the control group (n = 50). All patients undergoing coronary artery bypass grafting surgery (CABG), valvular heart surgery, or combined valvular and CABG surgeries were included. The posterior pericardiotomy technique was performed in the first group of 50 patients. Evaluations completed preoperatively, postoperatively on day 1, before discharge, and on postoperative days 5 and 30 included electrocardiographic study, chest radiography, echocardiographic study, and evaluation of the patient's anxiety level. Postoperative causes of morbidity and durations of intensive care unit and hospital stays were recorded.</p><p><b>Results:</b> The 2 groups were not significantly different with respect to demographic and operative data (<i>P</i> > .05). Echocardiography evaluations revealed no significant differences between the groups preoperatively; however, before discharge the control group had a significantly higher number of patients with moderate, large, and very large PEs compared with the pericardiotomy group (<i>P</i> < .01). There were 6 cases of late PT in the control group, whereas there were none in the pericardiotomy group (<i>P</i> < .05). Before discharge and on postoperative day 15, the patients in the pericardiotomy group showed significant improvement in anxiety levels (<i>P</i> = .03 and .004, respectively). No differences in postoperative complications were observed between the 2 groups.</p><p><b>Conclusion:</b> Pericardiotomy is a simple, safe, and effective method for reducing the incidence of PE and late PT after cardiac surgery. 
It also has the potential to provide a better quality of life.</p>
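The reported contrast for late tamponade (6 of 50 control patients versus 0 of 50 pericardiotomy patients, P &lt; .05) can be checked with Fisher's exact test on the 2×2 table. The sketch below is an independent illustration of that standard test using only the counts quoted in the abstract, not the authors' own analysis:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def p_table(k):
        # Probability of k events in row 1 under fixed margins
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Small tolerance guards against floating-point ties
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs * (1 + 1e-9))

# Late tamponade: 6/50 in the control arm vs 0/50 after pericardiotomy
p = fisher_exact_two_sided(6, 44, 0, 50)
print(p)  # two-sided p ~ 0.027, consistent with the reported P < .05
```

With both arms of size 50, the only tables as extreme as the observed one are the 6-vs-0 splits in either direction, which is why the two-sided p-value is simply twice the one-tailed probability here.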


Author(s):  
Natalia Carolina Petrillo

Abstract (translated from the Spanish resumen): The present work attempts to show that phenomenology does not lead to a solipsistic position. To that end, it first characterizes what solipsism consists in. It then attempts to refute what will be called "metaphysical solipsism" and "epistemological solipsism," with the main objective of bringing to light the motivational ground for the exit from the solipsistic fiction.
Keywords: phenomenology – solipsism – empathy – Husserl
Abstract: With the aim of showing that phenomenology does not lead to solipsism, I will first attempt a characterization of it. Then, I will attempt a refutation of the so-called "metaphysical" and "epistemological" solipsisms. Finally, the nature and role of Husserl's solipsistic fiction is examined, and the grounds that motivate the overcoming of this standpoint are disclosed.
Keywords: phenomenology – solipsism – empathy – Husserl


2020 ◽  
Vol 11 (1) ◽  
pp. 144-148
Author(s):  
Liuba Zlatkova

The report describes the steps by which children create a musical tale in the art studios of „Art Workshop“, Shumen. These studios are led by student volunteers with backgrounds in the arts from the pedagogical department of Shumen University, and take place during the optional-activity periods at the school the child attends. The stages of creating a complete product with the help of different arts are traced: from the birth of the idea, through the children's creation of a fairy-tale plot, the characterization of the fairy-tale characters, dressing them in movement, song, and speech, and the making of sets and costumes, to presenting the finished product on stage. The role of parents as a link and a necessary helper for children and leaders is also considered, as well as the positive psychological effects that this cooperation creates.

