Integrated System for Easier and Effective Drug Information

2019 ◽  
Vol 12 (3) ◽  
pp. 1069-1077
Author(s):  
Susmitha Shankar ◽  
S. Thangam

With the advent of new technologies, a large amount of biological data can be generated at comparatively low cost. Previously, data integration was done by simply adding databases together, with little complexity because less data was generated and it followed standardized formats. Understanding a complete biological phenomenon, such as a disease, requires a comprehensive understanding of the many dimensions associated with it. This information cannot be captured in a single data type. Mandating the use of a single data type would leave us with incomplete answers to various biological questions. Thus the development of an effective integration technique with an effective visualization platform is the need of the hour. One such framework requires identifying relevant data in the input systems, storing and transforming the data at an intermediate level, and then mapping the data to the appropriate positions in the output systems. This intermediate level helps reduce the number of connections and the repeated creation of specifications. Integration of drug datasets would not only reduce the propagation of incorrect and outdated medicinal information among doctors, but would also help build better treatment strategies. Integrating drug data with a visualization technique is a novel approach to studying drugs and their effects on one platform. In this work, we integrate the Adverse Effects, Drug Enforcement and Drug Label data from openFDA. This integrated database is coupled with a visualization platform, IDEALS, an abbreviation for Integrated Drug Events, Adverse Effect and Label System.
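The intermediate level described in the abstract can be sketched as follows. This is a minimal illustration, not openFDA's actual schema: the field names (`drug`, `reaction`, `warning`) and the functions `to_intermediate` and `integrate` are hypothetical, standing in for the mapping from each input system into the shared intermediary.

```python
# Sketch of intermediate-level integration: records from several source
# systems are normalized into one shared schema keyed by drug name, so each
# source needs only one mapping into the intermediary instead of one
# connection per pair of systems.

def to_intermediate(record, source):
    """Map a source-specific record onto the shared intermediate schema."""
    key = record["drug"].strip().lower()  # normalize the join key
    return {"drug": key, "source": source,
            "facts": {k: v for k, v in record.items() if k != "drug"}}

def integrate(*streams):
    """Merge intermediate records from all sources under one drug key."""
    merged = {}
    for source, records in streams:
        for rec in records:
            inter = to_intermediate(rec, source)
            merged.setdefault(inter["drug"], []).append(inter)
    return merged

adverse = [{"drug": "Aspirin", "reaction": "nausea"}]
labels = [{"drug": "aspirin", "warning": "bleeding risk"}]
db = integrate(("adverse_events", adverse), ("labels", labels))
```

Because both records normalize to the same key, `db["aspirin"]` now holds facts from both sources, which is the property a coupled visualization layer would rely on.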

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Zekun Xu ◽  
Eric Laber ◽  
Ana-Maria Staicu ◽  
B. Duncan X. Lascelles

Osteoarthritis (OA) is a chronic condition often associated with pain, affecting approximately fourteen percent of the population, and increasing in prevalence. A globally aging population has made treating OA-associated pain, as well as maintaining mobility and activity, a public health priority. OA affects all mammals, and the use of spontaneous animal models is one promising approach for improving translational pain research and the development of effective treatment strategies. Accelerometers are a common tool for collecting high-frequency activity data on animals to study the effects of treatment on pain-related activity patterns. There has recently been increasing interest in their use to understand treatment effects in human pain conditions. However, activity patterns vary widely across subjects; furthermore, the effects of treatment may manifest in higher or lower activity counts or in subtler ways, such as changes in the frequency of certain types of activities. We use a zero-inflated Poisson hidden semi-Markov model to characterize activity patterns and subsequently derive estimators of the treatment effect in terms of changes in activity levels or frequency of activity type. We demonstrate the application of our model, and its advantages over traditional analysis methods, using data from a naturally occurring feline OA-associated pain model.
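As a sketch of the emission distribution named in the abstract, a zero-inflated Poisson mixes a point mass at zero (e.g. the animal at rest) with an ordinary Poisson count. The function below is an illustrative implementation of that distribution only, not the authors' model code:

```python
import math

def zip_pmf(k, lam, pi):
    """P(count = k) under a zero-inflated Poisson: with probability pi the
    count is a structural zero, otherwise it is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson

# Zero inflation adds mass at zero relative to a plain Poisson of the same rate
plain_zero = math.exp(-3.0)  # Poisson(3) probability of a zero count
inflated_zero = zip_pmf(0, lam=3.0, pi=0.4)
```

In the full hidden semi-Markov model, each hidden activity state would carry its own `(lam, pi)` pair, with explicitly modeled state durations rather than the geometric durations of an ordinary HMM.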


Author(s):  
Flávio Craveiro ◽  
João Meneses de Matos ◽  
Helena Bártolo ◽  
Paulo Bártolo

Traditionally, the construction sector is very conservative, risk-averse and reluctant to adopt new technologies and ideas. The construction industry faces great challenges in developing more innovative and efficient solutions. In recent years, significant advances in technology and the demand for more sustainable urban environments have been creating numerous opportunities for innovation in automation. This paper proposes a new system based on extrusion technologies, aiming to solve some limitations of current technologies and allow more efficient building construction with organic forms and geometries, based on sustainable ecological principles. This novel approach is driven by deposition-control software. Current modeling techniques focus only on capturing geometric information and cannot satisfy the requirements of modeling components made of multiple heterogeneous materials. There is a great deal of interest in tailoring structures so that functional requirements can vary with location. The proposed functionally graded material (FGM) deposition system will allow a smooth variation of material properties to build more efficient buildings with respect to thermal, acoustic and structural performance.
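The smooth variation of material properties can be pictured as an interpolation of the material mix along a build coordinate. The sketch below is a hypothetical illustration of that idea; the linear grading and the function name are assumptions, not the paper's deposition system:

```python
def material_fraction(z, z0, z1, f0, f1):
    """Fraction of material A at height z, graded linearly between z0 and z1
    (clamped outside), giving a smooth transition rather than a hard joint."""
    t = min(max((z - z0) / (z1 - z0), 0.0), 1.0)
    return f0 + t * (f1 - f0)

# e.g. a wall that grades from fully insulating (1.0) at the base to
# fully structural (0.0) at one metre
midpoint_mix = material_fraction(0.5, 0.0, 1.0, 1.0, 0.0)
```

In practice the grading could follow thermal, acoustic or structural load maps rather than a simple linear ramp.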


2017 ◽  
Vol 5 (1) ◽  
pp. 7-22
Author(s):  
Katarina Steen Carlsson ◽  
Bengt Jönsson

What is the actual value of new medicines? The answer to this question is the key to the rational use of new technologies in health care and to the design of appropriate incentives for innovation. In this paper we present methods, data and study results for valuing new medical technologies in a life-cycle perspective, relevant for the development of a new approach to contracting and paying for innovation that can replace present systems for pricing and reimbursement. The focus is on value in clinical practice, and on the data and methods needed for the development of outcome-based payment systems that balance risks and rewards for innovation in health care. We provide an overview of studies from the Swedish context on the value of new medicines introduced in the treatment of diabetes, cancer, cardiovascular disease and rheumatoid arthritis. These studies, which use national health data and quality registers, emphasise the importance of continuing efforts to collect relevant data for the assessment of value after a medicine reaches the market and starts to be used in clinical practice. It is only when medicines are used in clinical practice that the benefits for real-world patient populations can be identified, measured and valued. Analyses of real-world data will also assist further development and tailoring of treatment strategies to optimize the value of the new technology. While an effective patent system rewards innovation for a limited period of time, many innovations may continue to provide value to society long after patent protection ends, and these values must be included in the assessment of the value of innovation.


Leonardo ◽  
2013 ◽  
Vol 46 (3) ◽  
pp. 270-271 ◽  
Author(s):  
Miriah Meyer

Visualization is now a vital component of the biological discovery process. This article presents visualization design studies as a promising approach for creating effective visualization tools for biological data.


Author(s):  
Diego Milone ◽  
Georgina Stegmayer ◽  
Matías Gerard ◽  
Laura Kamenetzky ◽  
Mariana López ◽  
...  

The volume of information derived from post-genomic technologies is rapidly increasing. Due to the amount of data involved, novel computational methods are needed for analysis and knowledge discovery in the massive data sets produced by these new technologies. Furthermore, data integration is also gaining attention as a way of merging signals from different sources in order to discover unknown relations. This chapter presents a pipeline for biological data integration and the discovery of a priori unknown relationships between gene expressions and metabolite accumulations. In this pipeline, two standard clustering methods are compared against a novel neural network approach. The neural model provides a simple visualization interface for the identification of coordinated pattern variations, independently of the number of clusters produced. Several quality measures have been defined for the evaluation of the clustering results obtained in a case study involving transcriptomic and metabolomic profiles from tomato fruits. Moreover, a method is proposed for evaluating the biological significance of the clusters found. The neural model showed high performance on most of the quality measures, with internal coherence in all the identified clusters and better visualization capabilities.
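As a point of comparison for such a pipeline, one of the standard clustering methods could be k-means applied to the combined transcriptomic and metabolomic profiles (each point being one gene or metabolite profiled across conditions). The toy implementation below, Lloyd's algorithm with deterministic seeding, is illustrative only and not the chapter's code:

```python
def kmeans(points, k, iters=20):
    """Toy Lloyd's k-means; the first k points seed the centers."""
    centers = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        # assign each profile to its nearest center (squared Euclidean)
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # move each center to the mean of its group (keep it if group is empty)
        centers = [tuple(sum(v) / len(v) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# two well-separated toy "expression/accumulation" profiles
profiles = [(0.0, 0.0), (5.0, 5.0), (0.1, 0.0), (0.0, 0.1), (5.1, 5.0), (5.0, 5.1)]
centers, groups = kmeans(profiles, k=2)
```

Unlike the neural approach described above, k-means fixes the number of clusters in advance and offers no built-in visualization of coordinated pattern variations, which is the gap the chapter's neural model addresses.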


2008 ◽  
pp. 1696-1705
Author(s):  
George Tzanis ◽  
Christos Berberidis ◽  
Ioannis Vlahavas

At the end of the 1980s, a new discipline named data mining emerged. The introduction of new technologies such as computers, satellites, new mass storage media, and many others has led to an exponential growth of collected data. Traditional data analysis techniques often fail to process large amounts of, often noisy, data efficiently in an exploratory fashion. The scope of data mining is the extraction of knowledge from large amounts of data with the help of computers. It is an interdisciplinary area of research that has its roots in databases, machine learning, and statistics, and has contributions from many other areas such as information retrieval, pattern recognition, visualization, and parallel and distributed computing. There are many applications of data mining in the real world. Customer relationship management, fraud detection, market and industry characterization, stock management, medicine, pharmacology, and biology are some examples (Two Crows Corporation, 1999).


Author(s):  
Jayati Das-Munshi ◽  
Tamsin Ford ◽  
Matthew Hotopf ◽  
Martin Prince ◽  
Robert Stewart

In this final chapter of the second edition of Practical Psychiatric Epidemiology, developments in psychiatric epidemiology since the first edition are summarized and the editors offer a view on where the future may lie. The themes summarized in this chapter include those related to large-scale datasets or ‘big data’, new technologies and science communication (including data generated through GPS tracking systems and the impact of social media), and expanding biological data and biobanks, as well as the impact of globalization, migration, and culture on the understanding of psychiatric epidemiological principles. The last part of this chapter raises the important issue of open science initiatives. The chapter concludes with a brief discussion of the constancy and ongoing evolution of psychiatric epidemiology.


Author(s):  
Judy C.R. Tseng ◽  
Wen-Ling Tsai ◽  
Gwo-Jen Hwang ◽  
Po-Han Wu

In developing traditional learning materials, quality is the key issue to be considered. However, for highly technical e-training courses, not only the quality of the learning materials but also the efficiency of developing the courses needs to be taken into consideration. It is challenging for experienced engineers to develop up-to-date e-training courses for inexperienced engineers before yet newer technologies are proposed. To cope with these problems, a concept relationship-oriented approach is proposed in this paper. A system for developing e-training courses has been implemented based on this novel approach. Experimental results showed that the approach can significantly shorten the time needed to develop e-training courses, so that engineers can be trained in up-to-date technologies in time.



2020 ◽  
Vol 16 (3) ◽  
pp. 70-85
Author(s):  
Odai Y. Khasawneh

The lack of technology acceptance in the workplace has haunted companies in the past, and it seems that it will continue to do so in the future. One of the many variables that impact employees' acceptance of a new technology is technophobia, which has previously been studied within the narrow context of computers or a few other technologies that are now outdated. In a novel approach, the current study examines employees' technophobia and how it impacts their technology acceptance. In addition, the moderating influence of transformational leadership is studied to determine whether that type of leadership would influence employees to overcome their technophobia. The data analysis confirms that technophobia and its subdimensions are still an issue that haunts the workplace. However, having a leader who is identified as a transformational leader can help employees overcome their technophobia. This study argues that it is vital for companies to understand the level and type of technophobia among their employees, as well as what type of leadership they have, before implementing any new technologies.

