Visualization of Biomedical Data

2018 ◽  
Vol 1 (1) ◽  
pp. 275-304 ◽  
Author(s):  
Seán I. O'Donoghue ◽  
Benedetta Frida Baldi ◽  
Susan J. Clark ◽  
Aaron E. Darling ◽  
James M. Hogan ◽  
...  

The rapid increase in volume and complexity of biomedical data requires changes in research, communication, and clinical practices. This includes learning how to effectively integrate automated analysis with high–data density visualizations that clearly express complex phenomena. In this review, we summarize key principles and resources from data visualization research that help address this difficult challenge. We then survey how visualization is being used in a selection of emerging biomedical research areas, including three-dimensional genomics, single-cell RNA sequencing (RNA-seq), the protein structure universe, phosphoproteomics, augmented reality–assisted surgery, and metagenomics. While specific research areas need highly tailored visualizations, there are common challenges that can be addressed with general methods and strategies. Also common, however, are poor visualization practices. We outline ongoing initiatives aimed at improving visualization practices in biomedical research via better tools, peer-to-peer learning, and interdisciplinary collaboration with computer scientists, science communicators, and graphic designers. These changes are revolutionizing how we see and think about our data.


2020 ◽  
Vol 3 (1) ◽  
pp. 43-59
Author(s):  
Peter M. Kasson

Infectious disease research spans scales from the molecular to the global—from specific mechanisms of pathogen drug resistance, virulence, and replication to the movement of people, animals, and pathogens around the world. All of these research areas have been impacted by the recent growth of large-scale data sources and data analytics. Some of these advances rely on data or analytic methods that are common to most biomedical data science, while others leverage the unique nature of infectious disease, namely its communicability. This review outlines major research progress in the past few years and highlights some remaining opportunities, focusing on data or methodological approaches particular to infectious disease.


2016 ◽  
Vol 30 (7) ◽  
pp. 1-6
Author(s):  
Karen McAulay

Purpose: The present paper describes an Arts and Humanities Research Council (AHRC) research project into Scottish fiddle music, and the important considerations of music digitization, access, and discovery in designing the website that will be one of the project's enduring outcomes.
Design/methodology/approach: The paper reviews existing online indices to music repertoires and the general problems of selecting metadata and indexing such material, and surveys recent and contemporary projects in the digital encoding of musical notation for online use.
Findings: The questions addressed during the design of the Bass Culture project database highlight the importance of cooperation between musicologists, information specialists, and computer scientists, and the benefits of having researchers with strengths in more than one of these disciplines. The Music Encoding Initiative proves an effective means of providing digital access to the Scottish fiddle tune repertoire.
Originality/value: The digital encoding of music notation is still comparatively cutting-edge; the Bass Culture project is thus a useful exemplar of interdisciplinary collaboration between musicologists, information specialists, and computer scientists, and it addresses issues likely to apply to future projects of this nature.


1996 ◽  
Vol 35 (3) ◽  
pp. 511-540 ◽  
Author(s):  
Dominique Lestel

The use of computers has opened access to complex phenomena for whose comprehension no operational narrative traditions are available. Notions of "life", "cognition", and "intelligence" nevertheless constitute metaphors and procedures for description and understanding that make it possible to discuss these phenomena; they represent cognitive resources for scientists. Why do computer scientists "play" at being biologists, and why do they view it as essential to naturalize their artifacts? When this question is taken as the starting point, it becomes possible to outline what an anthropological study of relations to complexity might look like. For "artificial life", the outcome is a Faustian attitude, implying the creation not of life, pure and simple, but of all possible forms of life. Most importantly, this "Godly discourse" goes along with the development of a truly astonishing object: self-modifiable, adaptable, and evolutionary mimetic programs. There is no place for these surprising artifacts in the narrative traditions by means of which scholars might describe and account for them. Examining the all-pervasive but constantly denied language-related dimension of experimentation in artificial life, in an attempt to understand more intimately how a purely playful technical project may be transformed into a grandiose metaphysical program, points to two major characteristics of such discourse that have attracted little attention so far: its insistence on staging parallel, manipulable, and acceleratable temporal sequences for the phenomena observed, and an obdurate, painstaking will to exclude everything human from these worlds, which must be perfectly, even hermetically, sealed off, this being perceived as a precondition for real life. One direct consequence of these radical positions is that they cut artificial life off from its richest heritage, and in particular from its forefathers in the world of art. One major consequence of this research on artificial life is the reformulation of where we draw boundaries in our culture, and a rethinking of the status of human beings.


2021 ◽  
Author(s):  
Walid Ben Ali ◽  
Ahmad Pesaranghader ◽  
Robert Avram ◽  
Reda Ibrahim ◽  
Thomas Modine ◽  
...  

Driven by recent innovations and technological progress, the increasing quality and amount of biomedical data, coupled with advances in computing power, have enabled much progress in artificial intelligence (AI) approaches for health and biomedical research. In interventional cardiology, the hope is for AI to provide automated analysis and deeper interpretation of data from electrocardiography, computed tomography, magnetic resonance imaging, and electronic health records, among others. Furthermore, high-performance predictive models supporting decision-making hold the potential to improve safety as well as diagnostic and prognostic prediction in patients undergoing interventional cardiology procedures. These applications include robotic-assisted percutaneous coronary intervention procedures and automatic assessment of coronary stenosis during diagnostic coronary angiograms. Machine learning (ML) underlies these innovations that have improved the field of interventional cardiology, and more recently, deep learning (DL) has emerged as one of the most successful branches of ML in many applications. It remains to be seen whether DL approaches will have a major impact on current and future practice. DL-based predictive systems also have several limitations, including lack of interpretability and lack of generalizability due to cohort heterogeneity and low sample sizes. There are also challenges for the clinical implementation of these systems, such as ethical limits and data privacy. This review is intended to bring to the attention of health practitioners and interventional cardiologists the broad and helpful applications of ML and DL algorithms in the field to date. Their implementation challenges in daily practice and future applications in the field of interventional cardiology are also discussed.


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Cornelius König ◽  
Andrew Demetriou ◽  
Philipp Glock ◽  
Annemarie Hiemstra ◽  
Dragos Iliescu ◽  
...  

This article is based on conversations from the project "Big Data in Psychological Assessment" (BDPA) funded by the European Union, which was initiated because advances in data science and artificial intelligence offer tremendous opportunities for personnel assessment practice in handling and interpreting this kind of data. We argue that psychologists and computer scientists can benefit from interdisciplinary collaboration. This article aims to inform psychologists who are interested in working with computer scientists about the potential of interdisciplinary collaboration, as well as about challenges such as differing terminologies, foci of interest, data quality standards, approaches to data analysis, and diverging publication practices. Finally, we provide recommendations to prepare psychologists who want to engage in collaborations with computer scientists. We argue that psychologists should proactively approach computer scientists, learn computer science fundamentals, appreciate that research interests are likely to converge, and prepare novice psychologists for a data-oriented scientific future.


Author(s):  
Mouad Addad ◽  
Ali Djebbari

In order to meet the demand for high-data-rate transmission with good quality maintained, multi-carrier code division multiple access (MC-CDMA) technology is being considered for next-generation wireless communication systems. However, the high crest factor (CF) of multi-carrier signals is one of the major drawbacks of such transmission systems, making CF reduction one of the most important research areas in MC-CDMA. In addition, asynchronous MC-CDMA suffers from multiple access interference (MAI), caused by all users active in the system, so the degradation of the system's bit error rate (BER) due to MAI must be taken into consideration as well. The aim of this paper is to provide a comparative study on enhancing the performance of an MC-CDMA system. The spreading sequences used in CDMA play an important role in CF and interference reduction; hence, spreading sequences should be selected to simultaneously ensure low CF and low BER values. Therefore, the effect that the correlation properties of sequences exert on CF values is investigated in this study. Furthermore, a numerical BER evaluation, as a function of the signal-to-noise ratio (SNR) and the number of users, is provided. The results indicate that a trade-off between the two criteria is necessary to ensure good performance. It was concluded that zero correlation zone (ZCZ) sequences are the most suitable spreading sequences for satisfying both criteria.
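The crest factor discussed in this abstract is simply the ratio of a signal's peak amplitude to its root-mean-square value; it is worst when every subcarrier chip adds coherently at some instant. The sketch below is a minimal, hypothetical illustration of that dependence on the spreading sequence (an 8-chip all-ones sequence versus a quadratic-phase chirp sequence), not the paper's simulation setup:

```python
import numpy as np

def crest_factor(signal):
    """Crest factor: peak amplitude divided by the RMS value of the signal."""
    peak = np.max(np.abs(signal))
    rms = np.sqrt(np.mean(np.abs(signal) ** 2))
    return peak / rms

def mc_cdma_symbol(chips, n_samples=256):
    """One oversampled multi-carrier symbol: subcarrier k weighted by chips[k]."""
    n = np.arange(n_samples)
    k = np.arange(len(chips))[:, None]
    return np.sum(chips[:, None] * np.exp(2j * np.pi * k * n / n_samples), axis=0)

K = 8
all_ones = np.ones(K)                               # worst case: chips align at n = 0
chirp = np.exp(1j * np.pi * np.arange(K) ** 2 / K)  # quadratic-phase sequence

cf_worst = crest_factor(mc_cdma_symbol(all_ones))   # equals sqrt(K)
cf_chirp = crest_factor(mc_cdma_symbol(chirp))      # strictly lower
print(f"all-ones CF = {cf_worst:.3f}, chirp CF = {cf_chirp:.3f}")
```

With unit-magnitude chips the RMS is fixed, so the CF is governed entirely by the peak; this is why the correlation and phase structure of the spreading sequence, and not just its length, determines the CF values the paper compares.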


2019 ◽  
Vol 43 (3) ◽  
pp. 169-179 ◽  
Author(s):  
Nicole Llewellyn ◽  
Dorothy R. Carter ◽  
Deborah DiazGranados ◽  
Clara Pelfrey ◽  
Latrice Rollins ◽  
...  

The Clinical and Translational Science Awards (CTSA) program sponsors an array of innovative, collaborative research. This study uses complementary bibliometric approaches to assess the scope, influence, and interdisciplinary collaboration of publications supported by single CTSA hubs and those supported by multiple hubs. Authors identified articles acknowledging CTSA support and assessed the disciplinary scope of research areas represented in that publication portfolio, their citation influence, interdisciplinary overlap among research categories, and characteristics of publications supported by multihub collaborations. Since 2006, CTSA hubs supported 69,436 articles published in 4,927 journals and 189 research areas. The portfolio is well distributed across diverse research areas with above-average citation influence. Most supported publications involved clinical/health sciences, for example, neurology and pediatrics; life sciences, for example, neuroscience and immunology; or a combination of the two. Publications supported by multihub collaborations had distinct content emphasis, stronger citation influence, and greater interdisciplinary overlap. This study characterizes the CTSA consortium’s contributions to clinical and translational science, identifies content areas of strength, and provides evidence for the success of multihub collaborations. These methods lay the foundation for future investigation of the best policies and priorities for fostering translational science and allow hubs to understand their progress benchmarked against the larger consortium.
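One simple way to operationalize "interdisciplinary overlap among research categories" is to score how often two category labels co-occur on the same publications, for instance with a Jaccard index. The sketch below is a hypothetical illustration on a toy portfolio, not the study's actual bibliometric method:

```python
from itertools import combinations
from collections import Counter

# Toy portfolio: each publication tagged with its research areas.
papers = [
    {"neurology", "neuroscience"},
    {"pediatrics"},
    {"neuroscience", "immunology"},
    {"neurology", "immunology", "neuroscience"},
]

# Count how many papers carry each category and each category pair.
pair_counts = Counter()
for cats in papers:
    for a, b in combinations(sorted(cats), 2):
        pair_counts[(a, b)] += 1
cat_counts = Counter(c for cats in papers for c in cats)

def jaccard(a, b):
    """Overlap of two categories: papers tagged with both / papers tagged with either."""
    both = pair_counts[tuple(sorted((a, b)))]
    either = cat_counts[a] + cat_counts[b] - both
    return both / either

print(jaccard("neurology", "neuroscience"))
```

On real data the category labels would come from the journal-level research areas used in the study, and the same co-occurrence counts could feed a network view of which disciplines the multihub collaborations bridge.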

