Health informatics. Standard communication protocol. Computer-assisted electrocardiography

Hearts ◽  
2021 ◽  
Vol 2 (3) ◽  
pp. 384-409
Author(s):  
Paul Rubel ◽  
Jocelyne Fayn ◽  
Peter W. Macfarlane ◽  
Danilo Pani ◽  
Alois Schlögl ◽  
...  

Ever since the first publication of the standard communication protocol for computer-assisted electrocardiography (SCP-ECG), prENV 1064, in 1993, by the European Committee for Standardization (CEN), SCP-ECG has become a leading example in health informatics, enabling open, secure, and well-documented digital data exchange at a low cost, for quick and efficient cardiovascular disease detection and management. Based on the experiences gained, since the 1970s, in computerized electrocardiology, and on the results achieved by the pioneering, international cooperative research on common standards for quantitative electrocardiography (CSE), SCP-ECG was designed, from the beginning, to empower personalized medicine, thanks to serial ECG analysis. The fundamental concept behind SCP-ECG is to convey the necessary information for ECG re-analysis, serial comparison, and interpretation, and to structure the ECG data and metadata in sections that are mostly optional in order to fit all use cases. SCP-ECG is open to the storage of the ECG signal and ECG measurement data, whatever the ECG recording modality or computation method, and can store the over-reading trails and ECG annotations, as well as any computerized or medical interpretation reports. Only the encoding syntax and the semantics of the ECG descriptors and of the diagnosis codes are standardized. We present all of the landmarks in the development and publication of SCP-ECG, from the early 1990s to the 2009 International Organization for Standardization (ISO) SCP-ECG standards, including the latest version published by CEN in 2020, which now encompasses rest and stress ECGs, Holter recordings, and protocol-based trials.
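The section-oriented design described above can be illustrated with a small sketch. The byte layout below (a bare 2-byte section ID plus a 4-byte little-endian length, followed by the payload) is a deliberately simplified stand-in, not the actual SCP-ECG encoding, which adds checksums, fixed-size section headers, and standardized section semantics defined by the standard itself.

```python
import io
import struct

def read_sections(stream):
    """Collect (section_id -> payload) from a simplified section-structured
    file: each section is a 2-byte little-endian ID, a 4-byte little-endian
    payload length, then the payload bytes. Illustrative only; the real
    SCP-ECG layout differs in detail."""
    sections = {}
    while True:
        header = stream.read(6)
        if len(header) < 6:
            break
        sec_id, length = struct.unpack('<HI', header)
        sections[sec_id] = stream.read(length)
    return sections

# Build a toy record with two hypothetical sections: one carrying
# patient metadata, one carrying raw signal bytes.
buf = io.BytesIO()
for sec_id, payload in [(1, b'patient-id=123'), (6, b'\x00\x01\x02\x03')]:
    buf.write(struct.pack('<HI', sec_id, len(payload)) + payload)
buf.seek(0)
secs = read_sections(buf)
print(sorted(secs))  # [1, 6]
```

Because every section is self-describing (ID plus length), a reader can skip sections it does not understand, which is what makes "mostly optional" sections practical in a long-lived interchange format.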


Author(s):  
Sudeep D. Thepade ◽  
Gaurav Ramnani

Melanoma is a deadly form of skin cancer. Early detection of melanoma significantly improves the patient’s chances of survival, but detecting melanoma at an early stage demands expert doctors, and the scarcity of such experts is a major issue for healthcare systems globally. Computer-assisted diagnostics may prove helpful in this case. This paper proposes a health informatics system for melanoma identification using machine learning with dermoscopy skin images. In the proposed method, the features of dermoscopy skin images are extracted using various levels of the Haar wavelet pyramid. These features are employed to train machine learning algorithms and ensembles for melanoma identification. Considering higher levels of the Haar wavelet pyramid helps speed up the identification process. It is observed that performance gradually improves from the 4x4 to the 16x16 Haar wavelet pyramid level, with only marginal improvement beyond that. Ensembles of machine learning algorithms show a boost in performance metrics compared to individual machine learning algorithms.
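The multi-level averaging that underlies a Haar wavelet pyramid can be sketched in a few lines, assuming the low-frequency (approximation) band at each level is flattened into the feature vector; the paper's exact feature-extraction pipeline may differ.

```python
import numpy as np

def haar_pyramid_features(img, level_size):
    """Approximate the low-frequency Haar pyramid level by repeatedly
    averaging 2x2 blocks until the image is level_size x level_size,
    then flatten to a feature vector. A sketch of the general idea,
    not the paper's exact method."""
    h, w = img.shape
    assert h == w and h % level_size == 0
    out = img.astype(float)
    while out.shape[0] > level_size:
        n = out.shape[0] // 2
        # Average each 2x2 block -> half the resolution per step.
        out = out.reshape(n, 2, n, 2).mean(axis=(1, 3))
    return out.ravel()

# Toy 32x32 grayscale "image"; the 4x4 level yields 16 features,
# the 16x16 level yields 256.
img = np.arange(32 * 32, dtype=float).reshape(32, 32)
f4 = haar_pyramid_features(img, 4)
f16 = haar_pyramid_features(img, 16)
print(len(f4), len(f16))  # 16 256
```

Coarser levels (e.g. 4x4) give very short feature vectors, which is why classification with them is fast, at the cost of spatial detail.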


1989 ◽  
Vol 28 (04) ◽  
pp. 313-320 ◽  
Author(s):  
W. Schneider

Abstract: An alternative way of teaching informatics, especially health informatics, to health professionals of different categories has been developed and practiced. The essentials of human competence and skill in handling and processing information are presented in parallel with the essentials of computer-assisted methodologies and technologies of formal language-based informatics. Requirements on how eventually useful computer-based tools will have to be designed, in order to be well adapted to genuine human skill and competence in handling tools in various work contexts, are established. On the basis of such balanced knowledge, methods for work analysis are introduced. These include how the existing problems at a workplace can be identified and analyzed in relation to the goals to be achieved. Special emphasis is given to new ways of information analysis, i.e., methods which allow the comprehension and documentation of those parts of the actually practiced 'human' information handling and processing which are normally overlooked, e.g., non-verbal communication processes and so-called 'tacit knowledge'-based information handling and processing activities. Different ways of problem solving are discussed, involving, in an integrated human perspective, alternative staffing, enhancement of the competence of the staff, and optimal planning of premises as well as organizational and technical means. The main result of this alternative way of education has been considerably improved user competence, which in turn has led to very different designs of computer assistance and man-computer interfaces. It is the purpose of this paper to give a brief outline of the teaching material and a short presentation of the above-mentioned results.
Special emphasis is given to that part of the course where computer-assisted interactive media technology is presently introduced, since this is the only way of adequately presenting some of the most important parts of human performance in handling and processing information, including communication.


2020 ◽  
Vol 10 (23) ◽  
pp. 8606
Author(s):  
Alfonso Monaco ◽  
Nicola Amoroso ◽  
Loredana Bellantuono ◽  
Ester Pantaleo ◽  
Sabina Tangaro ◽  
...  

The COVID-19 pandemic has amplified the urgency of the developments in computer-assisted medicine and, in particular, the need for automated tools supporting the clinical diagnosis and assessment of respiratory symptoms. This need was already clear to the scientific community, which launched an international challenge in 2017 at the International Conference on Biomedical Health Informatics (ICBHI) for the implementation of accurate algorithms for the classification of respiratory sound. In this work, we present a framework for respiratory sound classification based on two different kinds of features: (i) short-term features, which summarize sound properties on a time scale of tenths of a second, and (ii) long-term features, which assess sound properties on a time scale of seconds. Using the publicly available dataset provided by ICBHI, we cross-validated the classification performance of a neural network model over 6895 respiratory cycles and 126 subjects. The proposed model reached an accuracy of 85%±3% and a precision of 80%±8%, which compare well with the body of literature. The robustness of the predictions was assessed by comparison with state-of-the-art machine learning tools, such as the support vector machine, Random Forest, and deep neural networks. The model presented here is therefore suitable for large-scale applications and for adoption in clinical practice. Finally, an interesting observation is that both short-term and long-term features are necessary for accurate classification, which could be the subject of future studies related to their clinical interpretation.
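The two feature time scales can be sketched as follows, using RMS energy and zero-crossing rate as stand-in short-term descriptors and simple statistics over the whole cycle as long-term descriptors; the paper's actual feature set is not reproduced here.

```python
import numpy as np

def short_term_features(signal, sr, frame_sec=0.1):
    """Frame the signal into ~0.1 s windows (tenths of a second) and
    compute per-frame RMS energy and zero-crossing rate."""
    n = int(sr * frame_sec)
    frames = signal[: len(signal) // n * n].reshape(-1, n)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)
    return np.column_stack([rms, zcr])            # one row per frame

def long_term_features(st):
    """Summarize the short-term trajectory over the whole cycle
    (seconds): mean and standard deviation of each descriptor."""
    return np.concatenate([st.mean(axis=0), st.std(axis=0)])

sr = 4000                                         # assumed sample rate
t = np.arange(0, 2.0, 1 / sr)                     # a 2 s synthetic "cycle"
signal = np.sin(2 * np.pi * 200 * t)
st = short_term_features(signal, sr)
lt = long_term_features(st)
print(st.shape, lt.shape)                         # (20, 2) (4,)
```

The short-term matrix preserves the within-cycle dynamics; the long-term vector compresses them into a fixed-length summary that a classifier can consume regardless of cycle duration.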


Author(s):  
E. T. O'Toole ◽  
R. R. Hantgan ◽  
J. C. Lewis

Thrombocytes (TC), the avian equivalent of blood platelets, support hemostasis by aggregating at sites of injury. Studies in our lab suggested that fibrinogen (fib) is a requisite cofactor for TC aggregation but operates by an undefined mechanism. To study the interaction of fib with TC and to identify fib receptors on cells, fib was purified from pigeon plasma, conjugated to colloidal gold, and used both to facilitate aggregation and as a receptor probe. Described is the application of computer-assisted reconstruction and stereo whole mount microscopy to visualize the 3-D organization of fib receptors at sites of cell contact in TC aggregates and on adherent cells.

Pigeon TC were obtained from citrated whole blood by differential centrifugation, washed with Ca++-free Hank's balanced salts containing 0.3% EDTA (pH 6.5), and resuspended in Ca++-free Hank's. Pigeon fib was isolated by precipitation with PEG-1000 and the purity assessed by SDS-PAGE. Fib was conjugated to 25 nm colloidal gold by vortexing, and the conjugates were used as the ligand to identify fib receptors.


Author(s):  
A.M. Jones ◽  
A. Max Fiskin

If the tilt of a specimen can be varied, either by the strategy of observing identical particles orientated randomly or by use of a eucentric goniometer stage, three-dimensional reconstruction procedures are available (1). If the specimens, such as small protein aggregates, lack periodicity, direct space methods compete favorably in ease of implementation with reconstruction by the Fourier (transform) space approach (2). Regardless of method, reconstruction is possible because useful specimen thicknesses are always much less than the depth of field in an electron microscope. Thus electron images record the amount of stain in columns of the object normal to the recording plates. For single particles, practical considerations dictate that the specimen be tilted precisely about a single axis. In so doing, a reconstructed image is achieved serially from two-dimensional sections which in turn are generated by a series of back-to-front lines of projection data.
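The direct-space idea in the last sentence, building a 2-D section from 1-D lines of projection data taken at many tilt angles, can be sketched as simple unfiltered back-projection. The disk phantom below is a hypothetical test object chosen because its projection profile is the same at every angle; practical reconstructions add filtering and more careful interpolation.

```python
import numpy as np

def backproject(projections, angles_deg, size):
    """Reconstruct a 2-D section by unfiltered back-projection: smear
    each 1-D projection back across the image plane along its viewing
    direction and average over all tilt angles."""
    ys, xs = np.mgrid[0:size, 0:size] - (size - 1) / 2.0
    t_axis = np.arange(size) - (size - 1) / 2.0
    recon = np.zeros((size, size))
    for proj, a in zip(projections, np.deg2rad(angles_deg)):
        t = xs * np.cos(a) + ys * np.sin(a)   # detector coordinate of each pixel
        recon += np.interp(t, t_axis, proj)
    return recon / len(angles_deg)

# The projection of a centered disk of radius R is the chord-length
# profile 2*sqrt(R^2 - t^2), identical at every tilt angle.
size, R = 32, 8.0
t_axis = np.arange(size) - (size - 1) / 2.0
proj = 2.0 * np.sqrt(np.maximum(R**2 - t_axis**2, 0.0))
angles = np.arange(0, 180, 10)                # single-axis tilt series
recon = backproject([proj] * len(angles), angles, size)
print(recon.shape)  # (32, 32); intensity peaks at the disk centre
```

Stacking such reconstructed sections along the tilt axis then yields the serial, section-by-section 3-D reconstruction the text describes.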

