MULTI-CHANNEL MAGNETOCARDIOGRAPH: CONTROL AND SOFTWARE

2014 ◽  
pp. 120-124
Author(s):  
Igor Voytovych ◽  
Myhailo Primin ◽  
Valery Vasyliev ◽  
Pavlo Sutkovyy ◽  
Mykola Budnyk ◽  
...  

The purpose of this work was to present the algorithms and software developed for a multi-channel magnetocardiograph. The software controls the instrument's operation and performs computer processing of magnetocardiographic (MCG) data recorded from the human heart. The magnetocardiograph is controlled as a "virtual device" from the PC mouse/keyboard, as well as manually from an electronic unit, in both cases through control microprocessors embedded in the hardware units. Processing is performed both on-line during data acquisition and off-line during post-processing. The software supports preliminary processing, reconstruction and analysis of magnetic maps, as well as inverse problem solution. The package is intended both for scientific study of the heart's electric activity and for study of MCG informative indexes for clinical diagnosis of cardiac diseases. The software is planned for use at the Strazhesko Institute of Cardiology (Kyiv) within the framework of a project supported by the Science and Technology Center in Ukraine (STCU).
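The abstract does not specify the preliminary-processing algorithms. A common first step in MCG pipelines of this kind is coherent averaging of cardiac cycles around R-peak fiducials to suppress uncorrelated noise; the sketch below illustrates that generic step only (the function name and fiducial-based windowing are assumptions, not this package's actual API):

```python
import numpy as np

def average_beats(signal, r_peaks, half_window):
    """Coherently average cardiac cycles around detected R-peaks.

    signal      -- 1-D MCG channel (samples)
    r_peaks     -- sample indices of R-peak fiducials
    half_window -- samples kept on each side of a fiducial
    """
    # collect only beats whose window lies fully inside the record
    beats = [signal[p - half_window:p + half_window]
             for p in r_peaks
             if p - half_window >= 0 and p + half_window <= len(signal)]
    # uncorrelated noise averages down as 1/sqrt(n_beats)
    return np.mean(beats, axis=0)
```

For a recording with N detected beats, the averaged complex of one channel is then a single array of 2·half_window samples.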

Author(s):  
M.N. Ustinin ◽  
Yu.V. Maslennikov ◽  
S.D. Rykunov ◽  
V.A. Krymov

A new method of magnetocardiography data analysis is proposed. The method is based on the Fourier transform of a prolonged time series and on a massive inverse problem solution for all spectral components. Magnetocardiograms (MCG) were recorded in the plane above the subject's chest at the nodes of a rectangular 6×6 grid with a 40 mm step, under ordinary laboratory conditions without additional magnetic shielding. The 9-channel MCG system "MAG-SCAN-09" with dc-SQUID-based axial second-order gradiometers was used. Recording was performed with the subject in four positions under the instrument to obtain all 36 MCGs. For each recording position a partial functional tomogram was calculated: the spatial distribution of elementary magnetic dipoles observed from that position. The complete functional tomogram of the thorax was obtained by summing the four partial functional tomograms, which contain data about the same object observed from different positions. Filtering and contrasting of the complete functional tomogram made it possible to extract a 3D object representing the functional structure of the heart. The method was applied to five subjects and provided consistent results. The method can be used in cardiography, because the functional tomogram contains all measured information about the individual heart.
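The per-spectral-component inverse step can be pictured as follows. This is a simplified illustrative sketch, not the authors' exact reconstruction: it assumes a precomputed lead-field matrix of unit-dipole sensor fields for each grid cell, and scores cells by a normalised projection of each spectral component onto those fields:

```python
import numpy as np

def partial_tomogram(B_freq, leadfields):
    """Accumulate a partial functional tomogram from one recording position.

    B_freq     -- complex sensor spectra, shape (n_freq, n_sensors)
    leadfields -- forward field of a unit dipole at each grid cell,
                  shape (n_cells, n_sensors)  (hypothetical precomputed model)
    For every spectral component, the cell whose unit-dipole field best
    matches the data (largest normalised projection) is credited with that
    component's energy.
    """
    # normalise each cell's forward field to unit length
    Ln = leadfields / np.linalg.norm(leadfields, axis=1, keepdims=True)
    tomo = np.zeros(len(leadfields))
    for b in B_freq:
        proj = np.abs(Ln @ np.conj(b))   # match score per grid cell
        best = np.argmax(proj)
        tomo[best] += proj[best] ** 2    # deposit the component's energy
    return tomo
```

The complete tomogram is then the sum of the four partial tomograms, e.g. `sum(partial_tomogram(B, L) for B, L in positions)`, after which filtering and contrasting extract the 3D heart structure.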


2021 ◽  
Author(s):  
Huseyin Ozgur Kazanci

Abstract Diffuse optical imaging is an important biomedical optics research tool. The diffuse optical tomography (DOT) modality needs progressive conceptual approaches for scientific contribution; technological development and conceptual approaches should advance together. The phase-shift-based frequency-domain (FD) diffuse optical tomography (FDDOT) method is well established in the literature, and such instruments have been tested for neurofunctional brain imaging. Those works used a mixture of AC laser intensity and phase data. In the present work, deep-volume resolution is improved by using phase data alone, because phase data relate only to the photon mean free path in the imaged tissue medium, whereas laser intensity data are also affected by noisy background light and electrical artifacts. A further advantage of using phase data alone is that time-resolved temporal change can be related directly to the phase shift of the modulated source. Here, the FD-DOT imaging method using phase-shift data was applied to a simulation phantom. The laser-source-driven forward-problem weight matrix was fed to a simple pseudo-inverse-based inverse-problem solution algorithm for a single-inclusion example, and the inclusion image was reconstructed and demonstrated successfully. The forward-model weight functions inside the tissue simulation medium were calculated from the phase shifts at the same core modulation frequency; a modulation frequency of 100 MHz was selected as the FDDOT standard. 13 sources and 13 detectors were placed on the back-reflection imaging surface, and 40 grid elements were used along each of the x, y, z Cartesian coordinates in the image reconstruction algorithm. The absorption coefficient μa = 0.1 cm⁻¹ and the scattering coefficient μs = 100 cm⁻¹ were set for the background simulation phantom.
A single inclusion object was embedded inside the background phantom. The x, y, z Cartesian grid size was 100 mm in each direction. Photon phase-shift fluences were added to the forward problem, which was built according to the frequency-domain photon-migration diffusion approximation; the forward-model photon fluences were calculated from the diffusion equation approximation. A simple pseudo-inverse solution of the inverse problem was applied to test the results, and the embedded inclusion object was reconstructed successfully with high image quality. DOT techniques generally suffer from low image quality, but in this work a high-quality image was reconstructed and demonstrated. The approach indicates promising future DOT imaging capability: the phase-shift version of the FDDOT modality has an important advantage for future work.
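The pseudo-inverse reconstruction step described above reduces to a single linear solve: given the forward-model weight matrix W mapping voxel perturbations to measured phase shifts, the image is the minimum-norm least-squares solution. A minimal sketch under simplified assumptions (a random stand-in weight matrix instead of the diffusion-equation forward model, and a far smaller voxel grid than the paper's 40×40×40):

```python
import numpy as np

def reconstruct(W, dphi):
    """Recover a perturbation image from phase-shift data.

    W    -- forward-model weight matrix, shape (n_measurements, n_voxels):
            sensitivity of each source-detector phase shift to each voxel
    dphi -- measured phase-shift perturbations, shape (n_measurements,)
    Returns the minimum-norm least-squares voxel image.
    """
    return np.linalg.pinv(W) @ dphi

# toy check: 13 sources x 13 detectors = 169 measurements, 64 voxels,
# one embedded 'inclusion' voxel (the weight matrix here is a random
# stand-in, NOT a diffusion-model computation)
rng = np.random.default_rng(0)
W = rng.standard_normal((169, 64))
x_true = np.zeros(64)
x_true[20] = 1.0                      # the embedded inclusion
x_rec = reconstruct(W, W @ x_true)    # noiseless synthetic data
```

With more measurements than voxels and a well-conditioned W, the pseudo-inverse recovers the inclusion exactly in the noiseless case; in the paper's under-determined 40³-voxel setting it instead yields the minimum-norm image.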


Dependability ◽  
2018 ◽  
Vol 18 (4) ◽  
pp. 3-9 ◽  
Author(s):  
I. S. Shubinsky ◽  
A. M. Zamyshliaev ◽  
L. P. Papi

The paper examines the dependability of an information management system as its ability to provide required services that can be justifiably trusted. It is assumed that the system functions without an operator. The aim is to ensure the dependability of a multimodule control system whose problem-solving results are affected by failures, faults and solution errors of the system's computation modules (CMs). Conventional fault tolerance methods do not provide the desired effect: even with infinite structural redundancy, given the real capabilities of on-line detection of CM failures and faults, the system's dependability is significantly lower than expected. The paper proposes and evaluates methods of adaptive dependability. They ensure the observability of control systems under limited capabilities of supervising component CM operability, and achieve the required dependability of information management systems when float time and structural redundancy are insignificant. These goals are reached through active, automatic reassignment of the available computational resources for on-line information processing. The methods of adaptive dependability enable, with no interruption of computational processes and while real-world problems are being solved, timely automatic detection and elimination of CM failures and faults and of errors in the solution of specified problems, through on-line localization of faulty modules and subsequent automatic reconfiguration of the system that removes such modules from operation.
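The abstract gives no algorithmic detail for the localization-and-reconfiguration cycle. One simple way to picture such a cycle is majority voting over redundant CM outputs followed by exclusion of disagreeing modules; the sketch below is purely a stand-in for whatever operability supervision the system actually uses, not the paper's method:

```python
from collections import Counter

def vote_and_reconfigure(results, active):
    """One supervision cycle: majority vote over the active CMs' outputs;
    any module disagreeing with the majority is localized as faulty and
    taken out of service (hypothetical simplification).

    results -- dict: module id -> that module's problem-solution result
    active  -- set of currently active module ids
    Returns (accepted result, reconfigured active set).
    """
    votes = Counter(results[m] for m in active)
    majority, _ = votes.most_common(1)[0]
    # localize faulty modules: those whose output disagrees with the majority
    faulty = {m for m in active if results[m] != majority}
    # reconfigure without interrupting the computation: drop faulty CMs
    return majority, active - faulty
```

Repeating this cycle per solved problem keeps the computation running while the module set shrinks to the healthy CMs, after which the freed workload can be reassigned.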


Author(s):  
Wit Stryczniewicz ◽  
Janusz Zmywaczyk ◽  
Andrzej Jaroslaw Panas

Purpose The paper aims to discuss inverse heat conduction methodology in the solution of a certain parameter identification problem: determination of the thermophysical properties of a thin-layer coating using the laser flash apparatus. Design/methodology/approach Modelled laser flash diffusivity data from a three-layer sample investigation are used as input for the parameter estimation procedure. Assuming known middle-layer (i.e. substrate) properties, the thermal diffusivity (TD) of the side layers' material is determined. The estimation technique uses the finite element method for numerical solution of the direct, 2D axisymmetric heat conduction problem. Findings The paper presents the methodology developed for three-layer sample studies and the results of testing and evaluating the estimation technique on simulated data. The multi-parameter identification procedure yields the out-of-plane thin-layer material diffusivity from the inverse problem solution. Research limitations/implications The presentation is limited to numerical simulation data, but it should be underlined that flake graphite thermophysical parameters were used in the numerical tests. Practical implications The developed methodology is planned to be applied in detailed experimental studies of flake graphite. Originality/value In the course of the present study, a methodology for determining the TD of a thin coating layer was developed. Although it was developed for the graphite coating investigation, it is planned to be universal, applicable to any thin-thick composite structure study.
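The estimation idea can be illustrated with a much simpler forward model: below, the classic Parker flash formula for a homogeneous slab stands in for the paper's 2D axisymmetric three-layer FEM model, and a coarse grid search stands in for the actual multi-parameter optimisation. All names and values are illustrative assumptions:

```python
import numpy as np

def flash_response(t, a, L, n_terms=20):
    """Normalised rear-face temperature rise in the Parker flash model
    for a homogeneous slab of thickness L and thermal diffusivity a
    (a stand-in for the paper's three-layer FEM forward problem)."""
    n = np.arange(1, n_terms + 1)
    series = ((-1.0) ** n)[None, :] * np.exp(
        -(n ** 2)[None, :] * np.pi ** 2 * a * t[:, None] / L ** 2)
    return 1.0 + 2.0 * series.sum(axis=1)

def estimate_diffusivity(t, data, L, a_grid):
    """Parameter identification by least squares: pick the candidate
    diffusivity whose forward response best fits the measured data."""
    sse = [np.sum((flash_response(t, a, L) - data) ** 2) for a in a_grid]
    return a_grid[int(np.argmin(sse))]
```

In the paper's setting the forward solver is the 2D axisymmetric FEM model of the coated three-layer sample, and the fitted unknown is the coating's out-of-plane diffusivity rather than that of a homogeneous slab, but the fit-the-forward-model structure of the inverse problem is the same.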

