radiograph image
Recently Published Documents

TOTAL DOCUMENTS: 31 (five years: 6)
H-INDEX: 5 (five years: 0)

Author(s): Mrs Tejaswini ML, Ashwni H, Chandana N, Harshitha BR, Nagashree HN

The coronavirus pandemic has had a great impact on public health globally. Real-time PCR is used for pathological testing, but it can produce false results, which has motivated the exploration of alternative testing methods [1]. Detecting coronavirus infection from chest X-ray images can be lifesaving: chest X-rays are cost-effective and available in every public health sector, including rural clinics and hospitals. A deep learning-based chest radiograph classification (DL-CRC) framework is used to distinguish COVID-19 cases from normal cases with high accuracy. A pre-trained image database supplies pre-trained weights learned from large training sets. The training data, consisting of COVID-19 and normal chest X-ray images, are fed into a customized convolutional neural network (CNN) model in DL-CRC. Wearing masks in public areas remains a major protective measure for people. The classification results imply that the framework can efficiently detect COVID-19 from radiograph images, providing a reliable and fast indication of COVID-19 infection in the lungs.
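The DL-CRC framework itself is not reproduced in the abstract. As a minimal sketch of the underlying idea, the numpy-only toy below stacks the basic CNN building blocks such a classifier uses: convolution, ReLU, global average pooling, and a sigmoid score. All weights and shapes here are illustrative, not the DL-CRC model.

```python
import numpy as np

def conv2d(img, kernel):
    # Valid 2-D cross-correlation: the core CNN operation.
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def classify(img, kernel, weight, bias):
    # One conv layer -> ReLU -> global average pooling -> sigmoid score.
    feat = np.maximum(conv2d(img, kernel), 0.0)          # ReLU
    pooled = feat.mean()                                 # global average pooling
    return 1.0 / (1.0 + np.exp(-(weight * pooled + bias)))  # sigmoid

rng = np.random.default_rng(0)
xray = rng.random((32, 32))            # stand-in for a normalized chest X-ray
kernel = rng.standard_normal((3, 3))   # in a real CNN these weights are learned
score = classify(xray, kernel, weight=1.0, bias=0.0)
```

A real classifier stacks many such layers and trains the kernels by backpropagation; the score would then be thresholded to call a radiograph COVID-19 or normal.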


Author(s): José Luis LÓPEZ-RAMÍREZ, Enrique CALDERÓN-SASTRE, Joel QUINTANILLA-DOMÍNGUEZ, José Gabriel AGUILERA-GONZÁLEZ

Cephalometric analysis is a study performed in orthodontics, based on identifying certain points in a skull image obtained through an X-ray or another medical imaging method. The identified points are compared with standard values to evaluate and diagnose the patient. The labeling of radiographs is usually performed by hand, which makes the process slow and prone to errors due to the visual acuity required. This approach is not very reproducible, because it relies on the domain knowledge and expertise of the labeler. Many machine learning methods have been successfully applied to medical imaging tasks, aiming to reduce health experts' workload, produce more accurate diagnoses in less time, and avoid more severe clinical cases. This work presents the design and development of a machine learning system based on convolutional neural networks that identifies 19 cephalometric landmarks from a lateral skull radiograph image given as input. The system used a dataset of 400 labeled images, of which 150 were used for training, 150 for model validation, and the remaining 100 for testing.
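The abstract does not describe how the 19 landmarks are scored. Cephalometric landmark systems are commonly evaluated with the mean radial error and a success detection rate within a 2 mm clinical tolerance (the convention used by public cephalometric benchmarks); a small sketch with made-up coordinates, assuming millimeter units:

```python
import numpy as np

def mean_radial_error(pred, truth):
    # Average Euclidean distance between predicted and true landmarks (mm).
    return np.linalg.norm(pred - truth, axis=1).mean()

def success_rate(pred, truth, tol_mm=2.0):
    # Fraction of the 19 landmarks falling within the clinical tolerance.
    return float((np.linalg.norm(pred - truth, axis=1) <= tol_mm).mean())

truth = np.zeros((19, 2))              # hypothetical ground-truth landmarks
pred = truth + np.array([1.5, 0.0])    # every prediction off by 1.5 mm in x
mre = mean_radial_error(pred, truth)   # 1.5 mm
sdr = success_rate(pred, truth)        # 1.0 (all landmarks within 2 mm)
```

With 100 held-out test images, these two numbers would be averaged over the test set to report the system's accuracy.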


2021, Vol 5 (1), pp. 1
Author(s): Merry Annisa Damayanti, Suhardjo Sitam, Bambang Hidayat, Ivhatry Rizky Octavia Putri Susilo

Objectives: This study assesses periapical radiograph images with various Android-based analysis methods to detect granulomas. Materials and Methods: The study uses a descriptive cross-sectional survey with a questionnaire distributed to 70 random respondents. The methods used in the Android application are BLOB (Binary Large Object), DCT and LDA (Discrete Cosine Transform and Linear Discriminant Analysis), DWT and PCA (Discrete Wavelet Transform and Principal Component Analysis), and the multiwavelet transform. The questionnaire assessed the accuracy, effectiveness, attractiveness, and innovativeness of the Android application. Results: The application with BLOB scored 62.5% for effectiveness and accuracy and 75% for attractiveness and innovativeness. With DCT and LDA, it scored 50% for effectiveness and accuracy, 70% for attractiveness, and 80% for innovativeness. With DWT and PCA, it scored 50% for effectiveness, 60% for accuracy, 66.66% for attractiveness, and 80% for innovativeness. With the multiwavelet transform, it scored 50% for effectiveness and accuracy, 55% for attractiveness, and 73% for innovativeness. Conclusion: Based on this assessment, all four methods are effective and applicable for granuloma detection in an Android-based application, which can detect granulomas with a success rate of approximately more than 70%. These methods make it easier for practitioners to interpret granuloma images.
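Of the four methods, DCT plus LDA is the easiest to sketch: a 2-D DCT concentrates image energy in the low-frequency corner, and that low-frequency block becomes the feature vector a classifier such as LDA then separates into granuloma versus healthy. The numpy-only illustration below builds the orthonormal DCT-II basis directly; the real application's block size and parameters are not given, so the 8x8 image and 4x4 feature block are assumptions.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix.
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] = np.sqrt(1.0 / n)
    return m

def dct2(img):
    # Separable 2-D DCT of a square image.
    d = dct_matrix(img.shape[0])
    return d @ img @ d.T

img = np.random.default_rng(1).random((8, 8))  # stand-in for an image patch
full = dct2(img)
# Low-frequency top-left block as the feature vector for the classifier.
features = full[:4, :4].ravel()
```

Because the basis is orthonormal, the transform preserves total energy while packing most of it into the retained coefficients, which is why a small block suffices as a feature vector.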


2021, Vol 2021, pp. 1-7
Author(s): Xibai Li, Yan Sun, Juyang Jiao, Haoyu Wu, Chunxi Yang, ...

The aim of the present study is to build a software implementation of a previous study and to diagnose discoid lateral menisci on knee joint radiograph images. A total of 160 images from normal individuals and patients diagnosed with discoid lateral menisci were included. Our software implementation has two parts: preprocessing and measurement. In the first phase, the whole radiograph image was analyzed to obtain basic information about the patient. Machine learning was used to segment the knee joint from the original radiograph image, and image enhancement and denoising tools were used to strengthen the image and remove noise. In the second phase, edge detection was used to quantify important features in the image, and a specific algorithm was designed to build a model of the knee joint and measure its parameters. Of the test images, 99.65% were segmented correctly; for 97.5% of the tested images, both segmentation and parameter measurement succeeded. There was no significant difference between manual and automatic measurements in the discoid (P = 0.28) or control groups (P = 0.15). The mean and standard deviation of the ratio of lateral joint space distance to the height of the lateral tibial spine were compared with the results of manual measurement. The software performed well on raw radiographs, showing a satisfying success rate and robustness. Thus, it is possible to diagnose discoid lateral menisci on radiographs with the help of radiograph-image-analyzing software (BM3D, etc.) and artificial intelligence-related tools (YOLOv3). The results of this study can help build a joint database containing patient data and thus can play a role in the diagnosis of discoid lateral menisci and other knee joint diseases in the future.
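The measurement code is not reproduced in the abstract. As an illustration of the edge-detection step that the parameter measurement relies on, here is a numpy-only Sobel sketch that locates a synthetic horizontal "bone edge"; the image, threshold-free row vote, and scale are all assumptions, not the authors' pipeline.

```python
import numpy as np

def conv2d(img, k):
    # Valid 2-D cross-correlation.
    kh, kw = k.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def sobel_magnitude(img):
    # Gradient magnitude from the two Sobel kernels.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    gx = conv2d(img, kx)
    gy = conv2d(img, kx.T)
    return np.hypot(gx, gy)

# Synthetic radiograph: a single horizontal intensity step ("bone edge").
img = np.zeros((20, 20))
img[10:, :] = 1.0
mag = sobel_magnitude(img)
edge_row = int(np.argmax(mag.sum(axis=1)))  # row with the strongest edge response
```

Distances such as the joint-space width or tibial-spine height would then be measured in pixels between detected edge rows, and their ratio compared against manual measurement.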


Author(s): BANYU BIRU, HILMAN FAUZI, FAHMI OSCANDAR

ABSTRACT: Forensic odontology is a branch of forensic science that carries out identification based on teeth. Teeth are among the strongest parts of the human body. During growth, human teeth degenerate at certain ages, so teeth can serve as a medium for age identification. In this study, an image processing system was designed to detect human age from dental panoramic radiograph images. The system uses the Binary Large Object (BLOB) and Decision Tree methods. Based on the test results, the system can detect age from first-molar images with an accuracy of more than 80%, when using a disk-shaped structuring element with a radius of 4 pixels, pulp area and pulp-ratio features, and a curvature-type decision tree algorithm with 50 branches. Keywords: panoramic radiograph image, tooth pulp, first molar, decision tree, binary large object
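The full BLOB-plus-decision-tree pipeline is not included in the abstract. As a toy of the decision-tree half, the one-level tree below learns a single threshold on a hypothetical pulp-to-tooth area ratio, which shrinks with age as secondary dentin forms; the ratios, labels, and resulting threshold are all illustrative, not the authors' data.

```python
import numpy as np

def best_stump(x, y):
    # One-level decision tree: pick the threshold on a single feature
    # that best separates the two age classes.
    best_t, best_acc = None, -1.0
    for t in np.unique(x):
        pred = (x <= t).astype(int)  # small pulp ratio -> older class (1)
        acc = max((pred == y).mean(), ((1 - pred) == y).mean())
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# Hypothetical pulp/tooth area ratios from BLOB segmentation,
# labelled 0 = younger, 1 = older.
ratios = np.array([0.32, 0.30, 0.28, 0.15, 0.12, 0.10])
labels = np.array([0, 0, 0, 1, 1, 1])
threshold, accuracy = best_stump(ratios, labels)
```

A full decision tree repeats this split recursively over several features (area, pulp ratio), which is how a 50-branch curvature-type tree would be grown.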


2020, Vol 127, pp. 104092
Author(s): Krzysztof Misztal, Agnieszka Pocha, Martyna Durak-Kozica, Michał Wątor, Aleksandra Kubica-Misztal, ...

Circulation, 2020, Vol 142 (Suppl_3)
Author(s): Vineet Raghu, Jakob Weiss, Udo Hoffmann, Hugo Aerts, Michael T Lu

Introduction: Chronological age is a well-known risk factor for cardiovascular disease, but measures of vascular age may enable more personalized care. We hypothesize that a convolutional neural network (CNN) can assess vascular age from a chest radiograph image. Methods: The CNN model, CXR-Age, was developed using data from over 100,000 individuals from publicly available cohorts and was validated in 1) a subset of the Prostate, Lung, Colorectal, and Ovarian Cancer screening trial's chest x-ray arm (PLCO, N = 40,967) and 2) the chest radiograph arm of the National Lung Screening Trial (NLST, N = 5,414). The primary outcome was 13-year cardiovascular mortality defined by ICD9 codes for ischemic heart disease, myocardial infarction, and stroke. Results are provided for independent testing datasets only. Results: After adjusting for sex, a 5-year increase in CXR-Age was a better predictor of cardiovascular mortality than a 5-year increase in chronological age in PLCO (CXR-Age aHR 2.69 per 5 years [95% CI 2.55-2.84] vs. chronological age aHR 1.84 per 5 years [95% CI 1.75-1.93], p < 0.001) and NLST (CXR-Age aHR 2.06 per 5 years [95% CI, 1.78-2.39] vs. chronological age aHR 1.64 per 5 years [95% CI, 1.44-1.86], p = 0.06). This association with cardiovascular mortality was robust to adjustment for baseline cardiovascular risk factors (chronological age, sex, diabetes, hypertension, smoking) in PLCO (CXR-Age aHR 1.58 per 5 years [95% CI, 1.54-1.63], p < 0.001) and NLST (CXR-Age aHR 1.48 per 5 years [95% CI, 1.36-1.61], p < 0.001). Kaplan-Meier curves (Figure 1) stratified by chronological age groups show that CXR-Age has a graded association with cardiovascular mortality in individuals with similar baseline chronological age. Conclusions: A CNN model, CXR-Age, can assess vascular age from a chest radiograph image, and CXR-Age predicts cardiovascular mortality better than chronological age.
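The hazard ratios above are all reported "per 5 years". Because Cox hazard ratios multiply on the log scale, a per-k-unit hazard ratio is just the per-unit ratio raised to the k-th power, so the per-year ratio implied by a reported per-5-year figure can be recovered directly. A quick sketch using the paper's PLCO figure (aHR 2.69 per 5 years):

```python
import math

def rescale_hr(hr_per_unit, k):
    # Cox model: HR over k units = exp(k * beta) = (HR per unit) ** k.
    return hr_per_unit ** k

# Implied per-year hazard ratio behind "aHR 2.69 per 5 years" (PLCO).
beta_per_year = math.log(2.69) / 5
hr_per_year = math.exp(beta_per_year)    # about 1.22 per year of CXR-Age
round_trip = rescale_hr(hr_per_year, 5)  # recovers the reported 2.69
```

The same rescaling applies to the confidence-interval bounds, since they are also exponentiated log-hazard quantities.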

