Integration of the Java Image Science Toolkit with E-Science Platform

2016 ◽  
Author(s):  
Stephen Damon ◽  
Sahil Panjwani ◽  
Shunxing Bao ◽  
Peter Kochunov ◽  
Bennett Landman

Medical image analyses rely on diverse software packages assembled into a “pipeline”. The Java Image Science Toolkit (JIST) has served as a standalone plugin to the Medical Image Processing, Analysis, and Visualization (MIPAV) application. We addressed shortcomings that previously prevented deeper integration of JIST with other E-science platforms. First, we developed an interface for integrating externally compiled packages (similar to the interfaces in NiPy) such that the application can become a “draggable module” in the module tree. This allows inputs and outputs to be connected to other JIST modules while maintaining external processing and monitoring. Second, we developed an integration interface with the Neuroimaging Informatics Tools and Resources Clearinghouse Cloud Environment (NITRC-CE). Users can launch and terminate pre-configured nodes to utilize the computational resources of the Amazon cloud. Finally, we define a new external data source, which can connect to the eXtensible Neuroimaging Archive Toolkit (XNAT) to query and retrieve remote data using XNAT’s REST API. Specifically, we define dataflow for files that can readily be converted into volumes and collections of volumes, to interface with any JIST module that expects volumetric image data as input. Users can now run their pipelines from a well-defined external data source and are no longer required to already have data on disk. With these upgrades we have extended JIST’s capabilities beyond compiled Java source code and enhanced its ability to interface seamlessly with E-science platforms.
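The XNAT-backed external data source described above boils down to issuing REST queries for a session's files. A minimal sketch of how such a query URL might be built follows; the host, session ID, and resource name are hypothetical placeholders, not values from the paper.

```python
# Sketch of building an XNAT REST query URL for one imaging session's files,
# as a JIST external data source might. Identifiers here are illustrative.
from urllib.parse import urlencode

def xnat_files_url(host, session_id, resource="NIFTI"):
    """Build the REST URL listing files of one resource for an imaging session."""
    query = urlencode({"format": "json"})
    return (f"{host}/data/experiments/{session_id}"
            f"/resources/{resource}/files?{query}")

url = xnat_files_url("https://central.xnat.org", "XNAT_E00001")
# An authenticated HTTP GET on this URL returns a JSON file listing whose
# entries can be downloaded and assembled into volumes for JIST modules.
```

A client would iterate over the returned listing, fetch each file, and convert it into the volumetric format a downstream JIST module expects.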

Author(s):  
Robert Wrembel

A data warehouse architecture (DWA) has been developed for the purpose of integrating data from multiple heterogeneous, distributed, and autonomous external data sources (EDSs), as well as for providing means for advanced analysis of the integrated data. The major components of this architecture include: an external data source (EDS) layer, an extraction-transformation-loading (ETL) layer, a data warehouse (DW) layer, and an on-line analytical processing (OLAP) layer. Methods of designing a DWA, research developments, and most of the commercially available DW technologies tacitly assumed that a DWA is static. In practice, however, a DWA requires changes, among other reasons, as a result of the evolution of EDSs, changes of the real world represented in a DW, and new user requirements. Changes in the structures of EDSs impact the ETL, DW, and OLAP layers. Since such changes are frequent, developing a technology for handling them automatically or semi-automatically in a DWA is of high practical importance. This chapter discusses challenges in designing, building, and managing a DWA that supports the evolution of EDS structures, the evolution of an ETL layer, and the evolution of a DW. The challenges and their solutions presented here are based on the experience of building a prototype Evolving-ETL and a prototype Multiversion Data Warehouse (MVDW). In detail, this chapter presents the following issues: the concept of the MVDW, an approach to querying the MVDW, an approach to handling the evolution of an ETL layer, a technique for sharing data between multiple DW versions, and two index structures for the MVDW.


2018 ◽  
Vol 7 (4.38) ◽  
pp. 908
Author(s):  
Siska P. Yudowati ◽  
Andry Alamsyah

An audit of a financial report is a review of an organization's financial statements carried out by an independent professional in the field, the auditor. Big Data methodology offers a different approach from current audit procedures, which mostly rely on manual processes. Big Data is equipped with learning capabilities and process automation in order to achieve better and faster results. Another advantage of the Big Data methodology is that it provides a comprehensive, multi-dimensional view of the problem. This paper provides a framework to integrate the auditing process with the Big Data approach, specifically by mapping internal and external data sources to one stage of the audit process, risk assessment.


2020 ◽  
Vol 15 (89) ◽  
pp. 124-136
Author(s):  
Emil A. Gumerov ◽  
Tamara V. Alekseeva
Oracle programs accept information from various sources, transform it, and transmit it to smart contracts. They can also accept data from a smart contract and transmit it to an external data source. Ensuring the security, validity, and integrity of the supplied data determines the success of a blockchain system; therefore, the research topic is relevant. The purpose of this article is to identify practically important features of oracle programs and to develop a version of an information system architecture for oracles that meets the necessary requirements. The authors set out to investigate the vulnerabilities associated with the use of oracle programs and to develop an optimal architectural solution. The research used methods of reviewing the scientific literature on the subject; collecting, structuring, and analyzing the information obtained; and methods of choosing solutions. As a result, the concept of an intelligent system for transferring external data to a blockchain management system is proposed and the optimal architecture of this intelligent system is developed. This solution is aimed at improving the security of using oracle programs in blockchain management systems, especially blockchain management systems for industrial Internet of Things applications. The solution can be used by developers of distributed ledger systems to launch and implement projects effectively.


2021 ◽  
Vol 14 ◽  
Author(s):  
Eric Nathan Carver ◽  
Zhenzhen Dai ◽  
Evan Liang ◽  
James Snyder ◽  
Ning Wen

Every year thousands of patients are diagnosed with a glioma, a type of malignant brain tumor. MRI plays an essential role in the diagnosis and treatment assessment of these patients. Neural networks show great potential to aid physicians in medical image analysis. This study investigated the creation of synthetic brain T1-weighted (T1), post-contrast T1-weighted (T1CE), T2-weighted (T2), and T2 Fluid Attenuated Inversion Recovery (Flair) MR images. These synthetic MR (synMR) images were assessed quantitatively with four metrics. The synMR images were also assessed qualitatively by a physician author, who noted that synMR realistically portrayed structural boundaries but struggled to accurately depict tumor heterogeneity. Additionally, this study investigated whether the synMR images created by a generative adversarial network (GAN) could overcome the lack of annotated medical image data when training U-Nets to segment the enhancing tumor, whole tumor, and tumor core regions of gliomas. Multiple two-dimensional (2D) U-Nets were trained with original BraTS data and differing subsets of the synMR images. The Dice similarity coefficient (DSC) was used as the loss function during training as well as a quantitative metric. Additionally, the 95th percentile Hausdorff distance (HD) was used to judge the quality of the contours created by these U-Nets. Model performance improved in both DSC and HD when incorporating synMR in the training set. In summary, this study showed the ability to generate high-quality Flair, T2, T1, and T1CE synMR images using a GAN. Using synMR images showed encouraging results for improving U-Net segmentation performance and shows potential to address the scarcity of annotated medical images.
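The DSC used above both as a training loss and as an evaluation metric has a compact form. A minimal sketch follows, using the common soft-Dice formulation on flattened masks (plain lists here for illustration); the smoothing term `eps` is an assumed implementation detail, not from the paper.

```python
# Minimal sketch of the soft Dice similarity coefficient (DSC) and the
# corresponding loss; pred and target are flattened probability / binary masks.
def dice_coefficient(pred, target, eps=1e-6):
    intersection = sum(p * t for p, t in zip(pred, target))
    # eps keeps the ratio defined when both masks are empty.
    return (2.0 * intersection + eps) / (sum(pred) + sum(target) + eps)

def dice_loss(pred, target):
    # Minimizing 1 - DSC drives the predicted mask toward the ground truth.
    return 1.0 - dice_coefficient(pred, target)
```

In a real training loop the same formula would be expressed in the framework's tensor operations so it remains differentiable; the scalar version here is sufficient as an evaluation metric.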


Secure transmission of medical image data is one of the major challenges in the health sector. The National Health Information Network has to protect the data in a confidential manner. Storage is also a basic concern, along with secure transmission. In this paper we propose an algorithm that provides confidentiality, authentication, and integrity for scrambled data before it is transmitted over the communication medium. Before transmission, the data is compressed while remaining encrypted. The research work is demonstrated with simulation results, which show that the proposed approach effectively maintains confidentiality, authentication, and integrity. The experimental results evaluate medical image quality using metrics such as PSNR, MSE, SC, and NAE.
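Two of the quality metrics cited above, MSE and PSNR, have standard definitions. A minimal sketch follows, treating images as flat lists of 8-bit pixel values for illustration; the paper's own implementation details are not specified.

```python
# Sketch of the standard MSE and PSNR image-quality metrics for 8-bit images,
# given as flat pixel lists here for simplicity.
import math

def mse(original, reconstructed):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio in decibels; higher means closer images."""
    err = mse(original, reconstructed)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / err)
```

A high PSNR after decryption and decompression indicates the scheme preserved image fidelity; identical images give zero MSE and an unbounded PSNR.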


Author(s):  
T. Hu ◽  
J. Fan ◽  
H. He ◽  
L. Qin ◽  
G. Li

To address the difficulty of using existing commercial Geographic Information System platforms to integrate multi-source image data, this research proposes the loading of multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial reference differences of the CesiumJS platform, as well as various tile data sources, such as Google Maps, Map World, and Bing Maps. Two types of tile data loading schemes have been designed for the mashup of tiles: the single-data-source loading scheme and the multi-data-source loading scheme. The multi-source digital map tiles used in this paper cover two different mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, the single-data-source loading scheme and the multi-data-source loading scheme with the same spatial coordinate system showed favorable visualization effects; however, the multi-data-source loading scheme was prone to tile image deformation when loading multi-source tile data with different spatial references. The resulting method provides a low-cost and highly flexible solution for small- and medium-scale GIS programs and has practical application potential. The problem of deformation during the transition between different spatial references is an important topic for further research.
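The deformation issue above stems from how differently the two spatial references index tiles. A minimal sketch of the Web Mercator ("slippy map") tile-indexing convention used by sources such as Google Maps and Bing Maps follows; WGS84-gridded sources like Map World index tiles on an equirectangular grid instead, so the same geographic point falls in differently shaped tiles. This is an illustrative formula, not code from the paper.

```python
# Sketch of Web Mercator tile indexing: map a WGS84 point to the (x, y)
# tile containing it at a given zoom level. WGS84-gridded tile schemes use
# a different mapping, which is the root of cross-reference deformation.
import math

def webmercator_tile(lat_deg, lon_deg, zoom):
    """Return (x, y) tile indices for a WGS84 point under Web Mercator tiling."""
    n = 2 ** zoom                               # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)      # longitude maps linearly
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)  # Mercator y
    return x, y
```

Because latitude enters through the Mercator projection while a WGS84 grid divides latitude linearly, overlaying tiles from the two schemes without reprojection stretches or compresses imagery away from the equator.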

