BioSCOOP – Biobank Sample Communication Protocol. New approach for the transfer of information between biobanks

Database ◽  
2019 ◽  
Vol 2019 ◽  
Author(s):  
J Jarczak ◽  
J Lach ◽  
P Borówka ◽  
M Gałka ◽  
M Bućko ◽  
...  

The dynamic development of the biobanking industry (in both business and science) has resulted in a growing number of IT systems for sample and data management. The most difficult and complicated case for the biobanking community has been cooperation between institutions equipped with different IT systems in the field of scientific research, mainly data interchange and information flow. Tools available on the market relate mainly to the biobank or collection level; efficient and universal protocols that include detailed information about both the donor and the sample are still very limited. Here, we have developed BioSCOOP, a communication protocol in the form of a well-documented JSON API. The main aim of this study was to harmonize and standardize the rules of communication between biobanks at the level of information about the donor together with information about the sample. The purpose was to create a communication protocol for two applications: transferring information between different biobanks, and searching and presenting sample and data sets.
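
Since the protocol is exposed as a JSON API, a donor-and-sample record travelling between two biobanks can be pictured as a single JSON document. The sketch below is purely illustrative; the field names are hypothetical and not taken from the BioSCOOP specification.

```python
import json

# Hypothetical BioSCOOP-style message; field names are illustrative only,
# not drawn from the published specification.
message = {
    "source_biobank": "Biobank-A",
    "donor": {
        "donor_id": "DON-000123",
        "sex": "F",
        "year_of_birth": 1975,
        "consent": {"scope": "research", "withdrawn": False},
    },
    "sample": {
        "sample_id": "SAMP-004567",
        "material_type": "whole blood",
        "collection_date": "2018-06-14",
        "storage_temperature_c": -80,
    },
}

# Sending side: serialize the record for transfer over the JSON API.
payload = json.dumps(message, indent=2)

# Receiving side: parse the payload back into native structures for searching
# and presenting the sample together with its donor information.
received = json.loads(payload)
print(received["sample"]["sample_id"], received["donor"]["donor_id"])
```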

1980 ◽  
Vol 19 (01) ◽  
pp. 37-41
Author(s):  
R. F. Woolson ◽  
M. T. Tsuang ◽  
L. R. Urban

We are now conducting a forty-year follow-up and family study of 200 schizophrenics, 325 manic-depressives and 160 surgical controls. This study began in 1973 and has continued to the present date. Numerous data-handling and data-management decisions were made in the course of collecting the data for the project. In this report, some of the practical difficulties in the data handling and computer management of such large and bulky data sets are enumerated.


2021 ◽  
pp. 000276422110216
Author(s):  
Kazimierz M. Slomczynski ◽  
Irina Tomescu-Dubrow ◽  
Ilona Wysmulek

This article proposes a new approach to analyzing protest participation measured in surveys of uneven quality. Because single international survey projects cover only a fraction of the world’s nations in specific periods, researchers increasingly turn to ex-post harmonization of different survey data sets that were not a priori designed to be comparable. However, very few scholars systematically examine the impact of survey data quality on substantive results. We argue that variation in the source data, especially deviations from the standards of survey documentation, data processing, and computer files proposed by methodologists of Total Survey Error, Survey Quality Monitoring, and Fitness for Intended Use, is important for analyzing protest behavior. In particular, we apply the Survey Data Recycling framework to investigate the extent to which indicators of attending demonstrations and signing petitions in 1,184 national survey projects are associated with measures of data quality, controlling for variability in the questionnaire items. We demonstrate that the null hypothesis of no impact of measures of survey quality on indicators of protest participation must be rejected. Measures of survey documentation, data processing, and computer records, taken together, explain over 5% of the intersurvey variance in the proportions of the populations attending demonstrations or signing petitions.
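
The headline figure, that the quality measures jointly explain over 5% of the intersurvey variance, corresponds to the R² of a survey-level regression of protest rates on the quality indicators. The sketch below, with simulated data and hypothetical variable names, only illustrates that kind of calculation; it is not the authors' analysis.

```python
import numpy as np

# Simulated survey-level data; all numbers are made up for illustration.
rng = np.random.default_rng(0)
n_surveys = 1184  # national survey projects, as in the article

# Three hypothetical quality indicators: documentation, processing, records.
quality = rng.normal(size=(n_surveys, 3))
# Hypothetical proportion attending demonstrations in each survey.
protest_rate = (0.12 + quality @ np.array([0.010, 0.008, 0.005])
                + rng.normal(scale=0.05, size=n_surveys))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n_surveys), quality])
beta, *_ = np.linalg.lstsq(X, protest_rate, rcond=None)
fitted = X @ beta

# Share of intersurvey variance explained by the quality measures (R^2).
r_squared = 1 - np.var(protest_rate - fitted) / np.var(protest_rate)
print(f"R^2 explained by the quality indicators: {r_squared:.3f}")
```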


Radiocarbon ◽  
2013 ◽  
Vol 55 (2) ◽  
pp. 720-730 ◽  
Author(s):  
Christopher Bronk Ramsey ◽  
Sharen Lee

OxCal is a widely used software package for the calibration of radiocarbon dates and the statistical analysis of 14C and other chronological information. The program aims to make statistical methods easily available to researchers and students working in a range of different disciplines. This paper will look at the recent and planned developments of the package. The recent additions to the statistical methods are primarily aimed at providing more robust models, in particular through model averaging for deposition models and through different multiphase models. The paper will look at how these new models have been implemented and explore the implications for researchers who might benefit from their use. In addition, a new approach to the evaluation of marine reservoir offsets will be presented. As the quantity and complexity of chronological data increase, it is also important to have efficient methods for visualizing such extensive data sets; the methods for presenting spatial and geographical data planned for future versions of OxCal will also be discussed.
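
At its core, calibration combines a conventional 14C determination with a calibration curve to yield a probability distribution over calendar ages. The following sketch uses a toy, made-up calibration curve to illustrate that calculation in outline; it does not reproduce OxCal's implementation or its Bayesian deposition models.

```python
import numpy as np

# Minimal sketch of radiocarbon calibration (not OxCal's implementation):
# combine a 14C determination with a calibration curve to obtain a
# probability distribution over calendar ages.
def calibrate(c14_age, c14_err, cal_ages, curve_c14, curve_err):
    """Return a normalized probability for each candidate calendar age."""
    sigma2 = c14_err**2 + curve_err**2
    logp = -0.5 * (c14_age - curve_c14) ** 2 / sigma2 - 0.5 * np.log(sigma2)
    p = np.exp(logp - logp.max())
    return p / np.trapz(p, cal_ages)

# Hypothetical calibration-curve segment (calendar age BP vs. 14C age BP);
# a real application would use an IntCal-style curve instead.
cal_ages = np.arange(2800, 3401)
curve_c14 = 0.95 * cal_ages + 80.0                     # toy curve, illustration only
curve_err = np.full_like(cal_ages, 25.0, dtype=float)  # toy curve uncertainty

posterior = calibrate(3000, 30, cal_ages, curve_c14, curve_err)
mode = cal_ages[np.argmax(posterior)]
print(f"Most probable calendar age: {mode} cal BP")
```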


Geophysics ◽  
2011 ◽  
Vol 76 (4) ◽  
pp. F239-F250 ◽  
Author(s):  
Fernando A. Monteiro Santos ◽  
Hesham M. El-Kaliouby

Joint or sequential inversion of direct current resistivity (DCR) and time-domain electromagnetic (TDEM) data is commonly performed for individual soundings assuming layered earth models. DCR and TDEM have different and complementary sensitivities to resistive and conductive structures, making them suitable methods for the application of joint inversion techniques. Joint inversion of DCR and TDEM data has been used by several authors to reduce the ambiguity of the models calculated from each method separately. A new approach for the joint inversion of these data sets, based on a laterally constrained algorithm, is presented. The method was developed for the interpretation of soundings collected along a line over a 1D or 2D geology. The inversion algorithm was tested on two synthetic data sets, as well as on field data from Saudi Arabia. The results show that the algorithm is efficient and stable in producing quasi-2D models from DCR and TDEM data acquired in relatively complex environments.
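
Laterally constrained joint inversion of this kind is usually posed as the minimization of a combined objective that ties neighbouring 1D models together. A schematic form, in our notation rather than the paper's, is

$$\Phi(\mathbf{m}) = \left\lVert \mathbf{W}_{\mathrm{DC}}\bigl(\mathbf{d}_{\mathrm{DC}} - f_{\mathrm{DC}}(\mathbf{m})\bigr)\right\rVert^{2} + \left\lVert \mathbf{W}_{\mathrm{TEM}}\bigl(\mathbf{d}_{\mathrm{TEM}} - f_{\mathrm{TEM}}(\mathbf{m})\bigr)\right\rVert^{2} + \lambda\,\left\lVert \mathbf{R}\,\mathbf{m}\right\rVert^{2},$$

where \(\mathbf{m}\) stacks the layer resistivities and thicknesses of the 1D models at all soundings, \(f_{\mathrm{DC}}\) and \(f_{\mathrm{TEM}}\) are the forward operators, \(\mathbf{W}_{\mathrm{DC}}\) and \(\mathbf{W}_{\mathrm{TEM}}\) weight the data by their uncertainties, \(\mathbf{R}\) penalizes differences between laterally adjacent models, and \(\lambda\) controls how strongly the lateral constraints are enforced.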


PLoS ONE ◽  
2018 ◽  
Vol 13 (11) ◽  
pp. e0206977 ◽  
Author(s):  
Benjamin L. Walker ◽  
Katherine A. Newhall

2014 ◽  
Vol 1049-1050 ◽  
pp. 2045-2048
Author(s):  
Shi Feng Wu ◽  
Luo Zhong ◽  
Man Li Hu ◽  
Fan Zhou ◽  
Hua Zhu Song

The Supervision System is one of the important components of Construction Engineering Information Management. This paper proposes a design for the system using the Struts2 and MVC frameworks, based on the Android mobile platform. It first gives the E-R model and the data design of the system. It then gives the overall architecture, in which the Android client uses the MVC framework while the server side uses the Struts2 framework, which is responsible for the separation of MVC concerns and the forwarding of business-layer actions. For the communication protocol between the client and the server, the system adopts HTTP and Socket services and uses JSON as the data interchange format. Finally, the UI design of each module in the system is given.
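
Because the client and server exchange JSON over the HTTP and Socket channels, a supervision record can be pictured as a small JSON document. The following sketch is illustrative only; the field names are hypothetical and not taken from the paper.

```python
import json

# Hypothetical supervision-report message exchanged between the Android
# client and the Struts2 server; the field names are made up for illustration.
report = {
    "project_id": "CE-2014-018",
    "inspector": "Zhang Wei",
    "timestamp": "2014-08-21T09:30:00",
    "item": "concrete pouring",
    "result": "pass",
    "remarks": "curing time within specification",
}

# Client side: serialize to the JSON interchange format before sending the
# request body over the HTTP (or Socket) channel.
body = json.dumps(report)

# Server side: parse the JSON body back into an object for the business layer.
parsed = json.loads(body)
print(parsed["item"], parsed["result"])
```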


Web Mining ◽  
2011 ◽  
pp. 253-275
Author(s):  
Xiaodi Huang ◽  
Wei Lai

This chapter presents a new approach to clustering graphs and applies it to Web graph display and navigation. The proposed approach takes advantage of the linkage patterns of graphs and utilizes an affinity function in conjunction with k-nearest neighbors. The chapter uses Web graph clustering as an illustrative example and offers a potentially more widely applicable method for mining structural information from data sets, with the hope of informing readers of another aspect of data mining and its applications.
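
A minimal sketch of this flavour of clustering, assuming a Jaccard-style affinity over linkage patterns and a mutual k-nearest-neighbour rule (our choices for illustration, not necessarily the chapter's exact algorithm):

```python
# Illustrative sketch of affinity-based graph clustering with k-nearest
# neighbours; the affinity between two pages is the Jaccard overlap of
# their link neighbourhoods, a simple "linkage pattern" measure.
def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def knn_clusters(adj, k=2):
    """adj maps each node to the set of nodes it links to."""
    nodes = list(adj)
    # Keep, for every node, only its k most affine neighbours.
    kept = {u: set() for u in nodes}
    for u in nodes:
        scored = sorted(
            ((jaccard(adj[u], adj[v]), v) for v in nodes if v != u),
            reverse=True,
        )
        kept[u].update(v for _, v in scored[:k])
    # Nodes that keep each other are linked; connected components are clusters.
    clusters, seen = [], set()
    for u in nodes:
        if u in seen:
            continue
        stack, comp = [u], set()
        while stack:
            x = stack.pop()
            if x in comp:
                continue
            comp.add(x)
            stack.extend(v for v in kept[x] if x in kept[v])
        clusters.append(comp)
        seen |= comp
    return clusters

# Tiny hypothetical Web graph: node -> set of linked nodes.
web = {
    "A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B"},
    "D": {"E"}, "E": {"D"},
}
print(knn_clusters(web, k=2))
```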


2020 ◽  
pp. 751-785
Author(s):  
Preeti Mulay ◽  
Krishnal Patel ◽  
Hecto Gomez Gauchia

Evolving technologies are intricately woven into the fabric of social and institutional systems. With the advent of the “Internet of Everything” (IoE) concept, it is now realistic to employ animals and/or humans to transmit details electronically. IoE concepts combined with sensor technology can work wonders in virtually any domain, from eFarming to eHealth and eCare. Humans can transform electronics by using various eConnected gadgets, often motivated by or based on “Nature-Inspired Algorithms”. The confluence of IT and psychology with non-IT systems will be part of the next generation’s life. Such collaborative concepts can be implemented practically with the help of “Cloud-to-Dew-Computing”-based technologies. Bringing so many concepts together also requires attention to cyber security and to the risks associated with such an implementation; Dew Computing at the root level can take care of cyber security effectively. As the backend support of a distributed system, Dew Computing can process multiple entities resourcefully. “Animal Data Interchange Standards” are widely regarded as an innovative business opportunity, now and for years to come; work on these standards has started with dairy-related animal standards. Every dairy animal should enjoy life to remain healthy and more productive. Incremental learning about animal life data and animal identification, behavior, seasonal changes, health, and so on can be readily achieved with IoE.


Author(s):  
Afrand Agah ◽  
Mehran Asadi

This article introduces a new method to discover the role of influential people in online social networks and presents an algorithm that recognizes influential users who can reach a target in the network, in order to provide a strategic advantage for organizations directing the scope of their digital marketing strategies. Social links among friends play an important role in dictating their behavior in online social networks; these social links determine the flow of information in the form of wall posts via shares, likes, re-tweets, mentions, etc., which in turn determines the influence of a node. The article first identifies the correlated nodes in large data sets using a customized divide-and-conquer algorithm and then measures the influence of each of these nodes using a linear function. Furthermore, the empirical results show that the users with the highest influence are those whose number of friends is closest to the average number of friends per node, i.e., the total number of friendship links across all nodes divided by the total number of nodes in the network.
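
The closing observation amounts to saying that predicted influence peaks for users whose friend count sits nearest the network's average degree. A tiny, hypothetical illustration of that ranking (not the authors' divide-and-conquer algorithm or linear influence function):

```python
# Illustrative check of the stated finding: rank users by how close their
# friend count is to the network-wide average. The counts are made up.
friends = {
    "ana": 120, "ben": 45, "cho": 310, "dee": 98, "eli": 15,
}
avg = sum(friends.values()) / len(friends)

# Smaller distance to the average degree => higher predicted influence.
ranked = sorted(friends, key=lambda u: abs(friends[u] - avg))
print(f"average friends per user: {avg:.1f}")
print("predicted influence ranking:", ranked)
```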


2018 ◽  
Vol 52 (3) ◽  
pp. 28-32 ◽  
Author(s):  
Chris Turner ◽  
Ian Gill

The management of oceanographic data is particularly challenging given the variety of protocols for the analysis of data collection and model output, the vast range of environmental conditions studied, and the potentially enormous extent and volume of the resulting data sets and model results. Here, we describe the Research Workspace (the Workspace), a web platform designed around data management best practices to meet the challenges of managing oceanographic data throughout the research life cycle. The Workspace features secure user accounts and automatic file versioning to assist with the early stages of project planning and data collection. Jupyter Notebooks have been integrated into the Workspace to support reproducible numerical analysis and data visualization while making use of high-performance computer resources collocated with data assets. An ISO-compliant metadata editor has also been integrated into the Workspace to support data synthesis, publication, and reuse. The Workspace currently supports stakeholders across the ocean science community, from funding agencies to individual investigators, by providing a data management platform to meet the needs of big ocean data.
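
As a hypothetical example of the kind of notebook-based, reproducible analysis the Workspace is designed to host next to its data assets (the profile values below are invented for illustration):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical notebook cell: a small, reproducible analysis of a CTD-style
# temperature profile; the numbers are made up for illustration.
profile = pd.DataFrame({
    "depth_m": [0, 10, 20, 50, 100, 200],
    "temperature_c": [14.2, 13.8, 12.9, 10.4, 8.1, 6.5],
})

print(profile.describe())              # quick, reproducible summary statistics

profile.plot(x="temperature_c", y="depth_m", legend=False)
plt.gca().invert_yaxis()               # depth increases downward
plt.xlabel("Temperature (°C)")
plt.ylabel("Depth (m)")
plt.title("Hypothetical CTD profile")
plt.show()
```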

