A Distributed Data Replication Protocol for File Versioning with Optimal Node Assignments

Author(s):  
Takahiko Ikeda ◽  
Mamoru Ohara ◽  
Satoshi Fukumoto ◽  
Masayuki Arai ◽  
Kazuhiko Iwasaki
2006 ◽  
Vol 3 (2) ◽  
pp. 274-281 ◽  
Author(s):  
Andreas Kaps ◽  
Konstantin Dyshlevoi ◽  
Klaus Heumann ◽  
Ralf Jost ◽  
Ioannis Kontodinas ◽  
...  

Summary. Modern academic and industrial research in the life sciences generates huge amounts of data and information. To extract knowledge from this information space, optimized integration and retrieval software tools are essential. In recent years, a number of academic and commercial systems have been developed to address this problem. However, as scientific projects are distributed across different locations (e.g., subsidiaries of companies, academic partnerships), data exchange and availability must be realized in a way that avoids data replication. In this article, we describe a global solution for integrating distributed information by applying the BioRS Integration and Retrieval System and its inter-BioRS communication capability, which goes beyond the standard task of local data integration. Each site integrates and maintains locally generated data using a local copy of the BioRS software. Through inter-BioRS communication, all available BioRS instances can communicate with each other, realizing a global network of integrated databanks. All databanks integrated in this network can be accessed from any site without any data replication. This open system allows new information and sites to be added dynamically. Access privileges for individual databanks can be maintained on a per-user and per-databank level, ensuring data security when required.
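The federated pattern the abstract describes (queries answered by each site from its own data, with per-user access control and no replication) can be sketched as follows. This is a minimal illustration, not the real BioRS API; the `Site`, `query`, and `federated_query` names and the access-control scheme are assumptions.

```python
# Hypothetical sketch of federated retrieval across sites: each site keeps
# its own databanks and answers queries locally; no records are copied to
# other sites. Names and structures here are illustrative, not BioRS's API.

class Site:
    def __init__(self, name, databanks, acl=None):
        self.name = name
        self.databanks = databanks          # databank name -> list of records
        self.acl = acl or {}                # databank name -> set of allowed users

    def query(self, databank, term, user):
        """Answer a query from local data only; enforce per-user access."""
        allowed = self.acl.get(databank)
        if allowed is not None and user not in allowed:
            return []                       # access denied: contribute nothing
        records = self.databanks.get(databank, [])
        return [r for r in records if term in r]

def federated_query(sites, databank, term, user):
    """Fan the query out to every site and merge; data stays at its source."""
    results = []
    for site in sites:
        results.extend(site.query(databank, term, user))
    return results

site_a = Site("lab-A", {"proteins": ["kinase alpha", "phosphatase"]})
site_b = Site("lab-B", {"proteins": ["kinase beta"]},
              acl={"proteins": {"alice"}})

# alice may read lab-B's restricted databank; bob may not, so bob's merged
# result contains only lab-A's matching record.
print(federated_query([site_a, site_b], "proteins", "kinase", "alice"))
print(federated_query([site_a, site_b], "proteins", "kinase", "bob"))
```

The key property matches the abstract: removing a site removes its data from the network, and access restrictions are enforced where the data lives.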


Sensors ◽  
2018 ◽  
Vol 18 (2) ◽  
pp. 547 ◽  
Author(s):  
Junyu Zhu ◽  
Chuanhe Huang ◽  
Xiying Fan ◽  
Sipei Guo ◽  
Bin Fu

Author(s):  
Umesh Banodha ◽  
Praveen Kumar Kataria

The cloud is an emerging technology for storing the necessary data, and electronic data is now produced in gigantic quantities; to maintain the efficacy of this data, data recovery services are essential. Cloud computing is anticipated as the vital foundation of the IT enterprise, offering a solution for moving databases and application software to big data centers, where the management of data and services is not completely reliable. Our focus is on cloud data storage security, a vital feature when it comes to providing quality service. It should also be noted that the cloud comprises a highly dynamic and heterogeneous environment, and because of the large scale of physical data and resources, the failure of data-center nodes is entirely normal. Therefore, the cloud environment needs effective adaptive management of data replication to handle this indispensable characteristic. Disaster recovery using cloud resources is an attractive approach; the proposed data replication strategy carefully selects the data files for replication and dynamically determines the number of replicas and the effective data nodes on which to place them. Thus, the objective of the proposed algorithm is twofold: first, to help users gather information from a remote location where network connectivity is absent; and second, to recover files that have been deleted or corrupted for any reason. Time-related problems are also addressed, so the recovery process executes in less time.
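The abstract's adaptive replication idea (pick how many replicas a file needs, then pick the nodes to hold them) can be sketched as below. The availability formula, the access-rate threshold, and the capacity-based node ranking are all assumptions for illustration; the paper's actual strategy is not specified here.

```python
import math

# Illustrative sketch of an adaptive replication decision: frequently
# accessed files get an extra copy, and replicas go to the nodes with the
# most free capacity. Thresholds and the formula are assumptions.

def replica_count(access_rate, node_failure_prob, target_availability=0.999):
    """Smallest k such that at least one of k independent replicas survives
    with probability >= target_availability, i.e. 1 - p^k >= target."""
    k = math.ceil(math.log(1 - target_availability) / math.log(node_failure_prob))
    if access_rate > 100:     # hot file: keep one extra copy near readers
        k += 1
    return max(k, 2)          # never keep fewer than two copies

def pick_nodes(nodes, k):
    """Place replicas on the k nodes with the most free capacity."""
    return sorted(nodes, key=lambda n: n["free_gb"], reverse=True)[:k]

nodes = [{"id": "n1", "free_gb": 120}, {"id": "n2", "free_gb": 40},
         {"id": "n3", "free_gb": 300}, {"id": "n4", "free_gb": 80}]

k = replica_count(access_rate=150, node_failure_prob=0.05)
print(k, [n["id"] for n in pick_nodes(nodes, k)])
```

With a 5% per-node failure probability, three replicas already exceed 99.9% availability; the hot-file bump raises the count to four.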


2005 ◽  
Vol 4 (2) ◽  
pp. 393-400
Author(s):  
Pallavali Radha ◽  
G. Sireesha

A data distributor's job is to give sensitive data to a set of presumably trusted third-party agents. Due to data leakage, the data sent to these third parties may turn up in unauthorized places, such as the web or someone's personal systems. The distributor must be able to determine that the data was leaked by one or more agents, as opposed to having been independently gathered by other means. Our proposed data allocation strategies improve the probability of identifying leakages. Security attacks typically result from unintended behaviors or invalid inputs, and because real-world programs face too many invalid inputs, security testing is labor intensive; it is therefore desirable to automate or partially automate the security-testing process. In this paper we present a Predicate/Transition nets approach for automated generation of security tests, using formal threat models to detect guilty agents through allocation strategies, without modifying the original data. The guilty agent is the one who leaks the distributed data. To detect guilty agents more effectively, the idea is to distribute the data intelligently to agents based on sample data requests and explicit data requests. Fake-object implementation algorithms further improve the distributor's chance of detecting guilty agents.
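The fake-object idea described above can be illustrated with a toy sketch: each agent receives the requested real records plus a unique fabricated record, and if that fabricated record later appears in a leak, it points to the guilty agent. The record format and the marking scheme below are assumptions for illustration, not the paper's algorithm.

```python
# Toy sketch of guilty-agent detection via fake objects: every agent's
# allocation is watermarked with one record that no other agent receives.
# The "FAKE-" naming is purely illustrative; real fake objects would be
# indistinguishable from genuine records.

def distribute(real_records, agents):
    """Give every agent the real data plus one agent-specific fake object."""
    allocations = {}
    for i, agent in enumerate(agents):
        fake = f"FAKE-{agent}-{i}"          # unique per agent
        allocations[agent] = real_records + [fake]
    return allocations

def identify_guilty(leaked, allocations):
    """Return agents whose private fake object shows up in the leaked set."""
    guilty = []
    for agent, records in allocations.items():
        fakes = [r for r in records if r.startswith("FAKE-")]
        if any(f in leaked for f in fakes):
            guilty.append(agent)
    return guilty

allocations = distribute(["rec1", "rec2"], ["agentA", "agentB"])
leaked = {"rec1", "FAKE-agentB-1"}
print(identify_guilty(leaked, allocations))   # agentB's fake object leaked
```

Because a fake object could not have been "independently gathered by other means," its presence in a leak is strong evidence against exactly one agent, which is what raises the detection probability.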


Author(s):  
D. V. Gribanov

Introduction. This article is devoted to the legal regulation of digital asset turnover and the possibilities of using distributed computing and distributed data storage systems in the activities of public authorities and entities of public control. The author notes that some national and foreign scientists who study "blockchain" technology (distributed computing and distributed data storage systems) emphasize its usefulness in various activities. The data validation procedure for digital transactions and the legal regulation of the creation, issuance, and turnover of digital assets need further attention.

Materials and methods. The research is based on general scientific methods (analysis, analogy, comparison) and particular methods of cognition of legal phenomena and processes (interpretation of legal rules, and the technical legal, formal legal, and formal logical methods).

Results of the study. The author's analysis identified several advantages of using "blockchain" technology in the sphere of public control: a particular validation system; the fact that data, once entered into the distributed data storage system, cannot be erased or forged; absolute transparency of the sequence of actions taken while exercising governing powers; and automatic repetition of recurring actions. The need for fivefold validation of the exercise of governing powers is substantiated: fivefold validation shall ensure comprehensive control over the exercise of powers by civil society, the entities of public control, and the Russian Federation as a federal state holding sovereignty over its territory. The author also provides a brief analysis of judicial decisions concerning digital transactions.

Discussion and conclusion. The use of a distributed data storage system makes control easier to exercise, owing to the reduced risk of forgery, replacement, or deletion of data. The author suggests defining a digital transaction not only as actions involving digital assets, but also as actions that modify or add information about legal facts, with the purpose of establishing them in distributed data storage systems. The author suggests using distributed data storage systems for independent validation of information about the activities of state authorities. In the author's opinion, applying "blockchain" technology may result not only in more efficient public control, but also in a new form of public control: automatic control. It is concluded that today there is no legislative basis for regulating legal relations concerning distributed data storage.
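The tamper-evidence property the abstract relies on (data, once entered, cannot be erased or forged without detection) rests on hash chaining. A minimal sketch of that mechanism, assuming SHA-256 linking and ignoring consensus, signatures, and every other component of a real blockchain:

```python
import hashlib

# Minimal hash-chain sketch of tamper evidence: each entry stores the hash
# of its predecessor, so altering any past record invalidates every later
# link. This illustrates one property only, not a full blockchain.

def add_entry(chain, data):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry_hash = hashlib.sha256((prev_hash + data).encode()).hexdigest()
    chain.append({"data": data, "prev": prev_hash, "hash": entry_hash})

def verify(chain):
    """Recompute every link; return False if any record was altered."""
    prev_hash = "0" * 64
    for entry in chain:
        expected = hashlib.sha256((prev_hash + entry["data"]).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
add_entry(chain, "transaction: transfer asset X")
add_entry(chain, "transaction: register control act Y")
print(verify(chain))          # True: chain intact
chain[0]["data"] = "forged"   # tamper with an earlier record
print(verify(chain))          # False: the forgery is detected
```

This is why the article can speak of transparency and non-forgeability: any replacement or deletion of a past record is detectable by anyone who revalidates the chain.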

