Latency-Sensitive Data Allocation and Workload Consolidation for Cloud Storage

IEEE Access ◽  
2018 ◽  
Vol 6 ◽  
pp. 76098-76110 ◽  
Author(s):  
Song Yang ◽  
Philipp Wieder ◽  
Muzzamil Aziz ◽  
Ramin Yahyapour ◽  
Xiaoming Fu ◽  

2005 ◽  
Vol 4 (2) ◽  
pp. 393-400
Author(s):  
Pallavali Radha ◽  
G. Sireesha

A data distributor's task is to give sensitive data to a set of presumably trusted third-party agents. Data leakage occurs when data sent to these third parties turns up in unauthorized places, such as the web or someone's personal system. The distributor must be able to determine that the data was leaked by one or more agents, as opposed to having been independently gathered by other means. Our proposed data allocation strategies improve the probability of identifying such leakages. Security attacks typically result from unintended behaviors or invalid inputs, and because real-world programs face too many invalid inputs, security testing is labor-intensive; it is therefore desirable to automate or partially automate the security-testing process. In this paper we present a Predicate/Transition net approach for automated generation of security tests, using formal threat models, to detect guilty agents through allocation strategies without modifying the original data. The guilty agent is the one who leaks the distributed data. To detect guilty agents more effectively, the idea is to distribute the data intelligently to agents based on sample and explicit data requests. Fake-object injection algorithms further improve the distributor's chance of detecting guilty agents.
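As a rough illustration of the fake-object idea described above, the sketch below (all names hypothetical; the abstract does not specify the actual allocation algorithm) gives each agent one unique, traceable fake record alongside the real data, so any fake record found in a leak points back to its recipient:

```python
import secrets

def allocate_with_fakes(records, agents):
    """Give every agent the requested records plus one unique
    fake record (hypothetical fake-object scheme)."""
    allocations, fake_index = {}, {}
    for agent in agents:
        fake = f"FAKE-{secrets.token_hex(4)}"   # unique per agent
        allocations[agent] = list(records) + [fake]
        fake_index[fake] = agent
    return allocations, fake_index

def guilty_agents(leaked, fake_index):
    """Any fake record appearing in the leaked set identifies
    the agent that received it."""
    return {fake_index[r] for r in leaked if r in fake_index}
```

Real data is never modified; only extra records are added, which matches the paper's stated goal of detection without altering the original data.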


Author(s):  
Jiangjiang Wu ◽  
Cong Liu ◽  
Jun Ma ◽  
Yong Cheng ◽  
Jiangchun Ren ◽  
...  

Author(s):  
Peiyi Han ◽  
Chuanyi Liu ◽  
Yingfei Dong ◽  
Hezhong Pan ◽  
QiYang Song ◽  
...  

Healthcare is today among the most data-intensive industries. Advances in technology have revolutionized traditional healthcare practices and led to enhanced e-healthcare systems. Modern healthcare systems generate voluminous amounts of digital health data, which are shared between patients and among groups of physicians and medical technicians for processing. Due to the demand for continuous availability and the scale of these data, they are mostly outsourced to cloud storage. In such cloud-based computing, sensitive patient data are stored on a third-party server where data analytics are performed, which raises serious security concerns. This paper proposes a secure analytics system that preserves the privacy of patients' data. In this system, before outsourcing, the data are encrypted using Paillier homomorphic encryption, which allows computations to be performed over the encrypted dataset. A decision-tree machine-learning algorithm is then applied over this encrypted dataset to build the classifier model. The encrypted model is outsourced to a cloud server, and predictions about a patient's health status are displayed to the user on request. Nowhere in this process is the data decrypted, which ensures the privacy of patients' sensitive data.
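The property the system relies on is Paillier's additive homomorphism: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. A toy implementation illustrates this (tiny hard-coded primes, purely for demonstration and in no way secure):

```python
import math
import secrets

# Toy Paillier keypair with tiny primes -- illustration only, NOT secure.
p, q = 47, 59
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)          # valid because we use g = n + 1

def encrypt(m):
    """Encrypt plaintext m < n as c = g^m * r^n mod n^2, g = n + 1."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def add_encrypted(c1, c2):
    """Additive homomorphism: the product of ciphertexts
    decrypts to the sum of the plaintexts."""
    return (c1 * c2) % n2
```

This is how the cloud can aggregate encrypted patient values without ever decrypting them; the abstract's decision-tree training over ciphertexts builds on operations of this kind.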


This is the era of business and marketing, and thanks to cloud computing even those without proper equipment can participate in this field. Many organizations remain ambivalent about whether to adopt cloud computing; the biggest barrier before them is the security of their sensitive data. This research paper is an attempt to encourage organizations and individuals who are still hesitant to adopt this fruitful, cheap, and effective service. We propose an integrity-testing algorithm that checks whole documents at character level, both before uploading to cloud storage and after downloading from it. The idea of the algorithm resembles classical error detection and correction; the future-scope section clarifies the idea further.
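A minimal sketch of such an integrity test, assuming a block-wise digest scheme (the paper's exact character-level algorithm is not reproduced here): the document is fingerprinted before upload and re-checked after download, and any mismatch is localised to a block, in the spirit of error detection:

```python
import hashlib

def fingerprint(text, block_size=64):
    """Whole-document digest plus per-block digests, so a later
    mismatch can be localised (error-detection style)."""
    blocks = [text[i:i + block_size] for i in range(0, len(text), block_size)]
    return {
        "whole": hashlib.sha256(text.encode()).hexdigest(),
        "blocks": [hashlib.sha256(b.encode()).hexdigest() for b in blocks],
    }

def verify(text, fp, block_size=64):
    """Return [] if the document is intact, else the indices of
    the blocks that no longer match the stored fingerprint."""
    current = fingerprint(text, block_size)
    if current["whole"] == fp["whole"]:
        return []
    return [i for i, (a, b) in
            enumerate(zip(current["blocks"], fp["blocks"])) if a != b]
```

The fingerprint is computed before upload and kept locally; after download, `verify` detects corruption and narrows it to a block, though unlike true error-correcting codes this sketch only detects, it does not repair.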


Author(s):  
Girish Shankar ◽  
Someshwar Dhayalan ◽  
Ashish Anand

Leakage of confidential data to an unauthorized agent is a major concern for an organization. In this article we seek to detect the trusted node that leaks confidential data to an unauthorized agent. Traditionally, data leakage is handled by a watermarking technique, which requires modifying the data: if the watermarked copy is found at an unauthorized site, the distributor can claim ownership. Data modification, however, is one of the drawbacks of watermarking. To overcome it, data allocation strategies are used to improve the probability of identifying guilty third parties. The idea is to distribute the data intelligently to agents, based on sample and explicit data requests, in order to improve the chance of detecting the guilty agents. Modern business activities also rely on extensive e-mail exchange, and e-mail leakages have become widespread; the severe damage caused by such leakages constitutes a disturbing problem for organizations. Hence, filtering of e-mails is also necessary. This can be done by blocking e-mails that contain images, videos, or sensitive data, and by filtering the organization's text files.
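An illustrative e-mail filtering policy of the kind described, blocking image/video attachments and bodies containing sensitive keywords (the keyword list and extensions are hypothetical, not taken from the article):

```python
import re

# Hypothetical policy: sensitive keywords and blocked media types.
SENSITIVE = re.compile(r"\b(confidential|ssn|password)\b", re.IGNORECASE)
BLOCKED_ATTACHMENTS = (".jpg", ".png", ".mp4", ".avi")

def should_block(body, attachments):
    """Block mail that carries image/video attachments or whose
    body matches a sensitive-keyword pattern."""
    if any(name.lower().endswith(BLOCKED_ATTACHMENTS) for name in attachments):
        return True
    return bool(SENSITIVE.search(body))
```

A production filter would of course use richer content inspection (e.g. document fingerprints rather than keywords), but the gatekeeping structure is the same.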


2019 ◽  
Vol 9 (4) ◽  
pp. 1-20
Author(s):  
Syam Kumar Pasupuleti

Cloud storage allows users to store their data in the cloud to avoid local storage and management costs. Since the cloud is untrusted, the integrity of data stored in the cloud has become an issue. To address this problem, several public auditing schemes have been designed to verify the integrity of data in the cloud. However, these schemes have two drawbacks: public auditing may reveal sensitive data to the verifier, and they do not address the data-recovery problem efficiently. This article proposes a new privacy-preserving public auditing scheme with data dynamics to secure data in the cloud, based on an exact regenerating code. The scheme encodes the data for availability, then masks the encoded blocks with randomness for data privacy, and enables a public auditor to verify the integrity of the data. Further, the scheme also supports dynamic data updates. In addition, security and performance analysis shows that the proposed scheme is provably secure and efficient.
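A highly simplified sketch of the masking idea only, not the paper's regenerating-code construction: each encoded block is masked with one-time randomness, and the auditor re-checks digests of the masked blocks without ever seeing plaintext, while the owner can still unmask:

```python
import hashlib
import secrets

def mask_blocks(blocks):
    """XOR each data block with one-time randomness; the auditor is
    shown only the masked blocks (toy model of the privacy masking)."""
    masks = [secrets.token_bytes(len(b)) for b in blocks]
    masked = [bytes(x ^ y for x, y in zip(b, m))
              for b, m in zip(blocks, masks)]
    return masked, masks

def audit_tags(masked):
    """Digests the owner publishes so a public auditor can later
    re-verify the masked blocks."""
    return [hashlib.sha256(b).hexdigest() for b in masked]

def audit(masked, tags):
    """Public auditor's check: every masked block must still match
    its published digest."""
    return all(hashlib.sha256(b).hexdigest() == t
               for b, t in zip(masked, tags))

def unmask(masked, masks):
    """Owner recovers plaintext by re-applying the masks."""
    return [bytes(x ^ y for x, y in zip(c, m))
            for c, m in zip(masked, masks)]
```

The actual scheme uses homomorphic authenticators over regenerating-coded blocks rather than plain digests, but the privacy argument is the same: the verifier touches only randomized data.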


