A Cloud-User Watermarking Protocol Protecting the Right to Be Forgotten for the Outsourced Plain Images

2018 ◽  
Vol 10 (4) ◽  
pp. 118-139
Author(s):  
Xiaojuan Dong ◽  
Weiming Zhang ◽  
Xianjun Hu ◽  
Keyang Liu

This article describes how cloud storage benefits users by freeing up local storage space, while also separating data ownership from direct control over the data. As a result, it is difficult for a cloud user to verify that the cloud storage provider (CSP) has honored a deletion request and removed all corresponding data. To address this issue technically, this article proposes an interactive cloud-user watermarking (CUW) protocol based on homomorphic encryption. To meet the security requirements, an encrypted watermark is embedded into the encrypted data. Moreover, so that users can still enjoy convenient cloud services, the uploaded data is ultimately stored on the cloud server as plain text. The performance of the CUW protocol is evaluated through a prototype implementation.
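The key property the CUW protocol relies on is that an encrypted watermark can be folded into encrypted data without decrypting either. The following is a minimal sketch of that idea using textbook Paillier with toy parameters; it is an illustration of the additive-homomorphic property, not the authors' implementation, and the values `m` and `w` are hypothetical.

```python
import random
from math import gcd

# Toy Paillier keypair (tiny primes, illustration only -- not secure).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                    # standard choice g = n + 1
lam = (p - 1) * (q - 1)      # phi(n) suffices when g = n + 1
mu = pow(lam, -1, n)         # precomputed decryption factor

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n   # L(x) = (x - 1) / n
    return (L * mu) % n

# Embedding an encrypted watermark into encrypted data: multiplying
# ciphertexts adds the plaintexts, so a party can embed Enc(w) into
# Enc(m) without ever seeing m or w.
m, w = 1234, 42              # cover data and watermark (toy values)
c_marked = (encrypt(m) * encrypt(w)) % n2
assert decrypt(c_marked) == m + w
```

In a real deployment the watermark would be spread over many samples and the modulus would be thousands of bits, but the ciphertext-multiplication step is the same.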

2017 ◽  
Vol 2017 ◽  
pp. 1-14 ◽  
Author(s):  
Keyang Liu ◽  
Weiming Zhang ◽  
Xiaojuan Dong

With the growth of cloud computing technology, more and more cloud service providers (CSPs) offer cloud computing services to users and ask for permission to use their data to improve the quality of service (QoS). Since these data are stored as plain text, users worry about the risk of privacy leakage. However, existing watermark embedding and encryption technologies are not suitable for protecting the Right to Be Forgotten. Hence, we propose a new cloud-user protocol as a solution to the plain-text outsourcing problem. Users and CSPs may only transfer data with a ciphertext watermark, generated and embedded by a trusted third party (TTP), inserted into the ciphertext. The receiver then decrypts the transferred data and obtains the watermarked data in plain text. In the arbitration stage, feature extraction and the user's identity are used to identify the data. A fixed-Hamming-distance code helps maximize the number of watermarks the system can support. The extracted watermark can locate an unauthorized distributor and protect the rights of an honest CSP. Experimental results demonstrate the security and validity of our protocol.
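The arbitration stage matches an extracted, possibly noisy watermark against registered codewords by Hamming distance. The sketch below illustrates that matching step under stated assumptions: the codebook, threshold, and bit strings are hypothetical, and the paper's fixed-distance code construction itself is not reproduced here.

```python
def hamming(a, b):
    # Bitwise Hamming distance between two equal-length bit strings.
    return sum(x != y for x, y in zip(a, b))

def identify(extracted, codebook, threshold):
    """Return the user whose codeword is nearest to the extracted
    watermark, or None if no codeword is within the fixed threshold."""
    best_user, best_dist = None, threshold + 1
    for user, codeword in codebook.items():
        d = hamming(extracted, codeword)
        if d < best_dist:
            best_user, best_dist = user, d
    return best_user

# Hypothetical codebook: codewords are kept pairwise far apart so that
# a few flipped bits still map back to a unique user.
codebook = {"alice": "00000000", "bob": "11110000", "carol": "00001111"}
assert identify("11010000", codebook, threshold=2) == "bob"   # 1 bit flipped
assert identify("10101010", codebook, threshold=2) is None    # too noisy
```

Keeping a fixed minimum distance between codewords is what lets the system tolerate bit errors while still pointing to a unique distributor.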


The most data-intensive industry today is healthcare. Advances in technology have revolutionized traditional healthcare practices and led to an enhanced e-healthcare system. Modern healthcare systems generate voluminous amounts of digital health data, which are shared between patients and among groups of physicians and medical technicians for processing. Because these massive e-health data must be continuously available and easy to handle, they are mostly outsourced to cloud storage. With cloud-based computing, sensitive patient data is stored on a third-party server where data analytics are performed, raising greater security concerns. This paper proposes a secure analytics system that preserves the privacy of patients' data. Before outsourcing, the data are encrypted using Paillier homomorphic encryption, which allows computations to be performed over the encrypted dataset. A decision tree machine learning algorithm is then run over this encrypted dataset to build the classifier model. The encrypted model is outsourced to the cloud server, and predictions about a patient's health status are displayed to the user on request. The data is never decrypted at any point in the process, which ensures the privacy of patients' sensitive data.


Cloud computing enables users to use remote resources, reducing the burden on local storage. However, the use of such services gives rise to a new set of problems: users have no control over the data they have stored, so achieving data authentication together with confidentiality is of utmost importance. Since not every user has the necessary expertise, the data verification task can be delegated to a trusted verifier (TV), an authorized party that checks the intactness of outsourced data. Because the data owner stores the data on the cloud in encrypted form, it is difficult to check the integrity of the data without decrypting it. By using homomorphic encryption schemes, however, integrity checking is possible without the original copy. In this paper, we give implementation and performance details for two homomorphic encryption schemes, Rivest-Shamir-Adleman (RSA) and Paillier. RSA is multiplicatively homomorphic, whereas Paillier is additively homomorphic; both are only partially homomorphic and thus limited in the operations they support. Due to the homomorphic property of these algorithms, the original contents are not revealed during the verification process. This framework achieves data authentication while maintaining confidentiality.
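The multiplicative homomorphism of RSA mentioned above can be shown in a few lines. This is a textbook-RSA sketch with the classic toy parameters (no padding, not secure); it demonstrates only the algebraic property that a verifier can exploit, not the paper's verification framework.

```python
# Toy textbook RSA (tiny primes, no padding -- illustration only).
p, q = 61, 53
n = p * q                  # 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17
d = pow(e, -1, phi)        # private exponent

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

# Multiplicative homomorphism: Enc(a) * Enc(b) mod n = Enc(a * b),
# so products over outsourced data can be checked without revealing
# the plaintexts.
a, b = 7, 11
assert decrypt((encrypt(a) * encrypt(b)) % n) == a * b
```

Paillier gives the corresponding additive property (ciphertext multiplication adds plaintexts), which is why the two schemes support different, complementary verification operations.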


2019 ◽  
Vol 3 (3) ◽  
pp. 217
Author(s):  
Irfan Helmi ◽  
Nur Widiyasono ◽  
Rohmat Gunawan

The ease and support of cloud-based data storage have driven an increase in the number of cloud services. This growth has, in turn, increased the number of digital crimes involving cloud services. Cloud storage features designed to support the smooth running of business processes can be misused by criminals to store crime data. Accurate digital evidence is one way to prove a digital crime and can then serve as supporting evidence at trial. This study discusses the analysis of digital evidence from a cloud service. The analysis, following the NIST 800-86 method, is carried out on digital evidence from five previously prepared scenarios in which cloud service features are misused. Digital evidence is acquired using direct acquisition and physical imaging. The experiments showed that after scenarios 1 and 3, the names and directory paths of the files downloaded by client 1 and client 2 were obtained, along with the IP address, MAC address, user name, password, and timestamp. After scenario 2, digital evidence containing the name and location of the folder on the cloud server was obtained. After scenario 4, the names of the shared files and folders were obtained, together with information on which clients had the right to access them. After scenario 5, the file name and the directory of the file path were obtained.


2018 ◽  
pp. 208-218
Author(s):  
Mohd Rizuan Baharon ◽  
Mohd Faizal Abdollah ◽  
Nur Azman Abu ◽  
Zaheera Zainal Abidin ◽  
Ariff Idris

Video transcoding is one of the services now available online from clouds, enabling a user to convert a video from one format into another in a very convenient way. To transcode a video, all of its contents must be uploaded to cloud storage. However, outsourcing video contents that may contain sensitive information does not guarantee their security and privacy, as the clouds are able to access them. Thus, in this paper, an enhanced homomorphic encryption scheme is proposed that allows a massive number of frames to be transcoded by the cloud server in a secure manner. The scheme encrypts integers rather than individual bits so as to improve efficiency. With the aid of a proposed process that lets multiple parties communicate securely, the efficiency of the scheme is thoroughly evaluated and compared with related work. The results show that our scheme offers much better efficiency, making it more suitable for video transcoding in a cloud environment.


2019 ◽  
Vol 15 (10) ◽  
pp. 155014771987899 ◽  
Author(s):  
Changsong Yang ◽  
Xiaoling Tao ◽  
Feng Zhao

With the rapid development of cloud storage, more and more resource-constrained data owners employ cloud storage services to reduce their heavy local storage overhead. However, data owners then lose direct control over their data, and all operations over the outsourced data, such as data transfer and deletion, are executed by the remote cloud server. Data transfer and deletion have therefore become security issues, because a selfish remote cloud server might not execute these operations honestly, for economic benefit. In this article, we design a scheme that makes data transfer and the deletion of transferred data transparent and publicly verifiable. Our scheme is based on vector commitment (VC), which handles the problem of public verification during data transfer and deletion. More specifically, the scheme provides the data owner with the ability to verify the results of data transfer and deletion. In addition, by exploiting the properties of VC, the scheme does not require any trusted third party. Finally, we prove that the proposed scheme not only reaches the expected security goals but is also efficient and practical.
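For intuition about vector commitments: a Merkle hash tree is a simple position-binding commitment to a vector, letting anyone verify the value at one position against a short root. The sketch below is a hedged stand-in for that idea, not the VC scheme used in the article (which has additional properties such as updatability).

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def commit(vector):
    """Commit to a list of byte strings; returns the root and all levels."""
    level = [h(b"leaf:" + v) for v in vector]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:
            level = level + [level[-1]]      # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels[-1][0], levels

def open_at(levels, i):
    """Authentication path proving the value at position i."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append((i & 1, level[i ^ 1]))   # (am-I-the-right-child, sibling)
        i //= 2
    return path

def verify(root, value, path):
    node = h(b"leaf:" + value)
    for is_right, sibling in path:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

data = [b"block-%d" % k for k in range(5)]
root, levels = commit(data)
proof = open_at(levels, 3)
assert verify(root, b"block-3", proof)       # position 3 opens correctly
assert not verify(root, b"tampered", proof)  # binding: wrong value rejected
```

Position binding is what lets an owner check that the cloud transferred or deleted exactly the committed blocks, with no trusted third party holding the data.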


Author(s):  
Xi Vincent Wang ◽  
Lihui Wang

In recent years, cloud manufacturing has become a new research trend in manufacturing systems, leading to the next generation of production paradigms. However, the interoperability issue still requires more research due to the heterogeneous environment created by multiple cloud services and applications developed on different platforms and in different languages. This research therefore aims to address the interoperability issue in cloud manufacturing systems. In practice, industrial users, especially small and medium-sized enterprises (SMEs), are normally short of budget for hardware and software investment due to financial stresses, while simultaneously facing multiple customer-driven challenges, including security requirements and safety regulations. The proposed cloud manufacturing system is therefore specifically tailored to SMEs.


2020 ◽  
Vol 26 (8) ◽  
pp. 83-99
Author(s):  
Sarah Haider Abdulredah ◽  
Dheyaa Jasim Kadhim

A Tonido cloud server provides a private cloud storage solution and synchronizes customers and employees with the required cloud services across the enterprise. Generally, users access cloud services over an Internet connection, and they may encounter problems due to a weak connection or heavy load, especially with live video streaming applications over the cloud. In this work, flexible and inexpensive access methods are proposed and implemented for real-time applications, enabling users to access cloud services locally and regionally. To simulate the network connection, a Raspberry Pi 3 Model B+ is used as a wireless LAN (WLAN) router, letting users reach the cloud services through different access approaches, both wireless and wireline. As a case study of a real-time application over the cloud server, live video streaming is performed using an IP webcam and the IVIDEON cloud, so that the streamed video can be accessed via the cloud server at any time by different users over the proposed practical connections. Practical experiments showed that access to real-time cloud services via wireline and wireless connections is improved by using the Tonido cloud server's facilities.


Author(s):  
Linlin Zhang ◽  
Zehui Zhang ◽  
Cong Guan

Federated learning (FL) is a distributed learning approach that allows distributed computing nodes to collaboratively develop a global model while keeping their data local. However, the issues of privacy preservation and performance hinder the application of FL in industrial cyber-physical systems (ICPSs). In this work, we propose a privacy-preserving momentum FL approach, named PMFL, which uses a momentum term to accelerate model convergence during training. Furthermore, the fully homomorphic encryption scheme CKKS is adopted to encrypt the gradient parameters of the industrial agents' models, preserving their local privacy. In particular, the cloud server computes the global encrypted momentum term from the encrypted gradients using the momentum gradient descent (MGD) optimization algorithm. The performance of PMFL is evaluated on two common deep learning datasets, MNIST and Fashion-MNIST. Theoretical analysis and experimental results confirm that the proposed approach improves the convergence rate while preserving the privacy of the industrial agents.
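The momentum aggregation the server performs can be sketched in plaintext (in PMFL the same arithmetic runs over CKKS-encrypted gradients; encryption is omitted here for clarity). The hyperparameters `gamma` and `eta`, the toy objective, and the two-agent setup are illustrative assumptions, not values from the paper.

```python
def mgd_step(w, v, grads, gamma=0.9, eta=0.1):
    """One momentum-gradient-descent step over the averaged agent gradients.
    v is the momentum term, gamma the momentum coefficient, eta the step size."""
    g = [sum(col) / len(grads) for col in zip(*grads)]   # aggregate gradients
    v = [gamma * vi + eta * gi for vi, gi in zip(v, g)]  # momentum accumulation
    w = [wi - vi for wi, vi in zip(w, v)]                # global model update
    return w, v

# Toy example: two agents minimizing f(w) = ||w||^2 (gradient 2w).
w, v = [1.0, -2.0], [0.0, 0.0]
for _ in range(300):
    grads = [[2 * x for x in w], [2 * x for x in w]]     # each agent's gradient
    w, v = mgd_step(w, v, grads)
assert sum(x * x for x in w) ** 0.5 < 1e-3               # converged near 0
```

Because this update uses only additions and scalar multiplications of the gradients, it maps naturally onto an additively capable homomorphic scheme such as CKKS.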


Author(s):  
Tawfiq Barhoom ◽  
Mahmoud Abu Shawish

Despite the growing reliance on cloud services and software, privacy remains difficult to achieve. We store our data on remote servers in untrusted cloud environments, and if the stored data is not handled well, its privacy can be violated without our awareness. Although it requires expensive computation, encrypting the data before sending it appears to be a solution to this problem. So far, however, all known solutions that protect textual files with encryption algorithms alone have fallen short of privacy expectations, because encryption cannot stand by itself: the encrypted data sits on the cloud server as a single complete file, so an intruder who manages to decrypt it obtains all of the data at once. This study aimed to develop an effective cloud confidentiality model that combines fragmentation and encryption of text files to compensate for this reported deficiency. The fragmentation method divides text files into two triangles through the axis, while the encryption method uses the Blowfish algorithm. The research concluded that high confidentiality is achieved by building a multi-layer model of encryption, chunking, and fragmentation of every chunk, preventing intruders from reaching the data even if they are able to decrypt a file. Using a privacy accuracy equation developed for this research, the model achieved accuracy levels of 96% and 90% when using 100 and 200 words per chunk on small, medium, and large files, respectively.
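The chunk-then-fragment pipeline can be sketched as follows. This is a hedged reconstruction: the diagonal split of a character grid is only one plausible reading of the paper's "two triangles through the axis", and the Blowfish encryption step applied to each fragment afterwards is omitted here.

```python
import math

def fragment(chunk):
    """Split a chunk into two 'triangles' about the main diagonal of a
    character grid -- one plausible reading of the paper's axis split."""
    n = math.ceil(math.sqrt(len(chunk)))
    grid = [list(chunk[i * n:(i + 1) * n]) for i in range(n)]
    upper = "".join(grid[r][c] for r in range(len(grid))
                    for c in range(len(grid[r])) if c >= r)
    lower = "".join(grid[r][c] for r in range(len(grid))
                    for c in range(len(grid[r])) if c < r)
    return upper, lower

def split_chunks(text, words_per_chunk):
    """Divide a text file into fixed-size word chunks (e.g. 100 or 200 words)."""
    words = text.split()
    return [" ".join(words[i:i + words_per_chunk])
            for i in range(0, len(words), words_per_chunk)]

text = "the quick brown fox jumps over the lazy dog " * 30
chunks = split_chunks(text, 100)
fragments = [fragment(c) for c in chunks]   # each chunk -> two fragments
# In the paper, each fragment would then be Blowfish-encrypted and the
# pieces stored separately, so no single ciphertext yields the whole file.
joined = "".join(u + l for u, l in fragments)
assert len(joined) == sum(len(c) for c in chunks)   # split is lossless
```

Storing the two triangles of each chunk apart means an intruder who decrypts one piece recovers only an interleaved portion of the text, which is the layering effect the model relies on.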

