DATA SECURITY SYSTEM FOR THE WINMAP GIS

Author(s):  
Dmitriy Zastavnoy

Data security is a key feature of information systems, but geoinformation systems and spatial databases, as well as their applications, show some shortcomings in this respect. Most proposals for geodata confidentiality appear stuck in attempts to tie access rules to the geometric properties of spatial data. In this paper we suggest a different approach to building a data access model, and present a complete data security system for the WinMAP system that includes account control, data access control based on an extended DAC model, and audit features. The data model of WinMAP is also described, because its specialized features allow the data security system to be developed rationally and implemented effectively. An implementation of the extended DAC model is briefly sketched.
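The discretionary access control (DAC) core of such a system can be pictured as an access matrix mapping (subject, object) pairs to sets of rights. The sketch below is illustrative only: the user, layer, and right names are assumptions, not details of WinMAP or its extended DAC model.

```python
# Minimal sketch of a DAC-style access check for map layers.
# All names (users, layers, rights) are illustrative, not from WinMAP.

READ, WRITE = "read", "write"

class AccessMatrix:
    def __init__(self):
        self._rules = {}  # (user, layer) -> set of granted rights

    def grant(self, user, layer, right):
        self._rules.setdefault((user, layer), set()).add(right)

    def revoke(self, user, layer, right):
        self._rules.get((user, layer), set()).discard(right)

    def allowed(self, user, layer, right):
        return right in self._rules.get((user, layer), set())

acm = AccessMatrix()
acm.grant("alice", "roads", READ)
acm.grant("alice", "roads", WRITE)
acm.grant("bob", "roads", READ)

assert acm.allowed("alice", "roads", WRITE)
assert not acm.allowed("bob", "roads", WRITE)
```

An "extended" DAC model would layer additional features on this core, e.g. delegation of grants or audit hooks on each `allowed` call.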

2016 ◽  
Vol 23 (3) ◽  
pp. 178-182
Author(s):  
Andrzej Zygmuniak ◽  
Violetta Sokoła-Szewioła

Abstract This study aims to expose the differences between two data models with respect to the code list values they provide. The first is obligatory for managing Geodesic Register of Utility Networks databases in Poland [9], and the second is the model originating from the Technical Guidelines issued for the INSPIRE Directive. Since the latter is the basis for managing spatial databases among European parties, correlating these two data models eases harmonization and, in consequence, the exchange of spatial data. The study therefore presents possibilities for increasing the compatibility between the code list values of object attributes provided in both models. In practice, this could increase the competitiveness of entities managing or processing such databases and lead to greater involvement in scientific or research projects in the mining industry. Moreover, since utility networks located in mining areas are under particular protection, the ability to fit the data models to their own needs will allow mining plants to exchange spatial data more efficiently.


2020 ◽  
Vol 1 ◽  
pp. 1-23
Author(s):  
Majid Hojati ◽  
Colin Robertson

Abstract. With new forms of digital spatial data driving new applications for monitoring and understanding environmental change, there are growing demands on traditional GIS tools for spatial data storage, management and processing. Discrete Global Grid Systems (DGGS) are methods to tessellate the globe into multiresolution grids, which represent a global spatial fabric capable of storing heterogeneous spatial data, with improved performance in data access, retrieval, and analysis. While DGGS-based GIS may hold potential for next-generation big-data GIS platforms, few studies have tried to implement them as a framework for operational spatial analysis. Cellular Automata (CA) is a classic dynamic modeling framework that has been used with the traditional raster data model for environmental modeling tasks such as wildfire modeling and urban expansion modeling. The main objectives of this paper are to (i) investigate the possibility of using a DGGS for running dynamic spatial analysis, (ii) evaluate CA as a generic data model for modeling dynamic phenomena within a DGGS data model and (iii) evaluate an in-database approach for CA modelling. To do so, a case study in wildfire spread modelling is developed. Results demonstrate that using a DGGS data model not only provides the ability to integrate different data sources, but also provides a framework for spatial analysis without geometry-based computation. This results in a simplified architecture and a common spatial fabric to support development of a wide array of spatial algorithms. While considerable work remains to be done, CA modelling within a DGGS-based GIS is a robust and flexible modelling framework for big-data GIS analysis in an environmental monitoring context.
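The CA update rule for wildfire spread is independent of the cell geometry; only the neighbour function changes between a raster grid and a DGGS. The toy sketch below uses a square grid with a Moore neighbourhood for simplicity (the paper's actual model and parameters are not reproduced here); a DGGS version would swap `neighbours` for a hexagonal multiresolution lookup.

```python
# Toy cellular-automaton wildfire spread on a square grid.
# Illustrative only: a DGGS implementation would change only the
# neighbour function (multiresolution hexagonal cells, not a 2-D array).
UNBURNED, BURNING, BURNT = 0, 1, 2

def neighbours(r, c, rows, cols):
    # Moore neighbourhood, clipped at the grid edges
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0) and 0 <= r + dr < rows and 0 <= c + dc < cols:
                yield r + dr, c + dc

def step(grid):
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = BURNT          # a burning cell burns out
            elif grid[r][c] == UNBURNED and any(
                    grid[nr][nc] == BURNING
                    for nr, nc in neighbours(r, c, rows, cols)):
                new[r][c] = BURNING        # fire spreads from any neighbour
    return new

grid = [[UNBURNED] * 5 for _ in range(5)]
grid[2][2] = BURNING                       # ignition point
for _ in range(2):
    grid = step(grid)
```

After two steps the ignition cell is burnt and the fire front has reached the grid corners, illustrating the deterministic spread rule; a realistic model would make the spread probabilistic and fuel-dependent.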


Author(s):  
Menglong Yan ◽  
Yong Gao ◽  
Lun Wu ◽  
Pengfei Wu ◽  
Yong Zhao ◽  
...  

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Xiaodan Chen ◽  
Desheng Zeng ◽  
Shuanglong Pang ◽  
Fu Jun

In order to improve data security, ensure user privacy, and address the low access-control accuracy, long processing time, and high energy consumption of traditional methods, a data access control method for cloud computing storage based on dynamic re-encryption is proposed. Principal component analysis is used to reduce the dimensionality of the cloud storage data, and a random forest algorithm then classifies the data according to the reduced features. On the basis of this preprocessing, an access control tree is established to obtain the correlations between data nodes. Finally, dynamic re-encryption is used to transform the security state of the data, and access control for cloud storage is realized through key generation, encryption, re-encryption key generation, and decryption. The experimental results show that the proposed method achieves high access-control accuracy with low time and energy consumption, making it well suited to cloud computing systems holding huge volumes of data.
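The preprocessing stage described above (PCA, then a random forest on the reduced features) can be sketched for its first step. The pure-Python fragment below projects records onto their first principal component via power iteration; the data values are illustrative, and the subsequent random-forest classification is only indicated in a comment, not implemented.

```python
# Sketch of the PCA preprocessing step: project stored records onto
# their first principal component (power iteration, pure Python).
# Record values are illustrative; a real system would feed the reduced
# features to a random-forest classifier afterwards.

def first_pc(data, iters=100):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - means[j] for j in range(d)] for row in data]
    # sample covariance matrix
    cov = [[sum(centred[i][a] * centred[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d                      # power iteration for dominant eigenvector
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return means, v

def reduce_dim(data):
    means, v = first_pc(data)
    return [sum((row[j] - means[j]) * v[j] for j in range(len(v)))
            for row in data]

records = [[2.0, 1.9], [1.0, 1.1], [3.1, 3.0], [0.2, 0.1]]
scores = reduce_dim(records)           # one reduced feature per record
# ...the scores would then be classified with a random forest.
```

Each record collapses to a single score along the dominant direction of variance, which is the dimensionality reduction the abstract relies on before classification.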


Electronics ◽  
2021 ◽  
Vol 10 (24) ◽  
pp. 3135
Author(s):  
Mohammed Alshehri ◽  
Brajendra Panda ◽  
Sultan Almakdi ◽  
Abdulwahab Alazeb ◽  
Hanan Halawani ◽  
...  

The world has experienced a huge advancement in computing technology. People prefer outsourcing their confidential data for storage and processing in cloud computing because of the auspicious services provided by cloud service providers. As promising as this paradigm is, it creates issues ranging from data security to time latency in data computation and delivery to end-users. In response to these challenges, the fog computing paradigm was proposed as an extension of cloud computing to overcome time latency and communication overhead and to bring computing and storage resources close to the ground and the end-users. However, fog computing inherits the same security and privacy challenges encountered by traditional cloud computing. This paper proposes a fine-grained data access control approach that integrates the ciphertext-policy attribute-based encryption (CP-ABE) algorithm and blockchain technology to protect end-users’ data against rogue fog nodes and to allow a compromised fog node to be ousted. In this approach, we propose federations of fog nodes that share the same attributes, such as services and locations. The fog federation concept minimizes the time latency and communication overhead between fog nodes and cloud servers. Furthermore, integrating the blockchain idea with the CP-ABE algorithm allows fog nodes within the same federation to conduct a distributed authorization process. In addition, to address time latency and communication overhead, we equip each fog node with an off-chain database that stores the most frequently accessed data files for a particular time, as well as an on-chain access control policies table (on-chain file tracking table) that must be protected from tampering by rogue fog nodes. The blockchain plays a critical role here because it is tamper-proof by nature. We assess the approach’s efficiency and feasibility by conducting a simulation and analyzing its security and performance.
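The authorization decision at the heart of CP-ABE is whether a requester's attribute set satisfies the access policy attached to a ciphertext. The sketch below evaluates such a policy tree in plain Python; it is a simplified stand-in only, since real CP-ABE enforces the policy cryptographically during decryption, and the attribute names are illustrative.

```python
# Simplified stand-in for the CP-ABE authorization step: a policy is a
# boolean tree over attributes, and access is granted only when the
# requester's attribute set satisfies it. Real CP-ABE enforces this
# cryptographically at decryption time; this sketch only evaluates the
# policy. Attribute names are illustrative.

def satisfies(policy, attrs):
    op, arg = policy
    if op == "ATTR":
        return arg in attrs            # leaf: a single required attribute
    results = [satisfies(child, attrs) for child in arg]
    return all(results) if op == "AND" else any(results)

# Policy: ("doctor" AND "cardiology") OR "admin"
policy = ("OR", [("AND", [("ATTR", "doctor"), ("ATTR", "cardiology")]),
                 ("ATTR", "admin")])

assert satisfies(policy, {"doctor", "cardiology"})
assert satisfies(policy, {"admin"})
assert not satisfies(policy, {"doctor"})
```

In the proposed scheme, fog nodes in the same federation share such attribute sets, so any federation member can run this authorization step for its peers.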


2002 ◽  
pp. 144-171 ◽  
Author(s):  
Karla A.V. Borges ◽  
Clodoveu A. Davis Jr. ◽  
Alberto H.F. Laender

This chapter addresses the relationship between the nature of spatial information, spatial relationships, and spatial integrity constraints, and proposes the use of OMT-G (Borges et al., 1999; Borges et al., 2001), an object-oriented data model for geographic applications, at an early stage in the specification of integrity constraints in spatial databases. OMT-G provides appropriate primitives for representing spatial data, supports spatial relationships, and allows the specification of spatial integrity rules (topological, semantic and user integrity rules) through its spatial primitives and spatial relationship constructs. Being an object-oriented data model, it also allows some spatial constraints to be encapsulated as methods associated with specific georeferenced classes. Once constraints are explicitly documented in the conceptual modeling phase, and methods to enforce the spatial integrity constraints are defined, the spatial database management system and the application must implement such constraints. This chapter does not cover integrity constraints associated with the representation of simple objects, such as constraints implicit in the geometric description of a polygon. Geometric constraints are related to the implementation and are covered here from a higher-level view, considering only the shape of geographic objects. Consistency rules associated with the representation of spatial objects are discussed in Laurini and Thompson (1992).
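The idea of encapsulating a spatial integrity constraint as a method on a georeferenced class can be sketched as follows. The classes and the topological rule ("every well must lie inside its parcel") are invented for illustration; they are in the spirit of OMT-G but are not taken from the chapter.

```python
# Sketch of a topological integrity rule encapsulated as a class method,
# in the spirit of OMT-G. Class names and the rule itself are
# illustrative: "every Well must lie inside its Parcel".

def point_in_polygon(x, y, ring):
    # Ray-casting test against a polygon given as a list of vertices.
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

class Parcel:
    def __init__(self, boundary):
        self.boundary = boundary       # list of (x, y) vertices

class Well:
    def __init__(self, x, y, parcel):
        self.x, self.y, self.parcel = x, y, parcel

    def check_integrity(self):         # the constraint lives on the class
        return point_in_polygon(self.x, self.y, self.parcel.boundary)

parcel = Parcel([(0, 0), (4, 0), (4, 4), (0, 4)])
assert Well(2, 2, parcel).check_integrity()
assert not Well(5, 5, parcel).check_integrity()
```

Documenting the rule in the conceptual model and attaching its check to the class keeps specification and enforcement in one place, which is the design point the chapter argues for.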


2019 ◽  
Vol 6 (6) ◽  
pp. 697
Author(s):  
Nuniek Fahriani ◽  
Harunur Rosyid

Abstract

Cryptography is a data security process for keeping messages (files) from being tampered with by third parties. Cryptography has three basic elements: encryption, decryption, and a key. Users need to guard against ‘third parties’ who can change, retrieve or delete data physically, or run program functions that interfere with the system. The authenticity of the data is an important part of a data security system. Files at risk of being illegally tampered with are not limited to the .doc extension; video files are also a target. For a video-file data security system to perform its function, controlling legal access to the data is essential, so that data access rights are not misused. The technique used to support encryption and decryption of video files is the Blowfish algorithm, which has a variable-length key. The tests covered six sample file extensions passed through encryption and decryption: .asf, .wmv, .avi, .3pp, .flv and .vob. The system is desktop based.
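Blowfish is a 16-round Feistel cipher on 64-bit blocks. The toy sketch below illustrates that Feistel structure, including why decryption is just the rounds run in reverse; it is a structural illustration only, not Blowfish itself (which uses key-dependent S-boxes), and real encryption should always come from a vetted crypto library.

```python
# Toy Feistel network illustrating the structure Blowfish is built on
# (a 16-round Feistel cipher over 64-bit blocks). Structural sketch
# only: NOT Blowfish and NOT secure; use a vetted crypto library for
# real data.
import hashlib

def round_fn(half, key, i):
    # Stand-in round function; Blowfish uses key-dependent S-boxes here.
    data = key + i.to_bytes(1, "big") + half.to_bytes(4, "big")
    return int.from_bytes(hashlib.sha256(data).digest()[:4], "big")

def encrypt_block(block, key, rounds=16):
    L = int.from_bytes(block[:4], "big")
    R = int.from_bytes(block[4:], "big")
    for i in range(rounds):
        L, R = R, L ^ round_fn(R, key, i)
    return L.to_bytes(4, "big") + R.to_bytes(4, "big")

def decrypt_block(block, key, rounds=16):
    # Same rounds in reverse order undo the encryption exactly.
    L = int.from_bytes(block[:4], "big")
    R = int.from_bytes(block[4:], "big")
    for i in reversed(range(rounds)):
        L, R = R ^ round_fn(L, key, i), L
    return L.to_bytes(4, "big") + R.to_bytes(4, "big")

key = b"demo-key"
plain = b"video!!!"                    # one 64-bit block of a media file
cipher = encrypt_block(plain, key)
assert decrypt_block(cipher, key) == plain
assert cipher != plain
```

A video file would be split into such blocks (with padding and a chaining mode such as CBC) regardless of its extension, which is why the approach works uniformly for .asf, .wmv, .avi and the other tested formats.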


2021 ◽  
Vol 11 (19) ◽  
pp. 8984
Author(s):  
Yunhee Kang ◽  
Jaehyuk Cho ◽  
Young B. Park

The conventional cloud Common Data Model (CDM) uses a centralized method of user identification and credentials. This needs to be solved in a decentralized way, because closed identity management and identity leakage limit interoperability. In this paper, we propose a DID (Decentralized Identifier)-based cloud CDM that allows researchers to store medical research information securely by authenticating their identity and to access the CDM reliably. The proposed service model provides the researcher’s credential in the process of creating and accessing CDM data in the designed secure cloud. The model is designed on a DID-based, user-centric identification system to support the research of enrolled researchers in a cloud CDM environment involving multiple hospitals and laboratories. The prototype of the designed model extends the encrypted CDM delivery method using DID and provides an identification system that limits the use cases of CDM data to researchers registered in the cloud CDM. Prototypes built for an agent-based proof of concept (PoC) are leveraged to enhance security for researchers’ use of ophthalmic CDM data. For this, the CDM ID schema and ID definition are described by issuing IDs for CDM providers and CDM agents and limiting the IDs of researchers who are CDM users. The proposed method provides a framework for integrated and efficient data access control policy management. It provides strong security and ensures both the integrity and availability of CDM data.
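The issue-and-verify flow of a DID-style credential can be sketched in miniature. Real DID credentials use public-key signatures resolved through the subject's DID document; the HMAC below is a shared-secret stand-in for that signature step, and all identifiers and claims are illustrative.

```python
# Minimal sketch of credential issuance/verification in a DID-style
# flow. Real DID credentials use public-key signatures resolved via the
# DID document; HMAC with a shared secret stands in for the signature
# here, and all identifiers are illustrative.
import hashlib
import hmac
import json

ISSUER_KEY = b"issuer-secret"          # stands in for the issuer's signing key

def issue_credential(subject_did, claims):
    payload = json.dumps({"sub": subject_did, "claims": claims},
                         sort_keys=True)
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_credential(cred):
    expected = hmac.new(ISSUER_KEY, cred["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["sig"])

cred = issue_credential("did:example:researcher-1",
                        {"role": "ophthalmology-researcher"})
assert verify_credential(cred)

# Any tampering with the claims invalidates the credential.
tampered = dict(cred,
                payload=cred["payload"].replace("ophthalmology", "cardiology"))
assert not verify_credential(tampered)
```

In the proposed cloud CDM, such a verified credential is what gates a researcher's creation of and access to CDM data across the participating hospitals and laboratories.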

