A Fine‐Grained Distribution Approach for ETL Processes in Big Data Environments

2017 ◽  
Vol 111 ◽  
pp. 114-136 ◽  
Author(s):  
Mahfoud Bala ◽  
Omar Boussaid ◽  
Zaia Alimazighi
Keyword(s):  
Big Data


Sensors ◽
2019 ◽  
Vol 19 (3) ◽  
pp. 461 ◽  
Author(s):  
Luliang Tang ◽  
Jie Gao ◽  
Chang Ren ◽  
Xia Zhang ◽  
Xue Yang ◽  
...  

The design of urban clusters plays an important role in urban planning, but realizing these plans through construction is a long process. Hence, evaluating that progress is important for urban managers during urban construction. Traditional methods for detecting urban clusters are inaccurate because the raw data are generally collected from small-sample questionnaires on resident trips rather than from large-scale studies. Spatiotemporal big data provides a new lens for understanding urban clusters in a natural and fine-grained way. In this article, we propose a novel method for Detecting and Evaluating Urban Clusters (DEUC) with taxi trajectories and Sina Weibo check-in data. Firstly, DEUC applies an agglomerative hierarchical clustering method to detect urban clusters based on similarities in the daily travel space of urban residents. Secondly, DEUC infers resident demands for land-use functions using the naïve Bayes theorem, and three indicators are adopted to assess the rationality of land-use functions in the detected clusters, namely the cross-regional travel index, the commuting direction index, and the fulfilled demand index. Thirdly, DEUC evaluates the progress of urban cluster construction by calculating a proposed conformance indicator. In the case study, we applied the method to detect and analyze urban clusters in Wuhan, China, for the years 2009, 2014, and 2015. The results demonstrate the effectiveness of the proposed method, which can provide a scientific basis for urban construction.
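To make the clustering step concrete, here is a minimal sketch, assuming a simplified setting in which taxi trips have already been aggregated into an origin-destination flow matrix between candidate regions. The function name, region counts, and cluster number are illustrative assumptions, not the authors' implementation, which defines similarity over residents' daily travel space rather than raw trip counts.

```python
# Illustrative sketch (not the paper's code): grouping regions into urban
# clusters from an origin-destination (OD) flow matrix derived from taxi trips.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def detect_clusters(od_counts, n_clusters=5):
    """od_counts[i, j] = number of taxi trips from region i to region j."""
    # Symmetrize and normalize flows into a similarity in [0, 1].
    flows = od_counts + od_counts.T
    sim = flows / flows.max()
    np.fill_diagonal(sim, 1.0)
    # Agglomerative (average-linkage) clustering on the distance 1 - similarity.
    dist = squareform(1.0 - sim, checks=False)
    tree = linkage(dist, method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")

# Example: 4 regions with strong mutual flows between (0, 1) and (2, 3).
od = np.array([[0, 90, 3, 1],
               [80, 0, 2, 4],
               [2, 1, 0, 70],
               [3, 5, 60, 0]], dtype=float)
print(detect_clusters(od, n_clusters=2))  # e.g. [1 1 2 2]
```

Average linkage on the (1 - similarity) distance mirrors the agglomerative strategy named in the abstract; the demand inference and the three indicators would operate on the clusters produced by this step.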


2018 ◽  
Vol 173 ◽  
pp. 03047
Author(s):  
Zhao Li ◽  
Shuiyuan Huan

Big data processing introduces many security threats, such as risks to data confidentiality and privacy. To address the coarse granularity and low sharing capability of existing research on big data access control, a new model supporting fine-grained access control and flexible attribute change is proposed. Based on the CP-ABE method, a multi-level attribute-based encryption scheme is designed to solve the fine-grained access control problem. To handle attribute revocation, re-encryption and version-number tagging are integrated into the scheme. The analysis shows that the proposed scheme meets the security requirements of access control in big data processing environments and has an advantage in computational overhead compared with previous schemes.
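The revocation idea can be illustrated with a small sketch. This is not the authors' construction: symmetric Fernet keys stand in for CP-ABE attribute components, and the authority decrypts and re-encrypts directly, whereas the paper integrates re-encryption and version-number tags into the ABE scheme itself. All class, method, and attribute names below are hypothetical.

```python
# Conceptual sketch only: version-tagged re-encryption for attribute revocation,
# with symmetric keys standing in for CP-ABE attribute components.
from cryptography.fernet import Fernet

class AttributeKeyAuthority:
    def __init__(self):
        self.versions = {}   # attribute -> current version number
        self.keys = {}       # (attribute, version) -> symmetric key

    def current_key(self, attr):
        v = self.versions.setdefault(attr, 1)
        return self.keys.setdefault((attr, v), Fernet.generate_key()), v

    def revoke(self, attr, ciphertext):
        """Bump the attribute's version, issue a fresh key, and re-encrypt
        the stored ciphertext so holders of the old key lose access."""
        old_key, old_v = self.current_key(attr)
        plaintext = Fernet(old_key).decrypt(ciphertext)
        self.versions[attr] = old_v + 1
        new_key, _ = self.current_key(attr)
        return Fernet(new_key).encrypt(plaintext)

authority = AttributeKeyAuthority()
key_v1, _ = authority.current_key("department:finance")       # hypothetical attribute
ct = Fernet(key_v1).encrypt(b"sensitive record")
ct = authority.revoke("department:finance", ct)                # version 1 -> 2
key_v2, _ = authority.current_key("department:finance")
assert Fernet(key_v2).decrypt(ct) == b"sensitive record"
```

Bumping the version and reissuing the key means a user still holding the version-1 attribute key can no longer decrypt the re-encrypted ciphertext, which is the effect revocation is meant to achieve.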


IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 47084-47095
Author(s):  
Gyeongjin Ra ◽  
Donghyun Kim ◽  
Daehee Seo ◽  
Imyeong Lee

2018 ◽  
Vol 4 (3) ◽  
pp. 205630511878450 ◽  
Author(s):  
Annette N Markham ◽  
Katrin Tiidenberg ◽  
Andrew Herman

This is an introduction to the special issue “Ethics as Methods: Doing Ethics in the Era of Big Data Research.” Building on a variety of theoretical paradigms (i.e., critical theory, [new] materialism, feminist ethics, the theory of cultural techniques) and frameworks (i.e., contextual integrity, the deflationary perspective, the ethics of care), the Special Issue contributes specific cases and fine-grained conceptual distinctions to ongoing discussions about ethics in data-driven research. In the second decade of the 21st century, a grand narrative is emerging that posits knowledge derived from data analytics as true because of the objective qualities of data, their means of collection and analysis, and the sheer size of the data set. The by-product of this grand narrative is that the qualitative aspects of the behavior and experience that form the data are diminished, and the human is removed from the process of analysis. This situates data science as a process of analysis performed by the tool, which obscures human decisions in the process. The scholars involved in this Special Issue problematize the assumptions and trends in big data research and point out the crisis in accountability that emerges from using such data to make societal interventions. Our collaborators offer a range of answers to the question of how to configure ethics through a methodological framework in the context of the prevalence of big data, neural networks, and automated, algorithmic governance of much of human socia(bi)lity.


2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Jianmin Wang ◽  
Yukun Xia ◽  
Wenbin Zhao ◽  
Yuhang Zhang ◽  
Feng Wu

Big data is massive and heterogeneous. With the rapid increase in data volume and the diversification of user access, traditional database and access control methods can no longer meet the requirements of big data storage and flexible access control. To solve this problem, an entity relationship completion and authority management method is proposed. By combining a weighted graph convolutional neural network with an attention mechanism, a knowledge base completion model is given. On this basis, the authority management model is formally defined and the process of multilevel trust access control is designed. The effectiveness of the proposed method is verified by experiments, and the resulting authority management of the knowledge base is more fine-grained and more secure.
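As a rough illustration of combining weighted graph convolution with an attention mechanism, the sketch below implements a single such layer in NumPy over a weighted entity-relation adjacency matrix. The shapes, scoring function, and normalization are assumptions made for illustration and are not the paper's model.

```python
# Minimal sketch: one weighted graph-convolution layer whose edge weights are
# rescaled by a simple additive attention score before neighbor aggregation.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def weighted_gcn_attention_layer(H, A, W, a):
    """H: (n, d) entity features, A: (n, n) weighted adjacency,
    W: (d, d_out) layer weights, a: (2 * d_out,) attention parameters."""
    Z = H @ W                                   # linear transform of features
    d_out = Z.shape[1]
    src = Z @ a[:d_out]                         # (n,) source contribution
    dst = Z @ a[d_out:]                         # (n,) target contribution
    scores = src[:, None] + dst[None, :]        # (n, n) pairwise attention scores
    scores = np.where(A > 0, scores, -1e9)      # mask pairs with no relation
    att = softmax(scores, axis=1) * A           # combine attention with edge weights
    att = att / np.maximum(att.sum(axis=1, keepdims=True), 1e-9)
    return np.maximum(att @ Z, 0.0)             # weighted aggregation + ReLU

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))                               # 5 entities, 8-dim embeddings
A = rng.random((5, 5)) * (rng.random((5, 5)) > 0.5)       # sparse weighted relations
W = rng.normal(size=(8, 4))
a = rng.normal(size=(8,))                                 # 2 * d_out = 8
print(weighted_gcn_attention_layer(H, A, W, a).shape)     # -> (5, 4)
```

Stacking such layers and scoring candidate entity-relation triples from the resulting embeddings is one common way a completion model of this kind is built; the paper's authority management then operates over the completed knowledge base.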

