A secure data deletion scheme for IoT devices through key derivation encryption and data analysis

2020 ◽  
Vol 111 ◽  
pp. 741-753 ◽  
Author(s):  
Jinbo Xiong ◽  
Lei Chen ◽  
Md Zakirul Alam Bhuiyan ◽  
Chunjie Cao ◽  
Minshen Wang ◽  
...  
Author(s):  
Md. Adib Muhtasim ◽  
Syeda Ramisa Fariha ◽  
Rayhan Rashid ◽  
Nabila Islam ◽  
Mahbub Alam Majumdar

Author(s):  
Louisa R Jorm ◽  
Kim McGrail ◽  
J. Charles Victor ◽  
Kerina Jones ◽  
David Ford ◽  
...  

Overall objectives or goal

Many health data linkage ecosystems across the world have designed and implemented secure data analysis environments as one of their controls to protect patient privacy and confidentiality. These have been shaped by local legislation and data governance policies, available IT infrastructure and resources, and the skills and imagination of their architects. However, at present their various features and functionalities have not been reviewed, synthesised or contrasted. Burton et al [1] have proposed 12 criteria for Data Safe Havens in health and healthcare, which they conceptualise broadly as encompassing data governance and ethics, quality and curation of data repositories, and data security. Under this definition, secure analysis environments, which may or may not be integrated with data repositories, are a component of a Data Safe Haven, addressing the criterion "Appropriate secure access to individually identifying data". To guide those building and operating these environments, and the data custodians and stewards who need to assess their fitness for purpose, it would be of great value to discuss and agree an aggregate term (e.g. "Secure Data Lab") that describes them, and to develop a more detailed set of criteria for what constitutes "Appropriate secure access" to linked health data. The goal of this session is to describe and document the approaches that have been taken by flagship secure data analysis environments internationally, including their approaches to authentication, assigning permissions, managing the ingress and egress of files and auditing transactions, and their responses to emerging opportunities, including cloud computing and national and international data sharing. We will explore how physical, technical and procedural controls have been combined to create existing models, and the extent to which these controls can balance each other and be applied flexibly depending on perceived risk and regulatory regimes.

Session structure

Prior to the session, we will develop a draft set of criteria for "Appropriate secure access" to linked health data. The session will comprise presentations describing existing secure analysis environments against the draft criteria, followed by a facilitated discussion. The secure data analysis environments that will be presented include:
- UNSW Sydney E-Research Institutional Cloud Architecture (ERICA)
- PopData BC Secure Research Environment (SRE)
- Institute for Clinical Evaluative Sciences (ICES) Data and Analytic Virtual Environment (IDAVE)
- Secure Anonymised Information Linkage (SAIL) Gateway

Intended output or outcome

We will write up the outcomes of the session as a scientific paper that proposes an aggregate term for secure data analysis environments for linked health data and a set of criteria for what constitutes "Appropriate secure access" to linked health data.

Presenters and Facilitators
- Professor Louisa Jorm, Centre for Big Data Research in Health, UNSW Sydney, Australia
- Dr Tim Churches, South Western Sydney Clinical School, UNSW Sydney, Australia
- Professor Kim McGrail, Population Data BC, The University of British Columbia, Vancouver, Canada
- J. Charles Victor, Institute for Clinical Evaluative Sciences, Toronto, Canada
- Dr Kerina Jones, Swansea University Medical School, Wales, United Kingdom
- Professor David Ford, Swansea University Medical School, Wales, United Kingdom

1. Burton PR, Murtagh MJ, Boyd A, et al. Data Safe Havens in health research and healthcare. Bioinformatics 2015; 31(20): 3241-3248


2021 ◽  
Vol 11 (24) ◽  
pp. 11585
Author(s):  
Muhammad Muneeb ◽  
Kwang-Man Ko ◽  
Young-Hoon Park

The emerging era of IoT will be built on compute-intensive applications. These applications will increase the traffic volume on today's network infrastructure and place further load on emerging Fifth Generation (5G) systems. Research is ongoing on many fronts, such as how to automate the management and configuration of data analysis tasks across cloud and edge, and how to achieve minimum latency and bandwidth consumption by optimizing task allocation. The major challenge for researchers is to push artificial intelligence to the edge and so fully realize the potential of the fog computing paradigm. Intelligence-based fog computing frameworks for IoT applications already exist, but research on Edge Artificial Intelligence (Edge-AI) is still at an early stage. We therefore focus on data analytics and offloading in our proposed architecture. To address these problems, we propose a prototype multi-layered architecture that distributes data analysis between the cloud and fog computing layers to perform latency-sensitive analysis with low latency. The main goal of this research is to use this multi-layer fog computing platform to enhance real-time data analysis for IoT devices. Our design follows the guidelines of the OpenFog Consortium and provides surveillance and data analysis functionalities. Through case studies, we show that the proposed prototype architecture outperforms a cloud-only environment in delay time, network usage, and energy consumption.
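The abstract's core idea, allocating each analysis task to the layer that minimizes end-to-end latency, can be sketched with a toy cost model. The layer names and all bandwidth/compute figures below are illustrative assumptions, not values from the paper:

```python
# Toy task-offloading model: pick the layer (device, fog, or cloud) that
# minimizes estimated latency = transfer time + compute time.
# All parameters are hypothetical: bw_mbps is uplink bandwidth in MB/s,
# mops is compute rate in mega-operations per second.
LAYERS = {
    "device": {"bw_mbps": float("inf"), "mops": 50},    # local: no transfer
    "fog":    {"bw_mbps": 20.0,         "mops": 500},   # nearby, fast link
    "cloud":  {"bw_mbps": 5.0,          "mops": 5000},  # powerful, slow link
}

def estimated_latency(layer, data_mb, workload_mops):
    p = LAYERS[layer]
    transfer = 0.0 if p["bw_mbps"] == float("inf") else data_mb / p["bw_mbps"]
    return transfer + workload_mops / p["mops"]

def offload_target(data_mb, workload_mops):
    """Return the layer with the lowest estimated latency for this task."""
    return min(LAYERS, key=lambda l: estimated_latency(l, data_mb, workload_mops))

print(offload_target(10, 1))      # large data, light compute -> "device"
print(offload_target(1, 10))      # balanced, latency-sensitive -> "fog"
print(offload_target(0.1, 1000))  # tiny data, heavy compute -> "cloud"
```

The pattern matches the abstract's observation: latency-sensitive tasks with moderate compute tend to land on the fog layer, while only compute-heavy tasks justify the cloud round trip.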


2019 ◽  
Vol 8 (2S11) ◽  
pp. 1083-1086

In recent years everything has become connected through the Internet, most notably via the Internet of Things (IoT), which will change all aspects of our lives and our future. As things connect to the Internet, they generate huge amounts of information that must be processed. The information gathered from various IoT devices has to be recognized and organized according to the environment it comes from. To recognize and organize the data gathered from different things, the key step is to pass it through various Data Mining Techniques (DMT). In this article, we focus mainly on the analysis of data generated by Internet-connected IoT devices using the DBSCAN clustering technique, and we also review different Data Mining Techniques for data analysis.
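The article names DBSCAN but gives no listing. For orientation, here is a minimal pure-Python sketch of the classic algorithm applied to hypothetical sensor readings; the parameter names `eps` and `min_pts` are standard DBSCAN conventions, and the example data is invented:

```python
import math

def dbscan(points, eps, min_pts):
    """Classic DBSCAN: label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        # Indices of all points within eps of point i (including i itself).
        return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1            # not a core point: provisionally noise
            continue
        cluster += 1                  # start a new cluster from this core point
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_pts:  # j is also core: expand through it
                queue.extend(more)
    return labels

# Hypothetical (temperature, humidity) readings: two dense groups + one outlier.
readings = [(20.1, 45), (20.3, 44), (20.2, 46),
            (35.0, 80), (35.2, 81), (34.9, 79),
            (99.0, 5)]
print(dbscan(readings, eps=2.5, min_pts=2))  # -> [0, 0, 0, 1, 1, 1, -1]
```

DBSCAN suits IoT streams because it needs no preset cluster count and flags outliers (here the `(99.0, 5)` reading) as noise, which doubles as simple anomaly detection.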


2020 ◽  
Vol 14 (1) ◽  
pp. 57-63
Author(s):  
Andrés Armando Sánchez Martin ◽  
Luis Eduardo Barreto Santamaría ◽  
Juan José Ochoa Ortiz ◽  
Sebastián Enrique Villanueva Navarro

One of the difficulties in developing and testing the data analysis applications used with IoT devices is the monetary and time cost of building the IoT network. To mitigate these costs and speed up the development of IoT and analytical applications, we propose NIOTE, an IoT network emulator that generates sensor and actuator data from different devices and is easy to configure and deploy over the TCP/IP and MQTT protocols. The tool supports academic environments and concept validation in the design of IoT networks. The emulator facilitates the development of this type of application, reducing development time and improving the final quality of the product. Object-oriented programming concepts, architecture, and software design patterns are used to develop the emulator, which makes it possible to emulate the behaviour of the IoT devices inside a specific network: any number of devices can be added, and any network can be modelled and designed. Each network sends data that is stored locally, in a specific format, to emulate the process of transporting the data to a platform, from which it is then sent on for data analysis.
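NIOTE itself is not shown in the abstract; the following is a minimal sketch, under my own assumptions, of the object-oriented idea it describes: emulated sensor objects generating readings inside an emulated network that buffers JSON payloads locally, standing in for the MQTT/TCP transport. All class and field names are hypothetical:

```python
import json
import random
import time

class EmulatedSensor:
    """One emulated IoT sensor producing readings in a fixed range."""
    def __init__(self, device_id, kind, lo, hi):
        self.device_id, self.kind, self.lo, self.hi = device_id, kind, lo, hi

    def read(self):
        return {
            "device": self.device_id,
            "type": self.kind,
            "value": round(random.uniform(self.lo, self.hi), 2),
            "ts": time.time(),
        }

class EmulatedNetwork:
    """Holds any number of emulated devices and buffers their messages
    locally, standing in for an MQTT broker or TCP endpoint."""
    def __init__(self):
        self.devices, self.buffer = [], []

    def add(self, sensor):
        self.devices.append(sensor)

    def tick(self):
        # One emulation step: every device publishes one JSON payload.
        for d in self.devices:
            self.buffer.append(json.dumps(d.read()))

net = EmulatedNetwork()
net.add(EmulatedSensor("temp-01", "temperature", 18.0, 26.0))
net.add(EmulatedSensor("hum-01", "humidity", 30.0, 60.0))
net.tick()
print(len(net.buffer))  # 2 payloads, one per device
```

In a real deployment the `tick` loop would publish each payload to an MQTT topic instead of appending to a local list; buffering locally keeps the sketch self-contained.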


Since the mechanical revolution, most of the calculations and measurements required for the assembly and storage of a particular item have been done manually, but this pattern has changed with the introduction of IoT devices in organizations. These devices communicate and share information with one another over the web, making the manufacturing process less complicated and more efficient. For instance, a sensor attached to a machine measures the amount of vibration produced while the machine is operating. This information is then transferred over WLAN to a server, where it can be viewed by anyone working in the organization and then analysed to determine the condition of the machine. In this way, any breakdown or failure in the system or process caused by improper use of the machine can be avoided. The information can be seen by employees of the organization who have a login ID and password and a device with a working Internet connection, because the data is displayed on a website that can be accessed from any device running any operating system. The server that stores the information runs on MS Windows, and the site from which the information is accessed is built using various web development languages.


2013 ◽  
Vol 22 ◽  
pp. 152-161 ◽  
Author(s):  
Yun-Ching Liu ◽  
Yi-Ting Chiang ◽  
Tsan-Sheng Hsu ◽  
Churn-Jung Liau ◽  
Da-Wei Wang

Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 600
Author(s):  
Gianluca Cornetta ◽  
Abdellah Touhafi

Low-cost, high-performance embedded devices are proliferating, and a plethora of new platforms are available on the market. Some of them either have embedded GPUs or can be connected to external Machine Learning (ML) hardware accelerators. These enhanced hardware features enable new applications in which AI-powered smart objects can effectively and pervasively run distributed ML algorithms in real time, shifting part of the raw data analysis and processing from the cloud or edge to the device itself. In this context, Artificial Intelligence (AI) can be considered the backbone of the next generation of Internet of Things (IoT) devices, which will no longer merely be data collectors and forwarders, but truly "smart" devices with built-in data wrangling and data analysis features that leverage lightweight machine learning algorithms to make autonomous decisions in the field. This work thoroughly reviews and analyses the most popular ML algorithms, with particular emphasis on those that are more suitable to run on resource-constrained embedded devices. In addition, several machine learning algorithms have been built on top of a custom multi-dimensional array library. The designed framework has been evaluated and its performance stress-tested on Raspberry Pi III and IV embedded computers.
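The paper does not list its algorithms here, but a nearest-centroid classifier is a representative example of the kind of lightweight ML that fits resource-constrained devices: training stores one mean vector per class, and prediction costs only a few subtractions. The sketch and its fault-detection data are illustrative assumptions, not the paper's framework:

```python
import math

class NearestCentroid:
    """Tiny nearest-centroid classifier. Memory footprint is one vector
    per class, and predict() is a handful of arithmetic ops -- well within
    the budget of a microcontroller-class device."""

    def fit(self, X, y):
        sums, counts = {}, {}
        for row, label in zip(X, y):
            acc = sums.setdefault(label, [0.0] * len(row))
            for i, v in enumerate(row):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        # Centroid = per-class mean of the training vectors.
        self.centroids = {c: [v / counts[c] for v in acc]
                          for c, acc in sums.items()}
        return self

    def predict(self, x):
        # Nearest centroid by Euclidean distance.
        return min(self.centroids, key=lambda c: math.dist(x, self.centroids[c]))

# Hypothetical (temperature, vibration) readings labelled ok/fault.
X = [[20.0, 0.1], [21.0, 0.2], [60.0, 2.0], [58.0, 1.8]]
y = ["ok", "ok", "fault", "fault"]
clf = NearestCentroid().fit(X, y)
print(clf.predict([22.0, 0.3]))  # -> "ok"
print(clf.predict([59.0, 1.9]))  # -> "fault"
```

Compared with a neural network, such a model trades accuracy for a predictable, tiny memory and compute footprint, which is exactly the trade-off the review examines for on-device inference.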

