Lensless Optical Encryption of Multilevel Digital Data Containers Using Spatially Incoherent Illumination

2021 · Vol 12 (1) · pp. 406
Author(s): Pavel Cheremkhin, Nikolay Evtikhiev, Vitaly Krasnov, Ilya Ryabcev, Anna Shifrina, ...

The need to correct errors that emerge during optical encryption has led to the extensive use of data containers such as QR codes. However, due to the specifics of optical encryption, QR codes are not well suited to the task: error correction capability in optical experiments is low, mainly because the QR code's service elements and byte data structure are easily broken. In this paper, we present an optical implementation of an encryption system utilizing new multilevel customizable digital data containers with high data density. The results of optical experiments demonstrate the efficient error correction capability of the new data container.

2020
Author(s): Xiaoyuan Wang, Pengfei Zhou, Jason Eshraghian, Chih-Yang Lin, Herbert Ho-Ching Iu, ...

This paper presents the first experimental demonstration of a ternary memristor-CMOS logic family. We systematically design, simulate, and experimentally verify the primitive logic functions: the ternary AND, OR, and NOT gates. These are then used to build combinational ternary NAND, NOR, XOR, and XNOR gates, as well as data-handling ternary MAX and MIN gates. Our simulations are performed using a 50-nm process and are verified with in-house fabricated indium-tin-oxide memristors, optimized for fast switching, high transconductance, and low current leakage. We obtain close to an order of magnitude improvement in data density over conventional CMOS logic, and a reduction of switching speed by a factor of 13 over prior state-of-the-art ternary memristor results. We anticipate that extensions of this work can realize practical implementations where high data density is of critical importance.
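The primitive gates named above have a standard behavioural model in multi-valued logic: over the three levels {0, 1, 2}, AND is a minimum, OR is a maximum, and NOT is the complement 2 − x. The sketch below is an assumed software model for illustration only, not the paper's memristor-CMOS circuit:

```python
# Behavioural model of ternary logic primitives over levels {0, 1, 2}.
# Gate names follow the paper; the min/max/complement semantics are the
# conventional multi-valued-logic definitions, assumed here for illustration.

LEVELS = (0, 1, 2)

def t_not(a):       # ternary inverter: 0 -> 2, 1 -> 1, 2 -> 0
    return 2 - a

def t_and(a, b):    # ternary AND (also the MIN gate): minimum of inputs
    return min(a, b)

def t_or(a, b):     # ternary OR (also the MAX gate): maximum of inputs
    return max(a, b)

def t_nand(a, b):   # NAND = NOT(AND)
    return t_not(t_and(a, b))

def t_nor(a, b):    # NOR = NOT(OR)
    return t_not(t_or(a, b))

# Truth table for the inverter: [(0, 2), (1, 1), (2, 0)]
inverter_table = [(a, t_not(a)) for a in LEVELS]
```

Under these definitions the compound gates inherit their behaviour from the primitives, mirroring how the paper composes NAND and NOR from AND, OR, and NOT.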


2021 · pp. 111-120
Author(s): Rob Kitchin

This chapter charts the transition from an analogue to a digital world, its effect on data footprints and shadows, and the growth of data brokers and government use of data. The World Wide Web (WWW) started to change things by making information accessible across the Internet through an easy-to-use, intuitive graphical interface. Using the Internet, people started leaving digital traces. In their everyday lives, their digital shadows were also growing through the use of debit, credit, and store loyalty cards, and were captured in government databases that were increasingly digital. Running in tandem with the creation of digital lifestyles was the datafication of everyday life. This was evident in a paper which examined the various ways in which digital data were being generated and tracked using indexical codes about people, but also objects, transactions, interactions, and territories, and how these data were being used to govern people and manage organizations. Today, people live in a world of continuous data production, since smart systems generate data in real time.


2019 · Vol 79 (9-10) · pp. 5719-5741
Author(s): Longdan Tan, Yuliang Lu, Xuehu Yan, Lintao Liu, Xuan Zhou

Abstract Quick response (QR) codes are becoming increasingly popular in various areas of life due to their error correction capacity, the ability to be scanned quickly, and the capacity to contain meaningful content. The distribution of dark and light modules in a QR code looks random, but the content of the code can be decoded by a standard QR reader. Thus, a QR code is often used in combination with visual secret sharing (VSS) to generate meaningful shadows. There may be some losses in the process of distributing and preserving the shadows. To recover secret images with high quality, it is necessary to consider the scheme's robustness. However, few studies examine the robustness of VSS combined with QR codes. In this paper, we propose a robust (k, n)-threshold XOR-ed VSS (XVSS) scheme based on QR codes with error correction ability. Compared with OR-ed VSS (OVSS), XVSS can recover the secret image losslessly, and the amount of computation needed is low. Since the standard QR encoder does not check whether the padding codewords are correct during the encoding phase, we replace the padding codewords with initial shadows shared from the secret image using XVSS to generate QR code shadows. As a result, the shadows can be decoded normally, and their error correction abilities are preserved. Once all the shadows have been collected, the secret image can be recovered losslessly. More importantly, even if conventional image attacks, including rotation, JPEG compression, Gaussian noise, salt-and-pepper noise, cropping, resizing, and even the addition of camera and screen noise, are performed on the shadows, the secret image can still be recovered. The experimental results and comparisons demonstrate the effectiveness of our scheme.
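The lossless-recovery property of XOR-based sharing can be illustrated with the simple (n, n) special case, where all n shadows are required: n − 1 shadows are random, and the last is the XOR of the secret with all of them. The paper's full (k, n)-threshold construction and the embedding into QR padding codewords are more involved and are not reproduced in this sketch:

```python
# Minimal (n, n) XOR secret-sharing sketch illustrating the lossless
# recovery behind XVSS. Function names and structure are our own; the
# paper's (k, n)-threshold scheme and QR embedding differ.
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def share(secret: bytes, n: int) -> list:
    """Split `secret` into n shadows; all n are needed to recover it."""
    shadows = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    # Final shadow is the secret XORed with every random shadow.
    shadows.append(reduce(xor_bytes, shadows, secret))
    return shadows

def recover(shadows: list) -> bytes:
    """XOR of all shadows reconstructs the secret exactly (losslessly)."""
    return reduce(xor_bytes, shadows)
```

Because recovery is a pure XOR, there is no pixel expansion or contrast loss, which is the advantage the abstract claims for XVSS over OR-ed VSS.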


2006 · Vol 134 (8) · pp. 2033-2054
Author(s): Michael J. Brennan, Gary M. Lackmann

Abstract Previous research has shown that a lower-tropospheric diabatically generated potential vorticity (PV) maximum associated with an area of incipient precipitation (IP) was critical to the moisture transport north of the PV maximum into the Carolinas and Virginia during the 24–25 January 2000 East Coast cyclone. This feature was almost entirely absent in short-term (e.g., 6–12 h) forecasts from the 0000 UTC 24 January 2000 operational runs of the National Centers for Environmental Prediction (NCEP) North American Mesoscale (NAM, formerly Eta) and Global Forecast System (GFS, formerly AVN) models, even though it occurred over land within and downstream of a region of relatively high data density. Observations and model analyses are used to document the forcing for ascent, moisture, and instability (elevated gravitational and/or symmetric) associated with the IP, and the evolution of the IP formation is documented with radar and satellite imagery with the goal of understanding the fundamental nature of this precipitation feature and the models’ inability to predict it. Results show that the IP formed along a zone of lower-tropospheric frontogenesis in a region of strong synoptic-scale forcing for ascent downstream of an approaching upper trough and jet streak. The atmosphere above the frontal inversion was characterized by a mixture of gravitational conditional instability and conditional symmetric instability over a deep layer, and this instability was likely released when air parcels reached saturation as they ascended the frontal surface. The presence of elevated convection is suggested by numerous surface reports of thunder and the cellular nature of radar echoes in the region. Short-term forecasts from the Eta and AVN models failed to capture the magnitude of the frontogenesis, upper forcing, or elevated instability in the region of IP formation. 
These findings suggest that errors in the initial condition analyses, particularly in the water vapor field, in conjunction with the inability of model physics schemes to generate the precipitation feature, likely played a role in the operational forecast errors related to inland quantitative precipitation forecasts (QPFs) later in the event. A subsequent study will serve to clarify the role of initial conditions and model physics in the representation of the IP by NWP models.


2012 · Vol 214 · pp. 749-754
Author(s): Xiong He, Yi Yang Gao, Tao Chen

This paper introduces a method of designing and organizing road network data and describes an algorithm based on layered search, suitable for computing vehicle navigation routes in large districts. The algorithm first searches the high-grade road network and then refines the route locally: a point on the computed high-grade route is selected (the selected point should be a node near the end), and the optimal route from the start point to that point is calculated; the end point is handled in the same way. The algorithm was applied to route planning, and the experimental results show that the data structure and algorithm save storage space and greatly improve calculation efficiency.
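The two-stage search described above can be sketched as follows. The adjacency-list representation, node names, and the choice of highway entry and exit nodes are illustrative assumptions, not the paper's actual data structures:

```python
# Sketch of a layered route search: a local leg to a high-grade (highway)
# entry node, a leg across the highway network, and a local leg from the
# exit node to the destination. Graphs are hypothetical examples.
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, cost), ...]}; returns (cost, path)."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []

def layered_route(local, highway, start, end, entry, exit_):
    """Route start -> entry on the local net, entry -> exit_ on the
    highway net, then exit_ -> end on the local net again."""
    c1, p1 = dijkstra(local, start, entry)
    c2, p2 = dijkstra(highway, entry, exit_)
    c3, p3 = dijkstra(local, exit_, end)
    # Drop duplicated junction nodes when concatenating the legs.
    return c1 + c2 + c3, p1 + p2[1:] + p3[1:]
```

Searching the small high-grade graph first, and only expanding local detail near the endpoints, is what lets this style of algorithm save storage and computation on large districts.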


Author(s): Karel Hurts

A previous study showed that performance on integrated tasks is superior for non-configural graphs compared with configural graphs when memory for the graph is tested (retrospective, or memory-based, conditions). Following up on that study, this paper further contrasts retrospective and concurrent (display-based) task performance. This was done by experimentally investigating the effect of various configural and non-configural static graphs on integrated task performance (requiring the consideration of lower-level graph information as well as higher-level graph information), using both retrospective and concurrent conditions. Subjects were asked to answer a question about each graph, which was phrased in terms of the domain of the data and which could not be easily anticipated. Graphs also differed in the amount of fit between graph structure and data structure (data-graph compatibility). The results confirmed the expectation that the reversal effect (inferior performance for configural graphs) is found only under memory-based conditions. Both display-based and memory-based performance were better for the configural graphs with high data-graph compatibility, although only significantly so for display-based search time. The two separable types of graphs could be compared only with respect to the amount of time needed to memorize the graphs: longer times were found for the graph type with low data-graph compatibility. However, the latter effect may also be due to a difference in data structure complexity, as this factor was confounded with data-graph compatibility in the two separable graph types. Although more research is needed to disambiguate some of the present results and to make other and better comparisons, the results of this study still show the importance of structural and semantic factors in determining the effectiveness of configurality in statistical graphs.


Author(s): Gregor Kennedy, Ioanna Ioannou, Yun Zhou, James Bailey, Stephen O'Leary

The analysis and use of data generated by students' interactions with learning systems or programs, known as learning analytics, has recently gained widespread attention in the educational technology community. Part of the reason for this interest is based on the potential of learning analytic techniques such as data mining to find hidden patterns in students' online interactions that can be meaningfully interpreted and then fed back to students in a way that supports their learning. In this paper we present an investigation of how the digital data records of students' interactions within an immersive 3D environment can be mined, modelled, and analysed to provide real-time formative feedback to students as they complete simulated surgical tasks. The issues that emerged in this investigation, as well as areas for further research and development, are discussed.

