Effective thickness determination for volume transmission multiplex holograms

1986 ◽  
Vol 64 (5) ◽  
pp. 553-557 ◽  
Author(s):  
Jean J. A. Couture ◽  
R. A. Lessard

The effective thickness of absorption multiplex holograms is evaluated by a numerical analysis of the angular-selectivity curves obtained during the reconstruction process. These transmission holograms were recorded by sequential and incoherent superposition of 2, 10, or 21 coupled volume gratings. For 21 recorded coupled gratings, the experimental effective thickness is 50% of the nominal thickness of the Kodak 649F plates. The many previously unpublished experimental curves presented in this paper for 10 and 21 gratings make it possible to obtain useful numerical values for the thickness and the angular-selectivity bandwidth of the resultant multiplex holograms.
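The connection between angular-selectivity bandwidth and effective thickness can be sketched with the standard coupled-wave (Kogelnik) rule of thumb for a single lossless transmission grating. This is an illustrative assumption, not the paper's numerical analysis: the small-angle sinc² form and the numbers below are generic, and a real multiplex hologram with coupled gratings requires the more careful treatment the authors perform.

```python
import math

def selectivity(dtheta_rad, d_um, period_um):
    """Approximate angular selectivity of a weak, lossless volume
    transmission grating near Bragg incidence: diffraction efficiency
    falls off roughly as sinc^2(xi), with detuning
    xi = pi * d * dtheta / period (small-angle sketch)."""
    xi = math.pi * d_um * dtheta_rad / period_um
    return 1.0 if xi == 0 else (math.sin(xi) / xi) ** 2

def thickness_from_first_null(null_angle_rad, period_um):
    # The first null of sinc^2 sits at xi = pi, so an effective
    # thickness can be read off a measured selectivity curve as
    # d_eff = period / dtheta_null.
    return period_um / null_angle_rad
```

Under this rule of thumb, a narrower angular-selectivity curve implies a larger effective thickness, which is why fitting measured curves (rather than trusting the nominal emulsion thickness) can reveal the 50% reduction the abstract reports.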

2021 ◽  
Author(s):  
Manuel Chevalier

Abstract. Statistical climate reconstruction techniques are practical tools to study past climate variability from fossil proxy data. In particular, the methods based on probability density functions (PDFs) are powerful at producing robust results from various environments and proxies. However, accessing and curating the necessary calibration data, as well as the complexity of interpreting probabilistic results, often limit their use in palaeoclimatological studies. To address these problems, I present a new R package (crestr) to apply the CREST method (Climate REconstruction SofTware) on diverse palaeoecological datasets. crestr includes a globally curated calibration dataset for six common climate proxies (i.e. plants, beetles, chironomids, rodents, foraminifera, and dinoflagellate cysts) that enables its use in most terrestrial and marine regions. The package can also be used with private data collections instead of, or in combination with, the provided dataset. It also includes a suite of graphical diagnostic tools to represent the data at each step of the reconstruction process and provide insights into the effect of the different modelling assumptions and external factors that underlie a reconstruction. With this R package, the CREST method can now be used in a scriptable environment, thus simplifying its use and integration in existing workflows. It is hoped that crestr will contribute to producing the much-needed quantified records from the many regions where climate reconstructions are currently lacking, despite the existence of suitable fossil records.
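The general idea behind PDF-based reconstruction can be illustrated with a toy sketch. This is not the crestr R API: the Gaussian taxon responses, the temperature grid, and all numbers below are hypothetical stand-ins for the calibrated probability density functions the package derives from its curated dataset.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a Gaussian climate response with optimum mu and
    tolerance sigma (a stand-in for a calibrated taxon PDF)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def reconstruct(taxa, grid):
    """Combine the PDFs of the taxa present in a fossil sample by
    multiplying them on a climate grid, then return the posterior
    mean as a point estimate."""
    post = [math.prod(gaussian_pdf(t, mu, sg) for mu, sg in taxa) for t in grid]
    total = sum(post)
    post = [p / total for p in post]
    return sum(t * p for t, p in zip(grid, post))

# Hypothetical example: two taxa with optima at 15 and 18 deg C
grid = [i * 0.1 for i in range(301)]          # 0..30 deg C
estimate = reconstruct([(15.0, 3.0), (18.0, 4.0)], grid)
```

The full posterior (here, the normalised `post` list) is what makes the method probabilistic: the point estimate comes with an explicit uncertainty distribution, which is the kind of output crestr's diagnostic plots are designed to interpret.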


Author(s):  
Yaming Wang ◽  
Zhikang Luo ◽  
Weqing Huang ◽  
Yonghua Han

Although neural networks are the most common approach to image super-resolution (SR), methods based on decision trees are still of interest. These algorithms need less computation time than others because of their simple structure, yet still yield high-quality SR images. In this paper, we propose an SR algorithm based on the multi-grained cascade forest (SRGCF) method. Our algorithm first uses multi-grained scanning to process the spatial relationships of image features, which improves its representation-learning ability. During the reconstruction process, the image produced by each level of cascade-forest training is used as the input to the next level, so the image features are continuously refined. Training of the cascade forest ends when the evaluation value is optimal. Because decision trees use a divide-and-conquer strategy, the SR image is improved iteratively, simply, and quickly. Compared with existing methods, our method not only avoids the trade-off between reconstruction quality and run time but also has good generalization capability, so it can be applied quickly to the many cases of image SR.
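The cascade control flow the abstract describes can be sketched structurally. This is not the authors' SRGCF or the gcForest library: each stage below is a trivial least-squares affine correction standing in for a random-forest regressor, chosen only so the example stays self-contained; the loop shape (each level fitted on the previous level's output, training stopped when the evaluation value stops improving) is the point.

```python
def fit_stage(x, y):
    """Fit a least-squares affine correction y ~ a*x + b.
    (Stand-in for training a forest regressor at one cascade level.)"""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x) or 1.0
    a = cov / var
    b = my - a * mx
    return lambda z: [a * zi + b for zi in z]

def mse(x, y):
    return sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / len(x)

def train_cascade(x, y, max_levels=10):
    """Grow cascade levels greedily; each level's output feeds the
    next level, and training ends when the evaluation value (here
    MSE against the target) no longer improves."""
    stages, best = [], mse(x, y)
    for _ in range(max_levels):
        stage = fit_stage(x, y)
        out = stage(x)
        score = mse(out, y)
        if score >= best:       # evaluation value stopped improving
            break
        stages.append(stage)
        x, best = out, score    # this level's image feeds the next
    return stages
```

Because the cascade depth is decided by the evaluation score rather than fixed in advance, the model grows only as deep as the data warrants, which is one source of the quality/run-time balance claimed in the abstract.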


Author(s):  
Wolfgang Willenberg ◽  
Marcus Stoffel ◽  
Dieter Weichert

For medical applications, it is desirable to cultivate tendon cells. In addition to the many biochemical requirements for successful cultivation, mechanical stimulation also plays an important role. In particular, it is well known that tendon cells de-differentiate quickly if they are not kept under physiological conditions. For this reason, a new bioreactor for the investigation and cultivation of tenocytes is developed, in which the tenocytes are seeded on a carrier material. To identify the real loads to which the tenocytes are subjected, the material properties of the carrier material are determined by material tests followed by a numerical analysis.


2017 ◽  
Vol 1 (2) ◽  
pp. 380-395 ◽  
Author(s):  
Fabrizio Ivan Apollonio ◽  
Federico Fallavollita ◽  
Elisabetta Caterina Giovannini ◽  
Riccardo Foschi ◽  
Salvatore Corso

Among the many cases concerning the process of digital hypothetical 3D reconstruction, a particular case is constituted by never-realized projects and plans. These are projects that were designed but remained on paper; although documented by technical drawings, they pose the typical problems common to all other cases, from 3D reconstructions of transformed architectures to destroyed or lost buildings and parts of towns. These case studies start from original old drawings, which must be supplemented by different kinds of documentary sources able to provide, by means of evidence, induction, deduction, and analogy, information characterized by different levels of uncertainty and related to different levels of accuracy. All methods adopted in a digital hypothetical 3D reconstruction process show that the goal of all researchers is to make explicit, or at least intelligible, through a graphical system, the value of the reconstructive process that lies behind a particular result. The result of a reconstructive process acts in the definition of three areas, intimately related to each other, which concur to define the digital consistency of the artifact under study: shape (geometry, size, spatial position); appearance (surface features); and constitutive elements (physical form, stratification of building/manufacturing systems). The paper presents a general framework aimed at using 3D models as a means to document and communicate the shape and appearance of never-built architecture, as well as to depict temporal correspondence and allow the traceability of the uncertainty and accuracy that characterize each reconstructed element.


VLSI Design ◽  
1998 ◽  
Vol 8 (1-4) ◽  
pp. 179-184
Author(s):  
I. V. Zozoulenko ◽  
K.-F. Berggren

Electron transport was studied in an open square quantum dot with dimensions typical of current experiments. A numerical analysis of the probability density distribution inside the dot enabled us to unambiguously map the resonant states that dominate the conductance of the structure. It was shown that, despite the presence of the dot openings, transport through the dot is effectively mediated by just a few (or even a single one) of the eigenstates of the corresponding closed structure. In the single-mode regime in the leads, the broadening of the resonant levels is typically smaller than the mean energy level spacing, Δ. By contrast, in the many-mode regime this broadening typically exceeds Δ and has an irregular, essentially non-Lorentzian, character. It was demonstrated that in the latter case the eigenlevel spacing statistics of the corresponding closed system are not relevant to the averaged transport properties of the dot. This conclusion appears to be supported by a number of experimental as well as numerical results.
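The mean level spacing Δ that the resonance broadenings are compared against can be sketched from the closed counterpart of the dot. This back-of-the-envelope example is not the paper's scattering calculation: it just enumerates the textbook eigenenergies of a hard-wall square well, E(n, m) ∝ n² + m² (in units of (ħπ/L)²/2m), and averages the spacings, degeneracies included.

```python
def closed_dot_levels(nmax):
    """Eigenenergies of the closed hard-wall square dot in units of
    (hbar * pi / L)^2 / (2 m): E(n, m) ~ n^2 + m^2, sorted."""
    return sorted(n * n + m * m
                  for n in range(1, nmax + 1)
                  for m in range(1, nmax + 1))

def mean_level_spacing(levels, k):
    """Average spacing Delta over the first k levels (degenerate
    levels contribute zero spacings and pull the average down)."""
    first = levels[:k]
    return (first[-1] - first[0]) / (k - 1)
```

Whether the lead-induced broadening of a resonance is smaller or larger than this Δ is what separates the single-mode regime (isolated, well-resolved resonances) from the many-mode regime (overlapping, non-Lorentzian structure) in the abstract.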


2006 ◽  
Vol 2006 ◽  
pp. 1-11
Author(s):  
Jaegwi Go ◽  
Youngmi Choi ◽  
Zhong Bo Fang

The many interesting phenomena exhibited by a circular arch subjected to symmetric pressure, such as snap-through, nonuniqueness, and stability, are studied. The balance of forces on an elemental length leads to a governing equation that is used to investigate the stable states of the arch. For a specific opening angle 2α = π/3, the sensitivities of the angle and curvature at the base are surveyed for various spring constants. The variations of the angle and curvature at an edge are almost the same if the spring constant τ ≥ 10.


2018 ◽  
Vol 41 ◽  
Author(s):  
Wei Ji Ma

Abstract. Given the many types of suboptimality in perception, I ask how one should test for multiple forms of suboptimality at the same time – or, more generally, how one should compare process models that can differ in any or all of their multiple components. In analogy to factorial experimental design, I advocate for factorial model comparison.
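The factorial idea can be made concrete with a toy sketch: instead of testing one suboptimality at a time, every on/off combination of candidate components is treated as its own model and scored. The component names, the AIC scoring, and the fit function below are illustrative assumptions, not the commentary's specific proposal.

```python
import itertools

# Hypothetical suboptimality components a perceptual model might include
COMPONENTS = ["prior_bias", "decision_noise", "lapse_rate"]

def aic(log_lik, n_params):
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * log_lik

def factorial_comparison(fit):
    """Score every subset of COMPONENTS (the full 2^k factorial grid).
    `fit(subset)` must return the fitted log-likelihood of the model
    containing exactly those components; returns the best subset."""
    scores = {}
    for mask in itertools.product([False, True], repeat=len(COMPONENTS)):
        subset = tuple(c for c, on in zip(COMPONENTS, mask) if on)
        scores[subset] = aic(fit(subset), len(subset))
    return min(scores, key=scores.get)
```

Because the full grid is evaluated, the comparison can reveal interactions between components (e.g. one suboptimality mimicking another) that a one-factor-at-a-time analysis would miss, which is the analogy to factorial experimental design.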


2020 ◽  
Vol 43 ◽  
Author(s):  
David Spurrett

Abstract. Comprehensive accounts of resource-rational attempts to maximise utility shouldn't ignore the demands of constructing utility representations. This can be onerous when, as in humans, there are many rewarding modalities. Another thing best not ignored is the processing demands of making functional activity out of the many degrees of freedom of a body. The target article is almost silent on both.

