TIME SCALE FOR MODELING BEACH CHANGE

1986 ◽  
Vol 1 (20) ◽  
pp. 88 ◽  
Author(s):  
Masahiro Ito ◽  
Yoshito Tsuchiya

A time scale for the similarity of beach change between model and prototype, in transitional beach processes from an initial uniform slope to equilibrium, is developed using a series of small- and large-scale experiments in which the experimental conditions were set up with the scale-model relationship of the authors (1984). The time scale is obtained empirically as a function of the experimental scale. Applying the proposed time scale and the scale-model relationship to model experiments, the similarity of morphological beach changes, such as shoreline change and relative breaker point, was reproduced well within the allowable range of experimental error. A semi-theoretical time scale is obtained from the continuity equation, the sediment transport rate, and the scale-model relationship of the equilibrium beach profile in two-dimensional beach change. The relation between the experimental and semi-theoretical time scales is discussed.

Author(s):  
Karthik Chandran ◽  
Weidong Zhang ◽  
Rajalakshmi Murugesan ◽  
S. Prasanna ◽  
A. Baseera ◽  
...  

Decentralized model reference adaptive control problems are investigated for a class of linear time-invariant two-time-scale models having fast and slow dynamics and unmatched interconnections. Designing a full state feedback controller is a critical task for systems with interrelated dynamics and nonlinear interconnections of time-varying lags; it can, however, be addressed by singular perturbation procedures and time-scale modeling. In this research, a full-order observer-based state feedback control is designed for the two-time-scale system to ensure the stability of the closed-loop system. Then, a decentralized model reference adaptive controller with a novel reference model is designed for the individual subsystems. It is found that the proposed design enforces the system states to track the reference state asymptotically. To investigate the proposed design, an example of a drying process is considered. Simulation results are analyzed to confirm the efficacy of the proposed control scheme.


2018 ◽  
Author(s):  
Luc Vandenbulcke ◽  
Alexander Barth

Abstract. Traditionally, in order for lower-resolution, global- or basin-scale models to benefit from some of the improvements available in higher-resolution regional or coastal models, two-way nesting has to be used. This implies that the parent and child models have to be run together, with an online exchange of information between both models. This approach is often impossible in operational systems, where different model codes are run by different institutions, often in different countries. Therefore, in practice, these systems use one-way nesting with data transfer only from the large-scale model to the regional models. In this article, it is examined whether it is possible to replace the missing model feedback by data assimilation, avoiding the need to run the models simultaneously. Selected variables from the high-resolution forecasts are used as pseudo-observations and assimilated into the lower-resolution models. The method is called upscaling. A realistic test case is set up with a model covering the Mediterranean Sea and a nested model covering its north-western basin. A simulation using only the basin-scale model is compared with a simulation where both models are run using one-way nesting and the upscaling technique is applied to the temperature and salinity variables. It is shown that the representation of some processes, such as the Rhône River plume, is strongly improved in the upscaled model compared to the stand-alone model.
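The upscaling idea, assimilating high-resolution output into the coarse model as pseudo-observations, can be illustrated with a toy nudging step. This is a minimal sketch; the blending scheme, weight, and grid mapping are assumptions, not the paper's actual assimilation method.

```python
import numpy as np

# Illustrative sketch only: the blending scheme, weight, and grid mapping
# are assumptions, not the paper's actual assimilation method.
def block_mean(fine, factor):
    """Average a 1-D fine-grid field onto a coarse grid (factor cells per box)."""
    return fine.reshape(-1, factor).mean(axis=1)

def upscale_step(coarse, fine, weight=0.3):
    """One 'upscaling' analysis step: treat the high-resolution forecast,
    averaged onto the coarse grid, as pseudo-observations and nudge the
    coarse state toward them."""
    pseudo_obs = block_mean(fine, fine.size // coarse.size)
    return (1 - weight) * coarse + weight * pseudo_obs

coarse = np.zeros(4)   # coarse model misses a feature entirely
fine = np.ones(8)      # nested model resolves it (e.g. a river plume signal)
analysed = upscale_step(coarse, fine)
print(analysed)        # coarse state pulled 30% toward the fine-model field
```

A real system would assimilate the pseudo-observations with an observation-error covariance rather than a fixed weight, but the one-way information flow is the same.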


2010 ◽  
Vol 133-134 ◽  
pp. 497-502 ◽  
Author(s):  
Alvaro Quinonez ◽  
Jennifer Zessin ◽  
Aissata Nutzel ◽  
John Ochsendorf

Experiments may be used to verify numerical and analytical results, but large-scale model testing is associated with high costs and lengthy set-up times. In contrast, small-scale model testing is inexpensive, non-invasive, and easy to replicate over several trials. This paper proposes a new method of masonry model generation using three-dimensional printing technology. Small-scale models are created as an assemblage of individual blocks representing the original structure’s geometry and stereotomy. Two model domes are tested to collapse due to outward support displacements, and experimental data from these tests is compared with analytical predictions. Results of these experiments provide a strong understanding of the mechanics of actual masonry structures and can be used to demonstrate the structural capacity of masonry structures with extensive cracking. Challenges for this work, such as imperfections in the model geometry and construction problems, are also addressed. This experimental method can provide a low-cost alternative for the collapse analysis of complex masonry structures, the safety of which depends primarily on stability rather than material strength.


2006 ◽  
Vol 18 (12) ◽  
pp. 2923-2927 ◽  
Author(s):  
Robert J. Calin-Jageman ◽  
Paul S. Katz

After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a “screen-saver” cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net. It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.
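The client/server workflow described above (idle-time clients fetching parameter sets, running a simulation, and returning results) can be sketched with an in-memory mock. This is an illustrative sketch, not NeuronPM's actual protocol or API; the "server" is a queue and the "simulation" is a stand-in for invoking NEURON.

```python
from collections import deque

# Illustrative sketch of the screen-saver cluster workflow; this is not
# NeuronPM's actual protocol or API.
def run_simulation(params):
    """Stand-in for running one NEURON simulation on a parameter set."""
    return {"params": params, "spike_count": params["g_na"] * 10}  # fake metric

def make_server(assignments):
    """Mock server: hands out work assignments and collects results."""
    queue, results = deque(assignments), []
    def fetch():            # client asks for the next work assignment
        return queue.popleft() if queue else None
    def submit(result):     # client returns a finished result
        results.append(result)
    return fetch, submit, results

def client_loop(fetch, submit):
    """During idle time, fetch work, simulate, and return results until done."""
    while (work := fetch()) is not None:
        submit(run_simulation(work))

fetch, submit, results = make_server([{"g_na": g} for g in (1, 2, 3)])
client_loop(fetch, submit)
print(len(results))         # all 3 parameter sets catalogued
```

In the real tool the fetch/submit calls are HTTP requests to the Apache/PHP/MySQL server, but the pull-based division of labor is the same.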


1984 ◽  
Vol 1 (19) ◽  
pp. 95 ◽  
Author(s):  
Masahiro Ito ◽  
Yoshito Tsuchiya

This paper presents a scale-model relationship for the similarity between large- and small-scale models of two-dimensional equilibrium beach profiles. Taking large-scale models built with large-scale equipment as prototypes, the experimental scale of a medium-sized model was gradually varied while keeping the grain-size ratio of model to prototype constant. A similarity comparison between large- and small-scale beach profiles is made by considering the degree of experimental error. The judgement results are shown graphically, and a scale-model relationship is proposed. It is found that the proposed scale-model relationship agrees with the ones derived from the empirical formulae expressing the properties of beach profiles. Additionally, the applicability of this scale-model relationship to the reproduction test of natural beaches is examined.


2019 ◽  
Vol 92 ◽  
pp. 16007
Author(s):  
Noboru Sato ◽  
Toshikazu Sawamatsu ◽  
Takehiko Nitta ◽  
Hiroaki Miyatake ◽  
Kazuhito Kondo

In this study, an inclined model experiment and finite element analyses were conducted to evaluate the failure mode and seismic response of a dry-type large-scale concrete-block retaining wall (LCBW). The objective of the experiment was to reproduce the sliding between concrete blocks that was observed in past cases of LCBW damage, in order to characterise the behaviour up to failure. A numerical simulation corresponding to the experimental conditions was conducted by the finite element method (FEM). Dynamic analyses were also performed by FEM to investigate the seismic response of the concrete blocks under various ground conditions. The experimental results revealed that slip between the concrete blocks caused brittle failure of the LCBW. In the FEM simulation, the joint elements reproduced the experimentally observed sliding between the concrete blocks. A dynamic simulation of the full-scale model revealed that significant sliding and rocking of the concrete blocks occur in a dry-type LCBW. These findings indicate that stress concentration may occur at the heels of the concrete blocks during an earthquake.


2014 ◽  
Vol 1 (2) ◽  
pp. 1715-1734
Author(s):  
L. K. Feschenko ◽  
G. M. Vodinchar

Abstract. Inversion of the magnetic field in a large-scale model of the αΩ-dynamo with nonlocal α-effect is investigated. The model allows us to reproduce the main features of geomagnetic field reversals. It was established that the polarity intervals in the model are distributed according to a power law. The model's magnetic polarity time scale is fractal, and its dimension is consistent with the dimension of the real geomagnetic polarity time scale.
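The fractal dimension of a polarity time scale can be estimated by box counting over the set of reversal times. A minimal sketch, using a hypothetical power-law interval distribution rather than the dynamo model's actual output:

```python
import numpy as np

# Minimal sketch: the interval distribution below is a hypothetical power
# law (Pareto), not output from the dynamo model itself.
def box_counting_dimension(points, epsilons):
    """Estimate the box-counting dimension of a point set on [0, 1]
    (here, the set of reversal times of a polarity time scale)."""
    counts = [len(np.unique(np.floor(points / eps))) for eps in epsilons]
    # dimension = slope of log N(eps) versus log(1/eps)
    slope, _ = np.polyfit(np.log(1.0 / epsilons), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
intervals = rng.pareto(1.5, size=2000) + 1.0   # power-law polarity intervals
reversal_times = np.cumsum(intervals)
reversal_times /= reversal_times[-1]           # normalise to [0, 1]
d = box_counting_dimension(reversal_times, np.logspace(-3, -1, 10))
print(round(d, 2))                             # estimated dimension
```

For a finite sample the estimate is sensitive to the chosen range of box sizes; in practice one checks that the log-log plot is linear over the fitted range.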


PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0257537
Author(s):  
Estel Aparicio-Prat ◽  
Dong Yan ◽  
Marco Mariotti ◽  
Michael Bassik ◽  
Gaelen Hess ◽  
...  

CRISPR base editors are powerful tools for large-scale mutagenesis studies. This kind of approach can elucidate the mechanism of action of compounds, a key process in drug discovery. Here, we explore the utility of base editors in an early drug discovery context, focusing on G-protein-coupled receptors. A pooled mutagenesis screening framework was set up based on a modified version of the CRISPR-X base editor system. We determined optimized experimental conditions for mutagenesis in which sgRNAs are delivered by cell transfection or viral infection over extended time periods (>14 days), resulting in high mutagenesis rates in a short region located at −4/+8 nucleotides with respect to the sgRNA match. The β2 adrenergic receptor (B2AR) was targeted in this way, employing a 6xCRE-mCherry reporter system to monitor its response to isoproterenol. The results of our screening indicate that residue 184 of B2AR is crucial for its activation. Based on our experience, we outline the crucial points to consider when designing and performing CRISPR-based pooled mutagenesis screening, including the typical technical hurdles encountered when studying compound pharmacology.


2018 ◽  
Vol 15 (03) ◽  
pp. 1850009 ◽  
Author(s):  
Xiujuan Liu ◽  
Haijun Wu ◽  
Weikang Jiang

The coefficient matrices of the conventional boundary element method (CBEM) are dense and fully populated. Special techniques, such as the hierarchical matrices (H-matrices) format, are required to extend its ability to handle large-scale problems. The adaptive cross approximation (ACA) algorithm is a widely adopted way to obtain the H-matrices. However, the accuracy of the ACA boundary element method (ACABEM) cannot be adjusted by changing the tolerance [Formula: see text] once it exceeds a certain value. In this paper, the degenerate kernel approximation idea for low-rank matrices is developed into a fast BEM for acoustic problems by exploiting the multipole expansion of the kernel, which is referred to as the multipole expansion H-matrices boundary element method (ME-H-BEM). The newly developed algorithm compresses the far-field submatrices into low-rank submatrices using the expansion terms of Green's function. The obtained H-matrices are applied in conjunction with the generalized minimal residual method (GMRES) to solve acoustic problems. Numerical examples are carefully set up to compare the accuracy, efficiency, and memory consumption of the CBEM, ACABEM, fast multipole boundary element method (FMBEM), and ME-H-BEM. The results for a pulsating sphere indicate that the ME-H-BEM keeps both the storage and operation logarithmic-linear complexity of the H-matrices format, as the ACABEM does. Moreover, the ME-H-BEM can achieve better convergence and higher accuracy than the ACABEM. For the analyzed complicated large-scale model, the ME-H-BEM with an appropriate number of expansion terms has an advantage in efficiency compared with the ACABEM. Compared with the FMBEM, the ME-H-BEM is easier to implement.
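The ACA algorithm compared against above builds a low-rank factorization of a far-field block from individual matrix entries, without ever assembling the block. A minimal sketch of partial-pivoted ACA on an illustrative smooth kernel (an assumption for demonstration, not the paper's acoustic Green's function or its H-matrix block structure):

```python
import numpy as np

# Minimal sketch of partial-pivoted ACA; the kernel and point sets are
# illustrative assumptions, not the paper's acoustic Green's function.
def aca(get_entry, m, n, tol=1e-6, max_rank=50):
    """Build A ~= U @ V from individual entries, never forming A in full."""
    U, V, used = [], [], set()
    frob2, row = 0.0, 0                       # crude running ||U V||_F^2
    for _ in range(max_rank):
        # residual of the pivot row: A[row, :] minus previous rank-1 terms
        r = np.array([get_entry(row, j) for j in range(n)])
        for u, v in zip(U, V):
            r -= u[row] * v
        used.add(row)
        col = int(np.argmax(np.abs(r)))
        pivot = r[col]
        if abs(pivot) < 1e-14:                # row already well approximated
            break
        v = r / pivot
        # residual of the pivot column
        c = np.array([get_entry(i, col) for i in range(m)])
        for u, w in zip(U, V):
            c -= w[col] * u
        U.append(c)
        V.append(v)
        frob2 += np.dot(c, c) * np.dot(v, v)
        if np.linalg.norm(c) * np.linalg.norm(v) < tol * np.sqrt(frob2):
            break                             # new rank-1 term is negligible
        cand = np.abs(c)                      # next pivot row: largest entry
        for i in used:                        # of the new column among
            cand[i] = -1.0                    # rows not yet used
        row = int(np.argmax(cand))
    return np.array(U).T, np.array(V)

# Well-separated 1-D point sets: the far-field situation in which BEM
# blocks admit low-rank compression
x = 1.0 + np.linspace(0.0, 1.0, 40)
y = 3.0 + np.linspace(0.0, 1.0, 40)
entry = lambda i, j: 1.0 / (x[i] + y[j])
U, V = aca(entry, 40, 40, tol=1e-8)
print(U.shape[1])                             # numerical rank far below 40
```

The degenerate-kernel alternative discussed in the paper replaces this entry-sampling loop with explicit expansion terms of the kernel, which is what allows its accuracy to keep improving with the number of terms.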


2015 ◽  
Vol 2 (11) ◽  
pp. 150428 ◽  
Author(s):  
Pierre Broly ◽  
Romain Mullier ◽  
Cédric Devigne ◽  
Jean-Louis Deneubourg

In a patchy environment, how social animals manage conspecific and environmental cues in their choice of habitat is a leading issue for understanding their spatial distribution and their exploitation of resources. Here, we experimentally tested the effects of environmental heterogeneities (artificial shelters) and some of their characteristics (size and fragmentation) on the aggregation process of a common species of terrestrial isopod (Crustacea). One hundred individuals were introduced into three different heterogeneous set-ups and one homogeneous set-up. In all four set-ups, the populations split into two aggregates: one large (approx. 70 individuals) and one smaller (approx. 20 individuals). These aggregates were not randomly distributed in the arena but formed diametrically opposite one another. The similarity of the results among the four set-ups shows that, under experimental conditions, the environmental heterogeneities have a low impact on the aggregation dynamics and spatial patterns of the isopod, merely serving to increase the probability of nucleation of the larger aggregate at these points. By contrast, the regulation of aggregate sizes and the regular distribution of groups are signatures of local amplification processes, in agreement with the short-range activator and long-range inhibitor model (scale-dependent feedbacks). In other words, we show how small-scale interactions may govern large-scale spatial patterns. This experimental illustration of spatial self-organization is an important step towards understanding the complex game of competition among groups in social species.

