Scaling limits for a random boxes model

2019, Vol. 51 (3), pp. 773-801
Author(s): F. Aurzada, S. Schwinn

Abstract: We consider random rectangles in $\mathbb{R}^2$ that are distributed according to a Poisson random measure, i.e. independently and uniformly scattered in the plane. The distributions of the length and the width of the rectangles are heavy tailed with different parameters. We investigate the scaling behaviour of the related random fields as the intensity of the random measure grows to infinity while the mean edge lengths tend to zero. We characterise the arising scaling regimes, identify the limiting random fields, and give statistical properties of these limits.
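As a rough illustration of the setup described in this abstract (a sketch under assumed parameters, not the authors' construction), the following Python snippet scatters rectangle centres according to a homogeneous Poisson random measure on a unit window and draws heavy-tailed Pareto edge lengths with two different tail indices; the intensity, window size, and tail indices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_random_boxes(intensity, window=1.0, alpha_len=1.5, alpha_wid=2.5):
    """Scatter rectangle centres as a homogeneous Poisson process on
    [0, window]^2; edge lengths are heavy-tailed Pareto variables with
    the given (illustrative) tail indices."""
    n = rng.poisson(intensity * window ** 2)
    centres = rng.uniform(0.0, window, size=(n, 2))
    lengths = 1.0 + rng.pareto(alpha_len, size=n)  # classical Pareto, minimum 1
    widths = 1.0 + rng.pareto(alpha_wid, size=n)
    return centres, lengths, widths

def field_value(point, centres, lengths, widths):
    """Associated random field: number of rectangles covering `point`."""
    dx = np.abs(centres[:, 0] - point[0])
    dy = np.abs(centres[:, 1] - point[1])
    return int(np.sum((dx <= lengths / 2) & (dy <= widths / 2)))
```

Letting the intensity grow while rescaling the edge lengths toward zero is the regime studied in the paper; this sketch only sets up the pre-limit field.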

2016, Vol. 53 (3), pp. 857-879
Author(s): Vytautė Pilipauskaitė, Donatas Surgailis

Abstract: We obtain a complete description of anisotropic scaling limits of the random grain model on the plane with heavy-tailed grain area distribution. The scaling limits have either independent or completely dependent increments along one or both coordinate axes and include stable, Gaussian, and ‘intermediate’ infinitely divisible random fields. The asymptotic form of the covariance function of the random grain model is obtained. Application to superimposed network traffic is included.


Entropy, 2021, Vol. 23 (10), pp. 1323
Author(s): Gareth W. Peters, Ido Nevat, Sai Ganesh Nagarajan, Tomoko Matsui

A class of models for non-Gaussian spatial random fields is explored for spatial field reconstruction in environmental and sensor network monitoring. The family of models utilises a class of transformation functions, the Tukey g-and-h transformations, to create warped spatial Gaussian process models that support desirable features such as flexible marginal distributions, which can be skewed, leptokurtic, and/or heavy-tailed. The resulting model is widely applicable in a range of spatial field reconstruction applications. To use the model in practice, it is important to carefully characterise the statistical properties of the Tukey g-and-h random fields. In this work, we study the properties of the resulting warped Gaussian processes and use these characterising statistical properties to obtain flexible spatial field reconstructions. In this regard we derive five different estimators for important quantities often considered in spatial field reconstruction problems: the multi-point Minimum Mean Squared Error (MMSE) estimators, the multi-point Maximum A-Posteriori (MAP) estimators, an efficient class of multi-point linear estimators based on the Spatial Best Linear Unbiased Estimator (S-BLUE), and two multi-point threshold-exceedance-based estimators, namely the Spatial Regional and Level Exceedance estimators. Simulation results and a real-data application to environmental monitoring show the benefits of using the Tukey g-and-h transformation over standard Gaussian spatial random fields.
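For concreteness, here is a minimal Python sketch (an illustration under assumed parameter values, not the authors' implementation) of the Tukey g-and-h transformation applied pointwise to warp a Gaussian sample; white noise stands in for a proper Gaussian process draw, and the `g` and `h` values are arbitrary.

```python
import numpy as np

def tukey_gh(z, g, h):
    """Tukey g-and-h transform of a standard normal input:
    g controls skewness, h >= 0 controls tail heaviness;
    g = h = 0 recovers the identity (Gaussian marginals)."""
    z = np.asarray(z, dtype=float)
    base = z if g == 0 else np.expm1(g * z) / g
    return base * np.exp(h * z ** 2 / 2.0)

# Warp a toy Gaussian field sample (white noise stands in for a GP draw).
rng = np.random.default_rng(1)
gauss_field = rng.standard_normal((64, 64))
warped = tukey_gh(gauss_field, g=0.5, h=0.1)  # skewed, heavy-tailed marginals
```

In a full model the warp is applied to a correlated Gaussian process sample, and the estimators listed above operate on the warped field.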


2003, Vol. 35 (3), pp. 793-805
Author(s): Sem Borst, Bert Zwart

We determine the exact large-buffer asymptotics for a mixture of light-tailed and heavy-tailed input flows. Earlier studies have found a ‘reduced-load equivalence’ in situations where the peak rate of the heavy-tailed flows plus the mean rate of the light-tailed flows is larger than the service rate. In that case, the workload is asymptotically equivalent to that in a reduced system, which consists of a certain ‘dominant’ subset of the heavy-tailed flows, with the service rate reduced by the mean rate of all other flows. In the present paper, we focus on the opposite case where the peak rate of the heavy-tailed flows plus the mean rate of the light-tailed flows is smaller than the service rate. Under mild assumptions, we prove that the workload distribution is asymptotically equivalent to that in a somewhat ‘dual’ reduced system, multiplied by a certain prefactor. The reduced system now consists of only the light-tailed flows, with the service rate reduced by the peak rate of the heavy-tailed flows. The prefactor represents the probability that the heavy-tailed flows have sent at their peak rate for more than a certain amount of time, which may be interpreted as the ‘time to overflow’ for the light-tailed flows in the reduced system. The results provide crucial insight into the typical overflow scenario.


2004, Vol. 25 (2), pp. 217-234
Author(s): Piotr Kokoszka, Michael Wolf

2013, Vol. 730, pp. 593-606
Author(s): L. Djenidi, S. F. Tardu, R. A. Antonia

Abstract: A long-time direct numerical simulation (DNS) based on the lattice Boltzmann method is carried out for grid turbulence with a view to comparing spatially averaged statistical properties in planes perpendicular to the mean flow with their temporal counterparts. The results show that the two averages become equal a short distance downstream of the grid. This equality indicates that the flow has become homogeneous in a plane perpendicular to the mean flow. This is an important result, since it confirms that hot-wire measurements are appropriate for testing theoretical results based on spatially averaged statistics. It is equally important in the context of DNS of grid turbulence, since it justifies the use of spatial averaging along a lateral direction and over several realizations for determining various statistical properties. Finally, the very good agreement between temporal and spatial averages validates the comparison between temporal (experimental) and spatial (DNS) statistical properties. The results are also interesting because, since the flow is stationary in time and spatially homogeneous along lateral directions, the equality between the two types of averaging provides strong support for the ergodic hypothesis in grid turbulence in planes perpendicular to the mean flow.
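The temporal-versus-ensemble averaging comparison at the heart of this abstract can be mimicked on a toy stationary process. The sketch below uses an AR(1) series as a stand-in for the signal at a fixed probe location (the process and its parameter are assumptions for illustration only), comparing one long time average against an ensemble average over independent short realizations.

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1(n, phi=0.7):
    """One realization of a stationary AR(1) process, a toy stand-in
    for the velocity signal at a fixed probe location."""
    x = np.empty(n)
    x[0] = rng.standard_normal() / np.sqrt(1.0 - phi ** 2)  # stationary start
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

# Temporal average: a single long record (what a hot wire measures).
time_avg = np.mean(ar1(100_000) ** 2)

# Ensemble ("spatial") average: many independent shorter realizations.
ens_avg = np.mean([np.mean(ar1(2_000) ** 2) for _ in range(50)])
```

For an ergodic stationary process both estimates converge to the same second moment, here 1/(1 - 0.7²) ≈ 1.96, mirroring the equality of plane and time averages reported above.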


2021, Vol. 0 (0)
Author(s): Khalid Oufdil

Abstract: In this paper, we study one-dimensional backward stochastic differential equations (BSDEs) under logarithmic growth in the z-variable, i.e. growth of order $\lvert z\rvert\sqrt{\lvert\ln\lvert z\rvert\rvert}$. We show existence and uniqueness of the solution when the noise is driven by a Brownian motion and an independent Poisson random measure. In addition, we highlight the connection of such BSDEs with a stochastic optimal control problem, for which we show the existence of an optimal strategy.


2018, Vol. 50 (3), pp. 706-725
Author(s): Julie Fournier

Abstract: A deterministic map θ: ℝ² → ℝ² deforms the plane bijectively and regularly, and allows the construction of a deformed random field X∘θ: ℝ² → ℝ from a regular, stationary, and isotropic random field X: ℝ² → ℝ. The deformed field X∘θ is in general not isotropic (and not even stationary); however, we provide an explicit characterization of the deformations θ that preserve isotropy. Further assuming that X is Gaussian, we introduce a weak form of isotropy of the field X∘θ, defined by an invariance property of the mean Euler characteristic of some of its excursion sets. We prove that deformed fields satisfying this property are strictly isotropic. In addition, we are able to identify θ, assuming that the mean Euler characteristic of excursion sets of X∘θ over some basic domain is known.
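As an illustration of the objects involved (a sketch under assumed choices, not the paper's construction), the snippet below builds a stationary isotropic field from a random cosine expansion and composes it with a rotation, one deformation that trivially preserves isotropy; swapping the rotation for a shear would break it.

```python
import numpy as np

rng = np.random.default_rng(3)

# Spectral sketch of a stationary isotropic field: random-direction
# wavevectors with rotation-invariant magnitudes.
K = 200
dirs = rng.normal(size=(K, 2))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
wavevecs = rng.rayleigh(1.0, size=K)[:, None] * dirs
phases = rng.uniform(0.0, 2.0 * np.pi, size=K)

def X(p):
    """Stationary, isotropic random field evaluated at a point p in R^2."""
    return np.sqrt(2.0 / K) * np.sum(np.cos(wavevecs @ p + phases))

def theta(p, angle=0.3):
    """Deformation of the plane: a rotation, which preserves isotropy."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([c * p[0] - s * p[1], s * p[0] + c * p[1]])

def X_deformed(p):
    """The deformed field X∘θ."""
    return X(theta(p))
```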


2002, Vol. 17 (1-2), pp. 19-26
Author(s): Ljiljana Kostic

The Feynman-alpha and Rossi-alpha methods are used in traditional nuclear reactors to determine the subcritical reactivity of a system. The methods are based on the measurement of the mean value, variance, and covariance of detector counts over different measurement times. Such methods have attracted renewed attention recently with the advent of accelerator-driven systems (ADS). These systems, intended for energy generation or for the transmutation of transuranium elements, will use a subcritical core with a strong spallation source. A spallation source has statistical properties different from those of the radioactive sources traditionally used. In such reactors the monitoring of the subcritical reactivity is very important, and a statistical method such as the Feynman-alpha method is capable of resolving this problem.
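The core of the Feynman-alpha method is the excess variance-to-mean ratio of gated detector counts. The short sketch below (an illustration on an assumed Poisson toy source, not reactor data) computes that ratio: for a pure Poisson source it is close to zero, while correlated fission-chain counts push it above zero.

```python
import numpy as np

def feynman_y(counts):
    """Feynman Y statistic: Var/Mean - 1 of detector counts collected
    in equal-length gates. Y = 0 for a pure Poisson source; Y > 0
    signals correlated (fission-chain) detection events."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean() - 1.0

rng = np.random.default_rng(4)
poisson_gates = rng.poisson(50.0, size=10_000)  # uncorrelated toy source
y = feynman_y(poisson_gates)                    # close to zero
```

In practice Y is measured over a range of gate lengths T and fitted against the theoretical Y(T) curve to extract the prompt-neutron decay constant alpha.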

