Two-dimensional probability distribution of systems driven by colored noise

1991 ◽  
Vol 43 (2) ◽  
pp. 700-706 ◽  
Author(s):  
Hu Gang


2018 ◽  
Vol 23 ◽  
pp. 00037 ◽  
Author(s):  
Stanisław Węglarczyk

Kernel density estimation is a technique for estimating a probability density function that enables the user to analyse the studied probability distribution better than a traditional histogram does. Unlike the histogram, the kernel technique produces a smooth estimate of the pdf, uses the locations of all sample points, and more convincingly suggests multimodality. In its two-dimensional applications, kernel estimation is even more advantageous, since the 2D histogram additionally requires defining the orientation of the 2D bins. Two concepts play a fundamental role in kernel estimation: the shape of the kernel function and the smoothing coefficient (bandwidth), of which the latter is crucial to the method. Several real-life examples, both univariate and bivariate, are shown.
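The contrast the abstract draws between a histogram and a smooth kernel estimate can be illustrated with a minimal sketch. This uses SciPy's `gaussian_kde` (Gaussian kernel, Scott's-rule bandwidth) on a synthetic bimodal sample; the mixture parameters are illustrative assumptions, not data from the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Bimodal sample: a mixture of two normals, the kind of case where a
# coarse histogram can hide the two modes but a smooth KDE reveals them.
sample = np.concatenate([rng.normal(-2.0, 0.5, 500),
                         rng.normal(+2.0, 0.5, 500)])

kde = gaussian_kde(sample)         # bandwidth chosen by Scott's rule
grid = np.linspace(-5.0, 5.0, 401)
density = kde(grid)

# The estimate integrates to ~1 and is higher near the two modes
# than in the valley between them.
area = float(density.sum() * (grid[1] - grid[0]))
```

The bandwidth is the crucial smoothing coefficient the abstract mentions: `gaussian_kde` also accepts a `bw_method` argument to over- or under-smooth the same sample.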


Nodes are treated as characteristic points of the data for modeling and analysis. A model of the data can be built by choosing a probability distribution function and a combination of nodes. A two-dimensional object is extrapolated and interpolated via node combinations and various discrete or continuous probability distribution functions: polynomial, sine, cosine, tangent, cotangent, logarithm, exponential, arc sine, arc cosine, arc tangent, arc cotangent, or power functions. Curve interpolation represents one of the most important problems in mathematics and computer science: how can a curve be modeled from a discrete set of two-dimensional points? The questions of shape representation (as a closed curve, i.e. a contour) and of curve parameterization also remain open. Pattern recognition, signature verification, and handwriting identification, for example, are based on modeling curves via the choice of key points. Interpolation is thus not only a purely mathematical problem but also an important task in computer vision and artificial intelligence.
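The core problem stated above, reconstructing a planar curve from a discrete set of nodes, can be sketched generically by parameterizing the curve as t → (x(t), y(t)) and interpolating each coordinate. This is a plain Lagrange-interpolation sketch under a uniform parameterization, not the specific node-combination method of the abstract; the node coordinates are made up for illustration.

```python
import numpy as np

# Discrete 2-D nodes (illustrative values) and a simple uniform parameter.
nodes = np.array([(0.0, 0.0), (1.0, 2.0), (2.0, 1.5), (3.0, 3.0)])
t = np.arange(len(nodes), dtype=float)

def lagrange_eval(ts, ys, s):
    """Evaluate the Lagrange interpolant through (ts, ys) at parameter s."""
    total = 0.0
    for i, yi in enumerate(ys):
        term = float(yi)
        for j, tj in enumerate(ts):
            if j != i:
                term *= (s - tj) / (ts[i] - tj)
        total += term
    return total

# The interpolant reproduces every node exactly and fills in points
# of the curve between them, e.g. halfway between nodes 1 and 2:
mid = (lagrange_eval(t, nodes[:, 0], 1.5),
       lagrange_eval(t, nodes[:, 1], 1.5))
```

A closed contour would instead use a periodic parameterization (e.g. repeating the first node at the end), which is one of the open representation questions the abstract raises.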


2011 ◽  
Vol 18 (1-2) ◽  
pp. 13-19 ◽  
Author(s):  
Zhong Lv ◽  
Huisu Chen ◽  
Haifeng Yuan

Abstract
Cracks are highly detrimental to the load-bearing capacity of materials and, further, to the durability and service life of various structures. Crack-repairing technology via embedded capsules containing a repair agent is becoming a promising approach to sustaining the performance of structural materials. However, the appropriate dosage of encapsulated repair agent for autonomic healing has not been theoretically determined in the literature. In this study, taking cementitious materials as an example, the surface cracks caused by various mechanisms are first simplified as linear cracks and zonal cracks in the two-dimensional plane. Then, from the viewpoint of geometrical probability, theoretical solutions for the exact dosage of capsules required are derived for the different crack models using integral geometry and the concepts of probability distributions. Finally, the reliability of these theoretical solutions is verified via computer modeling.
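The geometrical-probability reasoning and the computer-model verification the abstract describes can be illustrated with a toy Monte Carlo check: the probability that a circular capsule of radius r, placed uniformly at random in a unit square, intersects a fixed linear crack of length l. The unit-square domain, single straight crack, and parameter values are simplifying assumptions for illustration, not the paper's full model.

```python
import numpy as np

rng = np.random.default_rng(1)

r = 0.05                                     # capsule radius (assumed)
seg_a = np.array([0.3, 0.5])                 # crack endpoints (assumed),
seg_b = np.array([0.7, 0.5])                 # chosen so the dilated crack
                                             # stays inside the unit square
centers = rng.uniform(0.0, 1.0, size=(200_000, 2))

# Distance from each capsule centre to the crack segment.
ab = seg_b - seg_a
t = np.clip((centers - seg_a) @ ab / (ab @ ab), 0.0, 1.0)
nearest = seg_a + t[:, None] * ab
dist = np.linalg.norm(centers - nearest, axis=1)

p_hit = (dist <= r).mean()

# Geometrical probability: the hit region is the crack dilated by r,
# a 2r-by-l rectangle plus two half-disks, with area 2*r*l + pi*r**2.
l = np.linalg.norm(ab)
p_exact = 2 * r * l + np.pi * r ** 2
```

The simulated hit rate matches the dilated-crack area, which is the same kind of agreement between integral-geometry formulas and computer modeling that the study reports for its crack models.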


Author(s):  
TZE FEN LI ◽  
SHIAW-SHIAN YU

A simplified Bayes rule is used to classify 5401 categories of handwritten Chinese characters. The main feature for the Bayes rule is the probability distribution of the black pixels of a thinned character. Our idea is that each Chinese character, represented by its black pixels, defines a probability distribution in the two-dimensional plane. An unknown pattern is therefore classified into one of 5401 different distributions by the Bayes rule. Since a handwritten character has irregular shape variation, the whole character is normalized and then thinned. Finally, a transformation is used to spread the black pixels uniformly over the whole square plane while preserving the relative positions of the original black pixels. The main feature gives an 88.65% recognition rate. To raise the recognition rate, 4 additional subsidiary features are carefully selected so that they are not much affected by the irregular shape variation. These 4 features raise the recognition rate to 93.43%. A 99.30% recognition rate is achieved if the top 10 candidate categories are selected by our recognition method, and 99.61% if the top 20 are selected.
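The central idea, treating each character class as a 2D probability distribution over pixel positions and assigning an unknown pixel set to the most likely class, can be sketched as follows. The two toy "class" distributions on an 8×8 grid are illustrative assumptions, not real Chinese-character templates, and equal priors are assumed so the Bayes rule reduces to maximum likelihood.

```python
import numpy as np

GRID = 8

def make_class(peak):
    """Toy class distribution: pixel-position probability mass
    concentrated around `peak` = (row, col)."""
    yy, xx = np.mgrid[0:GRID, 0:GRID]
    w = np.exp(-((yy - peak[0]) ** 2 + (xx - peak[1]) ** 2) / 4.0)
    return w / w.sum()

# Two hypothetical character classes with different pixel distributions.
classes = [make_class((2, 2)), make_class((5, 5))]

def classify(pixels):
    """Bayes rule with equal priors: assign the black-pixel set to the
    class whose distribution gives it the highest log-likelihood."""
    rows, cols = zip(*pixels)
    scores = [np.log(p[rows, cols] + 1e-12).sum() for p in classes]
    return int(np.argmax(scores))

# A pixel cluster near (2, 2) is assigned to class 0, one near (5, 5)
# to class 1.
label = classify([(2, 2), (2, 3), (3, 2), (1, 2)])
```

In the paper the same rule runs over 5401 class distributions estimated from normalized, thinned training characters; the sketch only swaps in two synthetic distributions to keep the example self-contained.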


Fractals ◽  
2003 ◽  
Vol 11 (supp01) ◽  
pp. 19-27 ◽  
Author(s):  
M. BARTHELEMY ◽  
S. V. BULDYREV ◽  
S. HAVLIN ◽  
H. E. STANLEY

In the first part, we study the backbone connecting two given sites of a two-dimensional lattice separated by an arbitrary distance r in a system of size L. We find a scaling form for the average backbone mass, and we also propose a scaling form for the probability distribution P(MB) of the backbone mass for a given r. For r ≈ L, P(MB) is peaked around L^{dB}, whereas for r ≪ L, P(MB) decreases as a power law, P(MB) ~ MB^{-τB}, with τB ≃ 1.20 ± 0.03. The exponents ψ and τB satisfy the relation ψ = dB(τB − 1), where ψ is the codimension of the backbone, ψ = d − dB. In the second part, we study the multifractal spectrum of the current in the two-dimensional random resistor network at the percolation threshold. Our numerical results suggest that in the infinite-system limit, the probability distribution behaves for small i as P(i) ~ 1/i, where i is the current. As a consequence, the moments of i of order q ≤ qc = 0 diverge with system size, and all sets of bonds with current values below the most probable one have the fractal dimension of the backbone. Hence we hypothesize that the backbone can be described in terms of only (i) blobs of fractal dimension dB and (ii) high-current-carrying bonds of fractal dimension ranging from d_red to dB, where d_red is the fractal dimension of the red bonds carrying the maximal current.
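The moment divergence claimed above for P(i) ~ 1/i can be demonstrated numerically: draw currents log-uniformly on [eps, 1] (which is exactly the normalized 1/i density with a lower cutoff) and watch how moments of negative versus positive order q react as the cutoff eps shrinks, mimicking growing system size. The cutoff distribution and the sample values of q are simplifying assumptions for illustration, not the resistor-network simulation itself.

```python
import numpy as np

rng = np.random.default_rng(3)

def moment(q, eps, n=1_000_000):
    """Monte Carlo estimate of <i**q> for i with density ~ 1/i on
    [eps, 1], i.e. ln(i) uniform on [ln(eps), 0]."""
    i = eps ** rng.uniform(0.0, 1.0, n)
    return float((i ** q).mean())

# q < 0 (below q_c = 0): the moment blows up as the lower cutoff
# shrinks, i.e. as the system grows.
diverging = [moment(-0.5, 1e-4), moment(-0.5, 1e-8)]

# q > 0: the moment stays bounded as the cutoff shrinks.
bounded = [moment(0.5, 1e-4), moment(0.5, 1e-8)]
```

Analytically, <i^q> = (1 − eps^q) / (q · ln(1/eps)) for q ≠ 0, which grows like eps^q / |q| ln(1/eps) for q < 0 and stays finite for q > 0, matching the paper's statement that all moments of order q ≤ 0 diverge.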

