Gamut estimation using 2D surface splines

Author(s):  
Mark Shaw

2004 ◽  
Vol 10 (2) ◽  
pp. 131-141  
Author(s):  
Luc Schueremans ◽  
Dionys Van Gemert

Safety, reliability and risk are key issues in the preservation of our built cultural heritage. Several structural collapses have made us aware of the vulnerability of our technical and natural environment and demand an adequate engineering response. In the analysis phase, an objective way to assess the safety of the structure is essential. This raises the need for a reliability-based assessment framework for existing masonry structures. Although this field of research is relatively young, different techniques have been proposed and optimised. These permit calculation of the global probability of failure of complex structures, relying on deterministic techniques able to calculate the stability state for a prescribed set of parameters. This paper illustrates how these techniques can be a valid tool to evaluate the bearing capacity of existing structures. The focus is on reliability methods based on simulation procedures (Monte Carlo, Directional Sampling), combined with an adaptive meta-model (Response Surface, Splines, Neural Networks). Several benchmark examples demonstrate the applicability of the methodology, and the mutual efficiency of the different reliability algorithms is discussed. The application focuses on the assessment of an existing masonry structure: the overall stability of a Romanesque city wall of Leuven (Belgium) is studied in detail. The analysis treats the present safety of the city wall with regard to the uncertainties in load, geometry and resistance. Because of the low degree of safety of several parts of the wall, consolidation measures and strengthening techniques are proposed.
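The core of the simulation procedures mentioned above can be sketched with a crude Monte Carlo estimator of the probability of failure P_f = P(g(X) ≤ 0) for a limit-state function g. The limit state g = R − S and the distributions below are purely illustrative assumptions, not values from the city-wall study:

```python
import random

def monte_carlo_pf(limit_state, sample, n=100_000, seed=42):
    """Crude Monte Carlo estimate of the probability of failure
    P_f = P(g(X) <= 0), counting how often the limit state is violated."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) <= 0)
    return failures / n

# Hypothetical limit state g = R - S: resistance R minus load effect S.
def g(x):
    r, s = x
    return r - s

# Hypothetical normal random variables (illustrative numbers only).
def draw(rng):
    return (rng.gauss(10.0, 1.0), rng.gauss(6.0, 1.5))

pf = monte_carlo_pf(g, draw, n=200_000)
```

For this toy case R − S is normal with mean 4 and standard deviation √(1² + 1.5²) ≈ 1.80, so the reliability index is β ≈ 2.22 and P_f ≈ 1.3 × 10⁻². In practice each evaluation of g may require a full deterministic stability analysis, which is why the abstract pairs simulation with an adaptive meta-model that substitutes a cheap approximation of g.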


1972 ◽  
Vol 9 (2) ◽  
pp. 189-191 ◽  
Author(s):  
ROBERT L. HARDER ◽  
ROBERT N. DESMARAIS

Geophysics ◽  
1985 ◽  
Vol 50 (12) ◽  
pp. 2831-2848 ◽  
Author(s):  
Pedro Gonzalez‐Casanova ◽  
Roman Alvarez

Modeling and contouring of geophysical data often require distributions of regularly spaced values. Splines have been shown to be the most accurate methods to obtain such distributions. We emphasize the general problem of interpolating random distributions of data on a given surface. Splines are classified into unidimensional, quasi‐bidimensional, and strictly bidimensional; based on this classification, a systematic derivation of the corresponding interpolating techniques is conducted. Two approaches are presented to obtain unidimensional splines: one based on the continuity of the first and second derivatives of the polynomials involved, and the other based on a variational approach. Quasi‐bidimensional splines are constructed based on the unidimensional approach, while strictly bidimensional splines are generated by minimizing the bidimensional curvature. Quasi‐bidimensional splines can be used for processing data distributions along nearly parallel lines; linear projections and parameterization are the techniques used in interpolating this type of distribution. Strictly bidimensional splines minimize curvature through the analytic solution of the Euler‐Lagrange equation or by a finite‐difference algorithm. The maximum error, mean error, and standard deviation between interpolated data and exact field values produced by various prisms show that quasi‐bidimensional splines are 2.7 percent more accurate in the maximum error than strictly bidimensional splines when both techniques are applied to regularly spaced data. However, for irregularly spaced data, three examples containing 300, 600, and 900 random data points show the superiority of the thin‐plate approach over the quasi‐bidimensional splines. 
A comparison between various interpolation densities on regular grids, starting from a set of 327 randomly distributed magnetic stations, illustrates some differences between geophysically meaningful interpolations and interpolations carried out only for contouring purposes.
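The strictly bidimensional (thin-plate) spline described above minimizes the bidimensional curvature, and its analytic form is a sum of radial terms r² log r plus an affine part. A minimal NumPy sketch of fitting and evaluating such a spline on scattered data (function names and data are illustrative, not from the paper):

```python
import numpy as np

def tps_fit(points, values):
    """Fit a thin-plate spline f(p) = sum_i w_i * phi(|p - p_i|) + a0 + a1*x + a2*y,
    with phi(r) = r^2 log r, the minimizer of the bending (curvature) energy."""
    p = np.asarray(points, float)
    v = np.asarray(values, float)
    n = len(p)
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d > 0, d**2 * np.log(d), 0.0)   # radial basis matrix
    P = np.hstack([np.ones((n, 1)), p])              # affine part [1, x, y]
    A = np.zeros((n + 3, n + 3))                     # saddle-point system with
    A[:n, :n] = K                                    # orthogonality side conditions
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.concatenate([v, np.zeros(3)])
    sol = np.linalg.solve(A, b)
    return p, sol[:n], sol[n:]

def tps_eval(model, query):
    """Evaluate the fitted spline at query points (m, 2)."""
    centers, w, a = model
    q = np.atleast_2d(np.asarray(query, float))
    d = np.linalg.norm(q[:, None, :] - centers[None, :, :], axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d > 0, d**2 * np.log(d), 0.0)
    return K @ w + a[0] + q @ a[1:]

# Illustrative scattered stations and values (not the 327-station data set).
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.4, 0.7)]
vals = [1 + 2 * x + 3 * y for x, y in pts]
model = tps_fit(pts, vals)
```

The side conditions force the radial weights to be orthogonal to the affine part, so the spline reproduces planar trends exactly and interpolates the scattered values, which is the behavior the thin-plate approach relies on for irregularly spaced stations.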

