Smoothing Algorithm for Planar and Surface Mesh Based on Element Geometric Deformation

2015 ◽  
Vol 2015 ◽  
pp. 1-9 ◽  
Author(s):  
Shuli Sun ◽  
Minglei Zhang ◽  
Zhihong Gou

Smoothing is one of the basic procedures for improving mesh quality. In this paper, a novel and efficient smoothing approach for planar and surface meshes based on element geometric deformation is developed. The presented approach involves two main stages. The first stage is geometric deformation of all the individual elements through a specially designed two-step stretching-shrinking operation (SSO), which is performed by moving the vertices of each element according to a certain rule in order to improve the shape of the element. The second stage determines the position of each node of the mesh by a weighted-average strategy based on the quality changes of its adjacent elements. The suggested SSO-based smoothing algorithm works efficiently for triangular meshes and extends naturally to quadrilateral, arbitrary polygonal, and mixed meshes. Combined with the quadric error metric (QEM), this approach may also be applied to improve the quality of surface meshes. The proposed method is simple to program and inherently well suited to parallelization, especially on graphics processing units (GPUs). Results of numerical experiments demonstrate the effectiveness and potential of this method.
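The two-stage scheme described above (per-element shape improvement, then quality-weighted node averaging) can be sketched for triangular meshes as follows. This is a simplified illustration and not the authors' exact SSO: here each triangle proposes, for each of its vertices, the apex of an equilateral triangle erected on the opposite edge, and every free node then moves to a weighted average of the proposals, with worse-shaped elements weighted more heavily. All function names are hypothetical.

```python
import math

def tri_quality(a, b, c):
    # Normalized shape quality in (0, 1]; equals 1 for an equilateral triangle:
    # q = 4*sqrt(3)*area / (sum of squared edge lengths)
    ax, ay = a; bx, by = b; cx, cy = c
    area = abs((bx - ax) * (cy - ay) - (cx - ax) * (by - ay)) / 2.0
    ssq = ((ax - bx) ** 2 + (ay - by) ** 2 + (bx - cx) ** 2 +
           (by - cy) ** 2 + (cx - ax) ** 2 + (cy - ay) ** 2)
    return 4.0 * math.sqrt(3.0) * area / ssq if ssq > 0 else 0.0

def ideal_apex(p, q):
    # Apex of the equilateral triangle built on edge p->q (counter-clockwise side).
    mx, my = (p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0
    ex, ey = q[0] - p[0], q[1] - p[1]
    h = math.sqrt(3.0) / 2.0
    return (mx - h * ey, my + h * ex)

def smooth(nodes, elems, fixed, iters=10):
    # Each CCW triangle proposes an ideal position for each of its vertices;
    # a node's update is the average of proposals weighted by (1 - quality),
    # so badly shaped elements pull harder.
    nodes = list(nodes)
    for _ in range(iters):
        prop = {i: [] for i in range(len(nodes))}
        for (i, j, k) in elems:
            for v, (e0, e1) in ((i, (j, k)), (j, (k, i)), (k, (i, j))):
                target = ideal_apex(nodes[e0], nodes[e1])
                w = 1.0 - tri_quality(nodes[e0], nodes[e1], nodes[v])
                prop[v].append((w, target))
        for v, plist in prop.items():
            if v in fixed or not plist:
                continue
            wsum = sum(w for w, _ in plist)
            if wsum <= 1e-12:
                continue
            nodes[v] = (sum(w * t[0] for w, t in plist) / wsum,
                        sum(w * t[1] for w, t in plist) / wsum)
    return nodes
```

Because every node update reads only a snapshot of its adjacent elements, each pass is embarrassingly parallel, which matches the abstract's observation about GPU suitability.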

1965 ◽  
Vol 41 (2) ◽  
pp. 215-221
Author(s):  
P. L. Northcott

The need to compare individuals is discussed briefly. It is suggested that the composite quality of an individual is best defined as the weighted sum of a number of measurable characteristics of the individual. A statistical procedure for comparison of weighted average quality is derived from application of the principle of the linear combination of variables. A digital computer program is available.
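The weighted-sum definition of composite quality, together with the variance of a linear combination of independent measurements that underlies the statistical comparison, might be sketched as follows (hypothetical helper names; the original procedure is not reproduced here):

```python
def composite_quality(values, weights):
    # Composite quality as the weighted sum of measurable characteristics.
    return sum(w * v for w, v in zip(weights, values))

def composite_variance(variances, weights):
    # Variance of a linear combination of independent measurements:
    # Var(sum w_i * X_i) = sum w_i^2 * Var(X_i)
    return sum(w * w * s for w, s in zip(weights, variances))
```

Two individuals' composites can then be compared with a standard two-sample procedure using these means and variances.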


2018 ◽  
Vol 15 (144) ◽  
pp. 20180174 ◽  
Author(s):  
Sasikiran Kandula ◽  
Teresa Yamana ◽  
Sen Pei ◽  
Wan Yang ◽  
Haruka Morita ◽  
...  

A variety of mechanistic and statistical methods to forecast seasonal influenza have been proposed and are in use; however, the effects of various data issues and design choices (statistical versus mechanistic methods, for example) on the accuracy of these approaches have not been thoroughly assessed. Here, we compare the accuracy of three forecasting approaches—a mechanistic method, a weighted average of two statistical methods and a super-ensemble of eight statistical and mechanistic models—in predicting seven outbreak characteristics of seasonal influenza during the 2016–2017 season at the national and 10 regional levels in the USA. For each of these approaches, we report the effects of real-time under- and over-reporting in surveillance systems, use of non-surveillance proxies of influenza activity and manual override of model predictions on forecast quality. Our results suggest that a meta-ensemble of statistical and mechanistic methods has better overall accuracy than the individual methods. Supplementing surveillance data with proxy estimates generally improves forecast quality, while transient reporting errors degrade the performance of all three approaches considerably. The improvement in quality from ad hoc and post-forecast changes suggests that domain experts continue to possess information that is not being sufficiently captured by current forecasting approaches.
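A weighted average of component forecasts, with weights derived from each model's historical error, might look like the following sketch. The inverse-error weighting shown here is one common choice, not necessarily the scheme used by the authors, and all names are hypothetical:

```python
def skill_weights(errors):
    # Turn historical per-model errors (e.g. MSE) into normalized weights:
    # lower error -> higher weight; weights sum to 1.
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [x / total for x in inv]

def ensemble_forecast(predictions, weights):
    # Weighted average of the component models' point predictions.
    return sum(w * p for w, p in zip(weights, predictions))
```

A super-ensemble would apply the same idea across both statistical and mechanistic members, re-estimating the weights as each season's surveillance data accumulate.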


2020 ◽  
pp. paper5-1-paper5-10
Author(s):  
Aleksandr Mezhenin ◽  
Anastasia Shevchenko

This paper discusses the optimization of procedurally generated polygonal landscapes. The Level of Detail (LOD) method is considered and its shortcomings are listed. The proposed approach to the optimization problem is based on the Ramer–Douglas–Peucker algorithm extended to the three-dimensional case, Delaunay triangulation, and the Hausdorff metric. To reduce the triangle count of the optimized mesh while maintaining landscape detail better than some LOD implementations, the following is proposed: from a height map, points are selected that most accurately convey the curvature and terrain features, and the remaining points are deleted. On the basis of the selected points, an irregular triangulated grid is constructed. The proposed method for analyzing the similarity of polygonal models of arbitrary topological type can serve as a basis for the implementation of the corresponding algorithms. According to the authors, using a weighted average when calculating the normal vectors increases the accuracy of the subsequent calculation of the Hausdorff metric. Issues of assessing the quality of optimization are considered. A mathematical model is proposed. A prototype of the polygonal mesh optimizer was developed and tested.
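The Hausdorff metric used to assess how closely the optimized mesh matches the original can be illustrated with its discrete point-set form. This is a simplified sketch (a production implementation would sample the mesh surfaces rather than compare vertex sets, and would use spatial indexing instead of the brute-force O(|A|·|B|) search shown here):

```python
import math

def hausdorff(A, B):
    # Symmetric discrete Hausdorff distance between two point sets:
    # H(A, B) = max( max_a min_b d(a, b), max_b min_a d(a, b) )
    def directed(X, Y):
        return max(min(math.dist(x, y) for y in Y) for x in X)
    return max(directed(A, B), directed(B, A))
```

A small H(A, B) certifies that every point of either mesh lies near the other, which is why the metric serves as an optimization-quality measure.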


2018 ◽  
Author(s):  
Rebecca Floyd ◽  
David Leslie ◽  
Roland Baddeley ◽  
Simon Farrell

How does a dyad combine information from different members in order to arrive at a consensus judgement? One suggestion is that groups combine information in a Bayes-optimal fashion: the group calculates a weighted average of individuals' estimates, with the weightings being proportional to the quality of the information each individual possesses. Alternatively, the dyad may seek to identify which member's estimate is the best, and return that as a joint judgement. These models were tested by asking members of a dyad to make private estimates of a continuous quantity (the direction of movement of a coherent motion stimulus), and then to make a joint judgement. Joint judgements were more accurate than individual judgements, but were only partly based on optimal integration. Rather, the joint judgements were often in the neighbourhood of one of the individual judgements, or an uninformed average of the two judgements. Regression analyses suggest that dyads sampled from the alternative responses according to their initial disagreement and their relative accuracy on a trial-by-trial basis. Rather than learning about each other's abilities, dyad members appear to rely on communication of estimated precision when forming judgements, and often resolve discrepancies by taking the best guess.
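The Bayes-optimal combination described above, with weights proportional to the precision (inverse variance) of each member's estimate, can be written as a short sketch (hypothetical function name; the model assumes independent Gaussian estimates):

```python
def precision_weighted_average(estimates, variances):
    # Bayes-optimal fusion of independent Gaussian estimates:
    # each estimate is weighted by its precision (1 / variance),
    # and the fused estimate has variance 1 / (sum of precisions),
    # i.e. it is never less precise than the best individual estimate.
    precisions = [1.0 / v for v in variances]
    total = sum(precisions)
    mean = sum(p * e for p, e in zip(precisions, estimates)) / total
    return mean, 1.0 / total
```

Under this model two equally reliable members should split the difference, whereas the "take the best guess" strategy observed in the study puts all the weight on one member.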


2021 ◽  
Vol 5 (3) ◽  
pp. 37
Author(s):  
Nataliia Melnykova ◽  
Nataliya Shakhovska ◽  
Volodymyr Melnykov ◽  
Kateryna Melnykova ◽  
Khrystyna Lishchuk-Yakymovych

The paper describes the medical data personalization problem by determining the individual characteristics needed to predict the number of days a patient spends in hospital. The mathematical problem of patient information analysis is formalized, which helps identify critical personal characteristics based on condition-space analysis. The condition space is given in cube form as a reflection of the functional relationship of the general parameters to the studied object. The dataset consists of 51 instances, and ten parameters are processed using different clustering and regression models. The number of days in hospital is the target variable. A condition-space cube is formed based on clustering analysis and feature selection. In this manner, a hierarchical predictor based on clustering and an ensemble of weak regressors is built. The Root Mean Squared Error (RMSE) of the developed hierarchical predictor is 1.47 times better than that of the best weak predictor (a perceptron with 12 units in a single hidden layer).
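The cluster-then-regress structure of a hierarchical predictor might be sketched as follows. This is a simplified illustration, not the authors' model: a nearest-centroid assignment stands in for the clustering stage, and a per-cluster mean regressor stands in for the ensemble of weak regressors; all names are hypothetical.

```python
import math

def rmse(y_true, y_pred):
    # Root Mean Squared Error, the evaluation metric used in the paper.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

class HierarchicalPredictor:
    # Stage 1: assign each patient to the nearest centroid in feature space.
    # Stage 2: apply that cluster's own regressor (here: the cluster mean).
    def __init__(self, centroids):
        self.centroids = centroids
        self.models = {}

    def _cluster(self, x):
        return min(range(len(self.centroids)),
                   key=lambda i: math.dist(x, self.centroids[i]))

    def fit(self, X, y):
        groups = {}
        for x, t in zip(X, y):
            groups.setdefault(self._cluster(x), []).append(t)
        self.models = {c: sum(v) / len(v) for c, v in groups.items()}
        return self

    def predict(self, X):
        # Fall back to the grand mean for clusters unseen during fitting.
        overall = sum(self.models.values()) / len(self.models)
        return [self.models.get(self._cluster(x), overall) for x in X]
```

Replacing the per-cluster mean with any regressor (e.g. a small perceptron per cluster) recovers the general cluster-plus-ensemble pattern the abstract describes.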


Author(s):  
B. Carragher ◽  
M. Whittaker

Techniques for three-dimensional reconstruction of macromolecular complexes from electron micrographs have been successfully used for many years. These include methods which take advantage of the natural symmetry properties of the structure (for example helical or icosahedral) as well as those that use single-axis or other tilting geometries to reconstruct from a set of projection images. These techniques have traditionally relied on a very experienced operator to manually perform the often numerous and time-consuming steps required to obtain the final reconstruction. While the guidance and oversight of an experienced and critical operator will always be an essential component of these techniques, recent advances in computer technology, microprocessor-controlled microscopes and the availability of high-quality CCD cameras have provided the means to automate many of the individual steps. During the acquisition of data, automation provides benefits not only in terms of convenience and time saving but also in circumstances where manual procedures limit the quality of the final reconstruction.


2020 ◽  
Vol 21 (1) ◽  
pp. 22-55
Author(s):  
Bartosz Czepil

The objective of this paper is an attempt to explain the determinants of the lowest governance quality level in one of the communes of the Opolskie Province, Poland. The first stage of the research consisted of developing a commune-level governance quality index in order to measure the quality of governance in the 60 communes of the Opolskie Province. Subsequently, the commune with the lowest score in the index was qualified for the second stage of the research, which was based on the extreme case method. The major conclusion from the research is that the commune leader's governance style, which allowed him to hold on to power for many terms of office, was responsible for generating low governance quality. Furthermore, the low quality of governance was not only the effect of the governance style but also of a strategy aimed at remaining in the commune leader's office for many terms.


2010 ◽  
Vol 39 (2) ◽  
pp. 34-36
Author(s):  
Vaia Touna

This paper argues that the rise of what is commonly termed "personal religion" during the Classical–Hellenistic period is not the result of an inner need or even quality of the self, as often argued by those who see in ancient Greece a foreshadowing of Christianity, but rather was the result of social, economic, and political conditions that made it possible for Hellenistic Greeks to redefine the perception of the individual and its relationship to others.


2017 ◽  
Vol 3 (1) ◽  
pp. 112-126 ◽  
Author(s):  
Ilaria Cristofaro

From a phenomenological perspective, the reflective quality of water has a visually dramatic impact, especially when combined with the light of celestial phenomena. However, the possible presence of water as a means for reflecting the sky is often undervalued when interpreting archaeoastronomical sites. From artificial water spaces, such as ditches, huacas and wells, to natural ones such as rivers, lakes and puddles, water spaces add a layer of interacting reflections to landscapes. In the cosmological understanding of skyscapes and waterscapes, a cross-cultural metaphorical association between water spaces and the underworld is often revealed. In this research, water-skyscapes are explored through the practice of auto-ethnography and reflexive phenomenology. The mirroring of the sky in water opens up themes such as the continuity, delimitation and manipulation of sky phenomena on land: water spaces act as a continuation of the sky on earth; depending on water spaces’ spatial extension, selected celestial phenomena can be periodically reflected within architectures, so as to make the heavenly dimension easily accessible and a possible object of manipulation. Water-skyscapes appear as specular worlds, where water spaces are assumed to be doorways to the inner reality of the unconscious. The fluid properties of water have the visual effect of dissipating borders, of merging shapes, and, therefore, of dissolving identities; in the inner landscape, this process may represent symbolic death experiences and rituals of initiation, where the annihilation of the individual allows the creative process of a new life cycle. These contextually generalisable results aim to inspire new perspectives on sky-and-water related case studies and give value to the practice of reflexive phenomenology as a crucial method of research.


2020 ◽  
Author(s):  
Emmanuel Kiiza Mwesiga ◽  
Noeline Nakasujja ◽  
Lawrence Nankaba ◽  
Juliet Nakku ◽  
Seggane Musisi

Introduction: Individual- and group-level interventions have the largest effect on outcomes in patients with a first episode of psychosis. The quality of the individual- and group-level interventions provided to first-episode psychosis patients in Uganda is unclear.

Methods: The study was performed at Butabika National Psychiatric Teaching and Referral Hospital in Uganda. A retrospective chart review of recently discharged adult in-patients with a first episode of psychosis was first performed to determine the proportion of participants who received the different essential components of individual- and group-level interventions. From these proportions, the quality of the services across the individual and group interventions was determined using the First-Episode Psychosis Services Fidelity Scale (FEPS-FS). The FEPS-FS assigns a grade of 1-5 on a Likert scale depending on the proportion of patients who received the different components of the intervention.

Results: The final sample included 156 first-episode psychosis patients. The median age was 27 years [IQR 24-36], and 55% of participants were female. Thirteen essential components across the individual and group interventions were assessed and their quality quantified. All thirteen essential components had poor quality, with FEPS-FS scores ranging from 1 to 3. Only one essential component (use of a single antipsychotic) had moderate quality.

Discussion: Among current services at the national psychiatric hospital of Uganda, the essential components of individual- and group-level interventions for psychotic disorders are of low quality. Further studies are required on how the quality of these interventions can be improved.

