mathematical rule
Recently Published Documents


TOTAL DOCUMENTS: 17 (FIVE YEARS: 2)
H-INDEX: 3 (FIVE YEARS: 0)

2021 ◽  
Author(s):  
Anca Hanea ◽  
David Peter Wilkinson ◽  
Marissa McBride ◽  
Aidan Lyon ◽  
Don van Ravenzwaaij ◽  
...  

Experts are often asked to represent their uncertainty as a subjective probability. Structured protocols offer a transparent and systematic way to elicit and combine probability judgements from multiple experts. As part of this process, each expert individually estimates a probability (e.g., of a future event), and these estimates must be combined into a final group prediction. The experts' judgements can be aggregated behaviourally (by striving for consensus) or mathematically (by using a mathematical rule to combine the individual estimates). Mathematical rules (e.g., weighted linear combinations of judgements) provide an objective approach to aggregation. However, the choice of rule is not straightforward, and the quality of the aggregated group judgement depends on it. The quality of an aggregation can be defined in terms of accuracy, calibration, and informativeness. These measures can be used to compare different aggregation approaches and to help decide which aggregation produces the "best" final prediction. In the ideal case, individual experts' performance (as probability assessors) is scored, these scores are translated into performance-based weights, and a performance-based weighted aggregation is used. When this is not possible, however, several other aggregation methods, informed by measurable proxies for good performance, can be formulated and compared. We use several data sets to investigate the relative performance of multiple aggregation methods informed by previous experience and the available literature. Even though the accuracy, calibration, and informativeness of most methods are very similar, two of the aggregation methods distinguish themselves as the best and the worst.
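As a concrete illustration of a mathematical aggregation rule, the sketch below applies an equal-weight and a performance-weighted linear pool to hypothetical expert probabilities and scores the results with the Brier score (one common accuracy measure for probability forecasts). The probabilities, weights, and outcomes are invented for illustration; this is not the protocol or data used in the study.

```python
import numpy as np

def linear_pool(probs, weights=None):
    """Weighted linear combination of expert probabilities for one event."""
    probs = np.asarray(probs, dtype=float)
    if weights is None:                      # equal weights by default
        weights = np.ones_like(probs)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()        # normalise so the weights sum to 1
    return float(np.dot(weights, probs))

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes."""
    forecasts = np.asarray(forecasts, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    return float(np.mean((forecasts - outcomes) ** 2))

# Hypothetical example: four experts, three events (one row per event).
expert_probs = np.array([[0.70, 0.55, 0.90, 0.60],
                         [0.20, 0.35, 0.10, 0.40],
                         [0.80, 0.75, 0.95, 0.50]])
performance_weights = [0.4, 0.3, 0.2, 0.1]   # e.g., derived from calibration questions
outcomes = [1, 0, 1]                          # what actually happened

equal = [linear_pool(row) for row in expert_probs]
weighted = [linear_pool(row, performance_weights) for row in expert_probs]
print("Brier (equal weights):      ", brier_score(equal, outcomes))
print("Brier (performance weights):", brier_score(weighted, outcomes))
```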


Materials ◽  
2021 ◽  
Vol 14 (2) ◽  
pp. 387
Author(s):  
Fabrizio Bambini ◽  
Giulia Orilisi ◽  
Alessandro Quaranta ◽  
Lucia Memè

One of the current major challenges in implant therapy is to minimize marginal bone loss around implants, since it can trigger bacterial colonization of the implant’s neck, leading to its failure. The present study aimed (1) to scientifically validate a new mathematical rule, based on soft tissue thickness, for choosing the correct implant position with respect to the bone level, providing better tissue adaptation to the abutment/implant surface and preventing bacterial invasion, and (2) to apply this rule within the Biological Oriented Immediate Loading (B.O.I.L.) surgical protocol to avoid peri-implant bone resorption. A total of 127 implants were inserted following the B.O.I.L. protocol: implants were placed according to the mathematical rule Y = X − 3, which relates the position of the implant with respect to the bone crest level (Y) to the thickness of the soft tissues (X). All implants were inserted in fresh extraction sockets and immediately loaded with temporary abutments and prostheses. Bone levels were evaluated radiographically just after the surgical procedure (T0) and after 10 days (10D), 6 months (6M), 1 year (1Y), and 5 years (5Y). After 5 years, the implant survival rate was 100%, with a mean marginal bone loss around implants of 0.0704 mm (SD = 0.169 mm). One-way ANOVA followed by Tukey’s multiple comparison test was used for statistical evaluation (p < 0.05). The protocol proved safe and successful, providing a good soft tissue seal against bacterial challenge. Applying the mathematical rule places the implant at a correct vertical position relative to the bone crest, avoiding bone resorption and bacterial infiltration. Moreover, the use of a Multi Unit Abutment (MUA) produced a stable biological seal, favouring implant healing and preserving the adhesion of hemidesmosomes to the titanium of the MUA.
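A minimal sketch of the Y = X − 3 rule as stated in the abstract, assuming that X (soft tissue thickness) and Y (implant position relative to the bone crest) are both expressed in millimetres; the clinical interpretation of the resulting value follows the protocol described in the paper, not this illustration.

```python
def implant_position_from_crest(soft_tissue_thickness_mm: float) -> float:
    """Y = X - 3: implant position relative to the bone crest (mm),
    given the measured soft tissue thickness X (mm). Illustrative only."""
    return soft_tissue_thickness_mm - 3.0

# Example values chosen purely for illustration.
for x in (2.0, 3.0, 5.0):
    print(f"X = {x} mm  ->  Y = {implant_position_from_crest(x)} mm")
```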


2020 ◽  
Author(s):  
Raunak Pillai ◽  
Abbey Loehr ◽  
Darren J. Yeo ◽  
Min Kyung Hong ◽  
Lisa Fazio

Using incorrect worked examples during mathematics instruction can improve student learning. However, teachers worry that students may confuse correct and incorrect examples over time, and memory research supports this fear. To examine if this forgetting occurs, we had undergraduates rate the correctness of correct and incorrect worked examples immediately and one week later (Experiment 1). Previously studied incorrect examples were rated as slightly more correct after the delay, but this did not affect ratings of unstudied examples or problem-solving accuracy. In Experiment 2, we more closely mimicked how incorrect worked examples are used in classroom settings. Again, we found only small changes in students’ memory for studied worked examples after the delay, and no changes for unstudied examples or problem-solving accuracy. Our findings suggest the costs of teaching with incorrect worked examples are limited to the specific studied problems, and do not affect learning of the underlying mathematical rule.


2020 ◽  
Vol 3 (2) ◽  
pp. 01-12
Author(s):  
Arne Torbjørn Høstmark ◽  
Anna Haug

Body fatty acids are important in health and disease. We previously observed two groups of fatty acids in the breast muscle of chickens: Group 1, with relative amounts correlating negatively with %AA (20:4 n6), and Group 2, with relative amounts correlating positively with %AA. Within each of the two groups, we found here positive correlations between fatty acid percentages. Accordingly, Group 1 percentages correlated negatively with those of Group 2. With random numbers in lieu of the true values of Group 2 fatty acids, we were able to reproduce the positive correlations found with the true values, provided the random numbers were generated within the true ranges. In contrast, with random numbers we did not succeed in reproducing all of the negative correlations between Group 1 and Group 2 fatty acid percentages. We then observed that absolute amounts (g/kg) of fatty acids in Group 1 correlated positively and strongly (r > 0.9), suggesting a coordinated regulation of these fatty acids. Thus, Group 1 fatty acids seemed to form a cluster. The random-number cluster percentage showed clear inverse associations with the random-number Group 2 fatty acid percentages, matching the outcome observed with the true values. We suggest that associations between fatty acid percentages are caused by their concentration distributions and by cluster regulation. Distribution-dependent correlations and cluster regulation could be an evolutionary adaptation, where a mathematical rule is utilized to, for example, balance the effects of eicosanoids/docosanoids, and possibly of other metabolites.
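The sketch below illustrates, with purely synthetic numbers, the kind of distribution-dependent effect the abstract describes: when independent random "absolute amounts" drawn from fixed ranges are converted to percentages of their row sum, the component with the widest range tends to correlate negatively with the others, even though the raw draws are independent. The ranges and component names are invented; this is not the chicken data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # number of simulated samples

# Hypothetical absolute amounts (g/kg), drawn independently and uniformly
# within fixed ranges; the last component ("aa") has the widest range.
ranges = {"fa1": (1.0, 2.0), "fa2": (0.8, 1.6), "fa3": (1.2, 2.4), "aa": (0.5, 8.0)}
absolute = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in ranges.values()])

# Convert each sample (row) to percentages of its total.
percent = 100.0 * absolute / absolute.sum(axis=1, keepdims=True)

corr = np.corrcoef(percent, rowvar=False)
names = list(ranges)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        print(f"corr(%{names[i]}, %{names[j]}) = {corr[i, j]:+.2f}")
```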


2020 ◽  
Vol 12 (1) ◽  
pp. 363-375
Author(s):  
Mohamed A. Rashed ◽  
Ali H. Atef

Virtual resolution enhancement (VRE) is a new poststack cosmetic tool that can be applied to different types of seismic data. VRE emphasizes major reflections, enhances the temporal resolution of seismic events, and suppresses reverberation noise, leading to better visualization of the entire seismic dataset. VRE is based on a simple mathematical rule, and its parameters can be tweaked to suit the wide variety of seismic data available today. Although VRE does not reveal new or hidden features on a seismic section, it significantly enhances existing ones, which improves interpretation and assists the automatic horizon-picking process. The only disadvantage of VRE is its long computation time; however, given the advances in computational power and speed expected in the near future, this limitation should become negligible. Tests conducted on seismic sections collected from different regions of the world, acquired and processed with different routines, demonstrate the effectiveness of the VRE procedure.


2019 ◽  
Vol 33 (11) ◽  
pp. 876-887 ◽  
Author(s):  
Robinson Kundert ◽  
Jeff Goldsmith ◽  
Janne M. Veerbeek ◽  
John W. Krakauer ◽  
Andreas R. Luft

In 2008, it was proposed that the magnitude of recovery from nonsevere upper limb motor impairment over the first 3 to 6 months after stroke, measured with the Fugl-Meyer Assessment (FMA), is approximately 0.7 times the initial impairment (“proportional recovery”). In contrast to patients with nonsevere hemiparesis, about 30% of patients with an initial severe paresis do not show such recovery (“nonrecoverers”). Hence, it was suggested that the proportional recovery rule (PRR) was a manifestation of a spontaneous mechanism that is present in all patients with mild-to-moderate paresis but only in some with severe paresis. Since its introduction, the PRR has been applied to other motor and nonmotor impairments. This more general investigation of the PRR has led to inconsistencies in its formulation and application, making it difficult to draw conclusions across studies and precipitating some cogent criticism. Here, we conduct a detailed comparison of the different studies reporting proportional recovery and, where appropriate, critique the statistical methodology. On balance, we conclude that the existing data, in aggregate, are largely consistent with the PRR as a population-level model for upper limb motor recovery; recent reports of its demise are exaggerated, as these focus excessively on the less conclusive issue of individual subject-level predictions. Moving forward, we suggest that methodological caution and new analytical approaches will be needed to confirm (or refute) a systematic character to spontaneous recovery from motor and other poststroke impairments, one that can be captured by a mathematical rule either at the population or at the subject level.
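As a reference point, the sketch below computes the prediction implied by the proportional recovery rule in its commonly cited formulation for the upper extremity Fugl-Meyer Assessment (FMA-UE, maximum score 66): predicted recovery ≈ 0.7 × (66 − initial FMA). The 0.7 coefficient and the omission of nonrecoverers are simplifications of the literature summarized in the abstract, not the exact model fitted in any one study.

```python
FMA_UE_MAX = 66  # maximum score of the upper extremity Fugl-Meyer Assessment

def prr_prediction(fma_initial: float, coefficient: float = 0.7) -> dict:
    """Predicted spontaneous recovery under the proportional recovery rule.

    Initial impairment is (FMA_UE_MAX - fma_initial); the rule predicts that
    roughly `coefficient` of that impairment resolves by 3-6 months post-stroke.
    Nonrecoverers (a subset of severely impaired patients) are not captured here.
    """
    impairment = FMA_UE_MAX - fma_initial
    recovery = coefficient * impairment
    return {"initial_impairment": impairment,
            "predicted_recovery": recovery,
            "predicted_fma_3to6_months": fma_initial + recovery}

# Illustrative initial scores only.
for fma0 in (20, 40, 55):
    print(fma0, prr_prediction(fma0))
```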


2019 ◽  
Vol 19 (06) ◽  
pp. 1950054 ◽  
Author(s):  
Ileana Corbi ◽  
Ottavia Corbi ◽  
Haitao Li

In this paper, the dynamic control of structural vibrations in ancient constructions is considered. Since the constituent material of such structures exhibits poor tensile resistance, effective performance requires the structural nonlinearity to be considered from the very first stage of the control algorithm design. Including the material nonlinearity in the design of the mathematical control rule raises issues of high complexity, from both the theoretical and the computational standpoints, which is why the topic is rarely treated in the literature, particularly through fully mathematical approaches. The first part of the paper sets up a full model for the analysis of spatial masonry constructions with holonomic plasticity taken into account, and the second part formulates an algorithm aimed at mitigating the dynamic effects on the structure. The implementation of the control algorithm, the development of original calculation codes, and the numerical investigations show the high potential of the proposed approach to significantly improve the performance of the structure.


2018 ◽  
Vol 28 (11) ◽  
pp. 1830038 ◽  
Author(s):  
Cesar Manchein ◽  
Holokx A. Albuquerque ◽  
Luis Fernando Mello

We study the dynamics and characterize the bifurcation structure of a phase-locked loop (PLL) device modeled by a third-order autonomous differential equation with a sinusoidal phase detector. The work combines rigorous analysis with numerical experiments. Through theoretical analysis, the bifurcation structures related to two fundamental equilibrium points of the system are described. Using extensive numerical experiments, we investigate the intricate organization of periodic and chaotic domains in parameter space (called here the parameter plane, as the PLL model has only two control parameters) and obtain the following two remarkable findings: (i) there are self-organized, generic, stable periodic structures along specific directions of the parameter plane, whose periods are defined by a mathematical rule; and (ii) for some particular control parameter pairs, a transient chaos phenomenon is characterized, responsible for long chaotic temporal evolution preceding the asymptotic (periodic) dynamics. Our theoretical and numerical results are in striking agreement. We believe that the present study, especially the parameter plane analysis, may be of great importance to experimental studies and general applications involving PLL devices when, for example, one wishes to avoid chaotic regimes.
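The specific third-order PLL equation studied in the paper is not reproduced in the abstract; the sketch below therefore uses a generic placeholder third-order system with a sinusoidal nonlinearity, purely to show how a parameter-plane classification (periodic vs. chaotic) can be set up numerically by estimating the largest Lyapunov exponent from the divergence of nearby trajectories.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pll_like(t, state, a, b):
    """Placeholder third-order autonomous system with a sinusoidal nonlinearity.
    NOT the model from the paper; illustrative only."""
    x, y, z = state
    return [y, z, -a * z - y - b * np.sin(x)]

def largest_lyapunov(a, b, t_total=200.0, dt=0.05, d0=1e-8):
    """Crude Benettin-style estimate of the largest Lyapunov exponent."""
    s1 = np.array([0.1, 0.0, 0.0])
    s2 = s1 + np.array([d0, 0.0, 0.0])
    lyap_sum, steps = 0.0, int(t_total / dt)
    for _ in range(steps):
        sol1 = solve_ivp(pll_like, (0, dt), s1, args=(a, b), rtol=1e-8, atol=1e-10)
        sol2 = solve_ivp(pll_like, (0, dt), s2, args=(a, b), rtol=1e-8, atol=1e-10)
        s1, s2 = sol1.y[:, -1], sol2.y[:, -1]
        d = np.linalg.norm(s2 - s1)
        lyap_sum += np.log(d / d0)
        s2 = s1 + (s2 - s1) * (d0 / d)     # renormalise the separation
    return lyap_sum / (steps * dt)

# Scanning a grid of (a, b) pairs and colouring by the sign of the exponent
# yields a coarse parameter-plane classification: > 0 chaotic, <= 0 periodic.
print(largest_lyapunov(0.5, 1.5))
```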


2018 ◽  
Author(s):  
Thomas P. Quinn ◽  
Thin Nguyen ◽  
Samuel C. Lee ◽  
Svetha Venkatesh

Since the turn of the century, researchers have sought to diagnose cancer based on gene expression signatures measured from blood or biopsy samples as biomarkers. This task, known as classification, is typically solved using a suite of algorithms that learn a mathematical rule capable of discriminating one group (e.g., cases) from another (e.g., controls). However, discriminatory methods can only identify cancerous samples that resemble those the algorithm already saw during training. As such, we argue that discriminatory methods are fundamentally ill-suited for the classification of cancer: because the possibility space of cancer is indefinitely large, the existence of a one-of-a-kind gene expression signature becomes very likely. Instead, we propose using an established surveillance method that detects anomalous samples based on their deviation from a learned normal steady-state structure. By transferring this method to transcriptomic data, we can create an anomaly detector for tissue transcriptomes, a “tissue detector”, that is capable of identifying cancer without ever seeing a single cancer example. Using models trained on normal GTEx samples, we show that our “tissue detector” can accurately classify TCGA samples as normal or cancerous, and that its performance is further improved by including more normal samples in the training set. We conclude this report by emphasizing the conceptual advantages of anomaly detection and by highlighting future directions for this field of study.
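To make the contrast with discriminative classification concrete, the sketch below trains a generic one-class anomaly detector on "normal" expression-like vectors only and then flags unseen samples that deviate from the learned structure. It uses scikit-learn's IsolationForest on synthetic data purely as a stand-in; the abstract does not state which surveillance method the authors transferred, so this is an illustration of the idea, not their pipeline.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
n_genes = 50

# Synthetic "normal" training profiles: expression values around a steady state.
normal_train = rng.normal(loc=5.0, scale=1.0, size=(300, n_genes))

# Unseen test samples: some normal-like, some shifted ("anomalous") profiles.
normal_test = rng.normal(loc=5.0, scale=1.0, size=(20, n_genes))
anomalous_test = rng.normal(loc=5.0, scale=1.0, size=(20, n_genes))
anomalous_test[:, :10] += 4.0   # perturb a subset of genes

# Train only on normal samples; no anomalous example is ever seen in training.
detector = IsolationForest(random_state=0).fit(normal_train)

# predict() returns +1 for inliers (normal-like) and -1 for anomalies.
print("normal test flagged as anomaly:   ",
      int((detector.predict(normal_test) == -1).sum()), "/ 20")
print("anomalous test flagged as anomaly:",
      int((detector.predict(anomalous_test) == -1).sum()), "/ 20")
```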

