A Data Structure to Handle Large Sets of Equal Terms

10.29007/hsbm ◽  
2018 ◽  
Author(s):  
Baudouin Le Charlier ◽  
Mêton Mêton Atindehou

We present a data structure to represent and manipulate large sets of (equal) terms (or expressions). Our initial and main motivation for this data structure is the simplification of expressions with respect to a formal theory, typically an equational one. However, the data structure also turns out to be efficient for computing the congruence closure of a relation over a set of terms. We provide an abstract definition of the data structure, including a precise semantics, and we explain how to implement it efficiently. We prove the correctness of the proposed algorithms and support them with a complexity analysis and experimental results. We compare these algorithms with previous algorithms for computing the congruence closure, and we also sketch how we use the data structure to tackle the expression simplification problem.
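
The abstract mentions congruence closure over sets of terms; the following minimal sketch (not the authors' data structure, and purely illustrative) shows the standard idea of a union-find over a hash-consed term DAG, where merging two terms propagates equality to parent terms whose arguments become congruent.

```python
# Minimal, hypothetical sketch of congruence closure over a term DAG.
# This is NOT the paper's data structure; it only illustrates the problem it solves.

class CongruenceClosure:
    def __init__(self):
        self.parent = {}   # union-find parent per term id
        self.uses = {}     # term id -> parent terms that mention it as an argument
        self.terms = {}    # (symbol, canonical args) -> term id (hash-consing)
        self.sig = {}      # term id -> (symbol, args)

    def find(self, t):
        while self.parent[t] != t:
            self.parent[t] = self.parent[self.parent[t]]  # path compression
            t = self.parent[t]
        return t

    def make(self, symbol, args=()):
        args = tuple(self.find(a) for a in args)
        key = (symbol, args)
        if key in self.terms:
            return self.terms[key]
        t = len(self.parent)
        self.parent[t] = t
        self.sig[t] = (symbol, args)
        self.uses[t] = []
        self.terms[key] = t
        for a in args:
            self.uses[a].append(t)
        return t

    def merge(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        self.parent[ra] = rb
        # re-examine parents of the merged class: their arguments may now be congruent
        for p in list(self.uses[ra]):
            symbol, args = self.sig[p]
            q = self.make(symbol, args)      # re-canonicalize under the new equality
            if self.find(p) != self.find(q):
                self.merge(p, q)
        self.uses.setdefault(rb, []).extend(self.uses[ra])

# Example: from a = b, derive f(a) = f(b).
cc = CongruenceClosure()
a, b = cc.make("a"), cc.make("b")
fa, fb = cc.make("f", (a,)), cc.make("f", (b,))
cc.merge(a, b)
assert cc.find(fa) == cc.find(fb)
```

In the example, asserting a = b makes f(a) and f(b) equal, which is the kind of inference the paper's data structure is designed to perform on large term sets.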

Author(s):  
Juan de Lara ◽  
Esther Guerra

Abstract Modelling is an essential activity in software engineering. It typically involves two meta-levels: one includes meta-models that describe modelling languages, and the other contains models built by instantiating those meta-models. Multi-level modelling generalizes this approach by allowing models to span an arbitrary number of meta-levels. A scenario that profits from multi-level modelling is the definition of language families that can be specialized (e.g., for different domains) by successive refinements at subsequent meta-levels, hence promoting language reuse. This enables an open set of variability options given by all possible specializations of the language family. However, multi-level modelling lacks the ability to express closed variability regarding the availability of language primitives or the possibility to opt between alternative primitive realizations. This limits the reuse opportunities of a language family. To improve this situation, we propose a novel combination of product lines with multi-level modelling to cover both open and closed variability. Our proposal is backed by a formal theory that guarantees correctness, enables top-down and bottom-up language variability design, and is implemented atop the MetaDepth multi-level modelling tool.
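
As a rough illustration of instantiation across an arbitrary number of meta-levels, the sketch below uses a hypothetical Clabject class with a potency counter, loosely in the spirit of deep modelling tools such as MetaDepth; it is not the paper's formalism and omits the product-line layer entirely.

```python
# Hypothetical sketch of multi-level instantiation (clabjects with potency).
# Names and behaviour are illustrative assumptions, not the paper's implementation.

class Clabject:
    def __init__(self, name, potency, fields=None, meta=None):
        self.name = name
        self.potency = potency    # how many further meta-levels it can be instantiated at
        self.fields = dict(fields or {})
        self.meta = meta          # the clabject this one instantiates

    def instantiate(self, name, **values):
        if self.potency == 0:
            raise ValueError(f"{self.name} cannot be instantiated further")
        fields = {k: values.get(k, v) for k, v in self.fields.items()}
        return Clabject(name, self.potency - 1, fields, meta=self)

# Level 2: a language-family concept; level 1: a domain specialization; level 0: data.
ProductType = Clabject("ProductType", potency=2, fields={"vat": 0.0})
Book = ProductType.instantiate("Book", vat=0.07)
moby_dick = Book.instantiate("MobyDick")
print(moby_dick.fields["vat"], moby_dick.meta.meta.name)   # 0.07 ProductType
```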


1966 ◽  
Vol 1 (4) ◽  
pp. 331-338 ◽  
Author(s):  
T C Hsu

Three different definitions of the yield point have been used in experimental work on the yield locus: proportional limit, proof strain and the ‘yield point’ by backward extrapolation. The theoretical implications of the ‘yield point’ by backward extrapolation are examined in an analysis of the loading and re-loading stress paths. It is shown, in connection with experimental results by Miastkowski and Szczepinski, that the proportional limit found by inspection is in fact a point located by backward extrapolation based on a small section of the stress-strain curve, near the elastic portion of the curve. The effect of different definitions of the yield point on the shape of the yield locus and some considerations for the choice between them are discussed.
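
The 'yield point' by backward extrapolation can be illustrated numerically: fit a straight line to a small section of the stress-strain curve beyond the knee, fit the initial elastic line, and intersect the two. The sketch below uses synthetic data, arbitrary units, and arbitrary segment boundaries; it is only meant to make the construction concrete, not to reproduce the paper's analysis.

```python
# Hypothetical numerical sketch of the 'yield point' by backward extrapolation.
import numpy as np

def yield_by_backward_extrapolation(strain, stress, elastic_end, plastic_start):
    # straight line fitted to the initial elastic portion of the curve
    e_mask = strain <= elastic_end
    E = np.polyfit(strain[e_mask], stress[e_mask], 1)   # (slope, intercept)
    # straight line fitted to a small section beyond the knee, extrapolated backwards
    p_mask = strain >= plastic_start
    P = np.polyfit(strain[p_mask], stress[p_mask], 1)
    # the intersection of the two lines locates the extrapolated yield point
    eps_y = (P[1] - E[1]) / (E[0] - P[0])
    return eps_y, np.polyval(E, eps_y)

# synthetic curve: linear elasticity followed by mild linear hardening (arbitrary units)
eps = np.linspace(0, 0.02, 200)
sig = np.where(eps < 0.005, 200e3 * eps, 1000.0 + 20e3 * (eps - 0.005))
print(yield_by_backward_extrapolation(eps, sig, elastic_end=0.004, plastic_start=0.008))
```

Note that the result depends on which section of the curve is used for the backward fit, which is precisely the sensitivity discussed in the paper.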


Author(s):  
K. L. Chalasani ◽  
B. Grogan ◽  
A. Bagchi ◽  
C. C. Jara-Almonte ◽  
A. A. Ogale ◽  
...  

Abstract Rapid Prototyping (RP) processes reduce the time consumed in the manufacture of a prototype by producing parts directly from a CAD representation, without tooling. The StereoLithography Apparatus (SLA) and most other recent RP processes build a 3-D object from 2.5-D layers. Slicing is the process of defining the layers to be built by the system. In this paper a framework is proposed for the development of algorithms for the representation and definition of layers for use in the SLA, with a view to determining whether the slicing algorithms affect surface finish in any significant manner. Currently, it is not possible to automatically vary slice thicknesses within the same object using the existing algorithm. It would also be useful to use a dense grid for hatching or skin-filling any given layer, or to change the hatch pattern if desired. In addition, simulation of the layered building process would be helpful, so that the user can prespecify parameters that need to be varied during the process. The proposed framework incorporates these and other features. Two approaches for determining the contours on each slice are suggested and their implementation is discussed. In the first, the layers are defined by the intersections of a plane with the surfaces defining the object; the plane is moved up in increments from the base of the object as it is built. All intersections found are stored in a data structure and sorted in head-to-tail fashion to define a contour for every closed area on a layer. The second approach uses a scanline-type search to look for an intersection that triggers a contour-tracing procedure; the contour tracer is invoked whenever an unused edge is found in the search. This saves storage and sorting time, because the contour is determined as a chain of edges in cyclic order. It is envisaged that the results of this work on the SLA can be applied to other RP processes entailing layered building.
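
A minimal sketch of the first approach described above (plane-surface intersection followed by head-to-tail sorting of segments into closed contours) is given below for a triangle mesh; the mesh representation, tolerance, and chaining logic are illustrative assumptions rather than the paper's implementation.

```python
# Hypothetical sketch: slice a triangle mesh with a horizontal plane and chain the
# resulting segments head-to-tail into contours. Vertices lying exactly on the plane
# are ignored for simplicity.

def slice_triangle(tri, z):
    """Return the segment (p, q) where the plane z = const cuts a triangle, if any."""
    pts = []
    for (x1, y1, z1), (x2, y2, z2) in zip(tri, tri[1:] + tri[:1]):
        if (z1 - z) * (z2 - z) < 0:                  # this edge crosses the plane
            t = (z - z1) / (z2 - z1)
            pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return tuple(pts) if len(pts) == 2 else None

def _close(p, q, tol=1e-9):
    return abs(p[0] - q[0]) < tol and abs(p[1] - q[1]) < tol

def chain_contours(segments):
    """Sort loose segments head-to-tail into loops (one list of points per contour)."""
    segs = list(segments)
    loops = []
    while segs:
        p, q = segs.pop()
        loop = [p, q]
        grew = True
        while grew and segs:
            grew = False
            for i, (a, b) in enumerate(segs):
                if _close(a, loop[-1]):
                    loop.append(b)
                    segs.pop(i)
                    grew = True
                    break
                if _close(b, loop[-1]):
                    loop.append(a)
                    segs.pop(i)
                    grew = True
                    break
        loops.append(loop)
    return loops

def slice_mesh(triangles, z):
    """One slice: plane-triangle intersections chained into closed contours."""
    segments = [s for s in (slice_triangle(t, z) for t in triangles) if s]
    return chain_contours(segments)
```

The second approach in the abstract avoids the global sort by tracing each contour directly from an unused edge, which is why it saves storage and sorting time.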


2013 ◽  
Vol 357-360 ◽  
pp. 2353-2357 ◽  
Author(s):  
Xu Dong He ◽  
Yuan Li Wang ◽  
Yuan Yuan Zhang

In this paper, we study the behavioral risks of participants in complex projects on the basis of complexity theory. First, through a review of the literature, we set out the research content of complexity theory. We then discuss the complexity characteristics of project management, focusing on the definition of project complexity, the nonlinear characteristics of the project organization, and the emergence and uncertainty characteristics of complex projects. Finally, we point out that, from the perspectives of both theoretical research and project practice, the complexity analysis of project management has positive significance.


2020 ◽  
Author(s):  
Valeria Raparelli ◽  
Colleen M. Norris ◽  
Uri Bender ◽  
Maria Trinidad Herrero ◽  
Alexandra Kautzky-Willer ◽  
...  

Abstract Background: Gender refers to the socially constructed roles, behaviors, expressions, and identities of girls, women, boys, men, and gender-diverse people. It influences self-perception, individuals' actions and interactions, as well as the distribution of power and resources in society. Gender-related factors are seldom assessed as determinants of health outcomes, despite their powerful contribution.
Methods: Investigators of the GOING-FWD project developed a standard methodology, applicable to observational studies, to retrospectively identify gender-related factors and assess their relationship to outcomes, and applied this method to selected cohorts of non-communicable chronic diseases from Austria, Canada, Spain, and Sweden.
Results: The following multistep process was applied. Step 1 (Identification of Gender-related Variables): based on the gender framework of the Women Health Research Network (i.e. gender identity, role, relations, and institutionalized gender) and the available literature for a given disease, an optimal "wish-list" of gender-related variables/factors was created and discussed by experts. Step 2 (Definition of Outcomes): each of the cohort data dictionaries was screened for clinical and patient-relevant outcomes, using the ICHOM framework. Step 3 (Building of Feasible Final List): a cross-validation between the gender-related and outcome variables available per database and the "wish-list" was performed. Step 4 (Retrospective Data Harmonization): the harmonization potential of the variables was evaluated. Step 5 (Definition of Data Structure and Analysis): depending on the database data structure, the following analytic strategies were identified: (1) local analysis of data that cannot be transferred, followed by a meta-analysis combining study-level estimates; (2) centrally performed federated analysis of anonymized data, with individual-level participant data remaining on local servers; (3) synthesizing the data locally and performing a pooled analysis on the synthetic data; and (4) central analysis of pooled transferable data.
Conclusion: The application of the GOING-FWD systematic multistep approach can help guide investigators to analyze gender and its impact on outcomes in previously collected data.
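
As an illustration of analytic strategy (1), the sketch below pools study-level estimates from several cohorts with a fixed-effect, inverse-variance meta-analysis; the numbers, their interpretation, and the choice of a fixed-effect model are illustrative assumptions, not part of the GOING-FWD protocol.

```python
# Hypothetical sketch of strategy (1): each cohort computes a study-level estimate
# locally (e.g. a regression coefficient for a gender-related factor), and only the
# estimate and its standard error are pooled centrally.
import math

def inverse_variance_pool(estimates):
    """estimates: list of (beta, standard_error) pairs, one per cohort."""
    weights = [1.0 / se**2 for _, se in estimates]
    pooled = sum(w * b for (b, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# e.g. study-level estimates from four cohorts (values are made up for illustration)
cohort_estimates = [(0.32, 0.10), (0.25, 0.08), (0.41, 0.15), (0.28, 0.12)]
beta, se = inverse_variance_pool(cohort_estimates)
print(f"pooled estimate {beta:.2f} (95% CI {beta - 1.96*se:.2f} to {beta + 1.96*se:.2f})")
```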


Author(s):  
Meghyn Bienvenu ◽  
Camille Bourgaux

In this paper, we explore the issue of inconsistency handling over prioritized knowledge bases (KBs), which consist of an ontology, a set of facts, and a priority relation between conflicting facts. In the database setting, a closely related scenario has been studied and led to the definition of three different notions of optimal repairs (global, Pareto, and completion) of a prioritized inconsistent database. After transferring the notions of globally-, Pareto- and completion-optimal repairs to our setting, we study the data complexity of the core reasoning tasks: query entailment under inconsistency-tolerant semantics based upon optimal repairs, existence of a unique optimal repair, and enumeration of all optimal repairs. Our results provide a nearly complete picture of the data complexity of these tasks for ontologies formulated in common DL-Lite dialects. The second contribution of our work is to clarify the relationship between optimal repairs and different notions of extensions for (set-based) argumentation frameworks. Among our results, we show that Pareto-optimal repairs correspond precisely to stable extensions (and often also to preferred extensions), and we propose a novel semantics for prioritized KBs which is inspired by grounded extensions and enjoys favourable computational properties. Our study also yields some results of independent interest concerning preference-based argumentation frameworks.
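
On a toy prioritized fact base, the notions of repair and Pareto improvement can be made concrete as follows; the definitions in the sketch are deliberately simplified (no ontology, binary conflicts only) and should not be read as the paper's formal definitions or algorithms.

```python
# Hypothetical toy sketch: repairs are maximal conflict-free subsets of the facts;
# repair B Pareto-improves repair A if some fact of B \ A is strictly preferred to
# every fact of A \ B. This is a simplification of the notions studied in the paper.
from itertools import combinations

facts = {"f1", "f2", "f3"}
conflicts = {frozenset({"f1", "f2"}), frozenset({"f2", "f3"})}
prefers = {("f2", "f1"), ("f2", "f3")}        # f2 has higher priority than f1 and f3

def consistent(s):
    return not any(c <= s for c in conflicts)

def repairs(facts):
    cands = [set(c) for r in range(len(facts), -1, -1)
             for c in combinations(sorted(facts), r) if consistent(set(c))]
    return [s for s in cands if not any(s < t for t in cands)]   # keep maximal only

def pareto_improves(b, a):
    return any(all((x, y) in prefers for y in a - b) for x in b - a)

reps = repairs(facts)
pareto_optimal = [a for a in reps if not any(pareto_improves(b, a) for b in reps)]
print(pareto_optimal)   # [{'f2'}] under this toy priority relation
```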


Author(s):  
Tiep Le ◽  
Tran Cao Son ◽  
Enrico Pontelli

This paper proposes Multi-context System for Optimization Problems (MCS-OP) by introducing conditional cost-assignment bridge rules to Multi-context Systems (MCS). This novel feature facilitates the definition of a preorder among equilibria, based on the total incurred cost of the applied bridge rules. As an application of MCS-OP, the paper describes how MCS-OP can be used to model Distributed Constraint Optimization Problems (DCOP), a prominent class of distributed optimization problems that is frequently employed in multi-agent system (MAS) research. The paper shows, by means of an example, that MCS-OP is more expressive than DCOP and hence could potentially be useful for modeling distributed optimization problems that cannot easily be dealt with using DCOPs. It also contains a complexity analysis of MCS-OP.
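
For readers unfamiliar with DCOPs, the sketch below shows the kind of problem MCS-OP is meant to model: each agent controls one variable, binary constraints assign costs, and a minimum-cost assignment is sought. The brute-force solver is purely illustrative and is unrelated to the paper's MCS-based encoding.

```python
# Hypothetical tiny DCOP instance: variables, domains, and cost tables are made up.
from itertools import product

domains = {"x1": [0, 1], "x2": [0, 1], "x3": [0, 1]}        # one variable per agent
cost_tables = {
    ("x1", "x2"): lambda a, b: 0 if a != b else 2,          # prefer different values
    ("x2", "x3"): lambda a, b: 1 if a == b == 1 else 3,     # prefer both equal to 1
}

def total_cost(assignment):
    return sum(f(assignment[u], assignment[v]) for (u, v), f in cost_tables.items())

# exhaustive search over all joint assignments (fine for a toy example)
best = min((dict(zip(domains, vals)) for vals in product(*domains.values())),
           key=total_cost)
print(best, total_cost(best))   # {'x1': 0, 'x2': 1, 'x3': 1} 1
```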


1991 ◽  
Vol 113 (2) ◽  
pp. 122-126
Author(s):  
M. De Lucia

The effects of using oxygen to partially or wholly replace combustion air in small-size melting furnaces were studied over a range of application fields. Following the definition of the useful parameters, testing was conducted on furnaces for melting glass, ferrous metals (pig iron), nonferrous metals (copper alloys), and ceramic materials. In all cases, oxygen enrichment was found to provide significant energy savings, as well as notable advantages in terms of both plant output and energy consumption.


2021 ◽  
pp. 57-73
Author(s):  
Talia Dan-Cohen

This chapter focuses on ambiguous experimental results, paying close attention to experimental processes and tracking the ways that practitioners tackle, reason about, and think through puzzling experimental results. It investigates the context of experiments with modified life-forms and experimental results that take the form of a vast array of biotic not-quites. It also highlights organismic by-products that point in various directions when it comes to figuring out how much control synthetic biologists have over their designs and what steps should be taken as correctives. The chapter explains how experiments often come packaged together with the choices, standards, and observational skills of others. It discusses how the problem of defining growth was deferred through the delineation of a category for indeterminate results.

