A Methodology for the Connecting Rod Lubrication Optimization

Author(s):  
Arthur Francisco ◽  
Thomas Lavie ◽  
Bernard Villechaise

The numerical optimization of connecting rod big-end lubrication involves several main steps. The first, and arguably the most important, is the identification of the main input factors and their ranges of variation. At the same time, the meaningful output quantities must be identified, because the optimization is performed on the basis of this choice. The computing time of a TEHD solution prevents running the huge number of calculations needed to draw the Pareto front of the solutions directly. The next step is therefore the creation of a polynomial-based metamodel with good predictive ability and a low computing cost. In the third step, a fast multi-objective optimization is performed on the metamodel. The Pareto front, which represents the best trade-offs among solutions, is identified: one can then easily choose the input parameters that yield a particular desired solution. In the last step, the robustness of the solutions must be checked: if a given solution is chosen, the corresponding input parameters must tolerate a minimal uncertainty gap to be realistic. Otherwise, the desired solution will never be reached, because in a real-life problem the parameter values are not deterministic.
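The Pareto-front extraction step described above can be illustrated with a minimal sketch. The two surrogate objectives below are invented stand-ins (a friction-like loss and a film-thickness penalty, both minimized), not the article's TEHD metamodel:

```python
# Minimal sketch of Pareto-front extraction over a metamodel sweep.
# The surrogate objectives are invented stand-ins, not the TEHD model.

def pareto_front(points):
    """Return the non-dominated points for two minimized objectives."""
    return sorted(p for p in points
                  if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                             for q in points))

def surrogate(x):
    """Cheap polynomial stand-in for the expensive TEHD response."""
    friction = (x - 0.3) ** 2 + 0.1       # best near x = 0.3
    film_penalty = (x - 0.7) ** 2 + 0.2   # best near x = 0.7
    return (friction, film_penalty)

samples = [i / 20 for i in range(21)]     # design-parameter sweep
front = pareto_front([surrogate(x) for x in samples])
```

Each surviving point is a best trade-off; in the real workflow it maps back to a set of bearing design parameters, whose robustness is then checked.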

2016 ◽  
Vol 13 (122) ◽  
pp. 20160543 ◽  
Author(s):  
Mark N. Read ◽  
Kieran Alden ◽  
Louis M. Rose ◽  
Jon Timmis

Computational agent-based simulation (ABS) is increasingly used to complement laboratory techniques in advancing our understanding of biological systems. Calibration, the identification of parameter values that align simulation with biological behaviours, becomes challenging as increasingly complex biological domains are simulated. Complex domains cannot be characterized by single metrics alone, rendering simulation calibration a fundamentally multi-metric optimization problem that typical calibration techniques cannot handle. Yet calibration is an essential activity in simulation-based science; the baseline calibration forms a control for subsequent experimentation and hence is fundamental in the interpretation of results. Here, we develop and showcase a method, built around multi-objective optimization, for calibrating ABSs against complex target behaviours requiring several metrics (termed objectives) to characterize. Multi-objective calibration (MOC) delivers those sets of parameter values representing optimal trade-offs in simulation performance against each metric, in the form of a Pareto front. We use MOC to calibrate a well-understood immunological simulation against both established a priori and previously unestablished target behaviours. Furthermore, we show that simulation-borne conclusions are broadly, but not entirely, robust to adopting baseline parameter values from different extremes of the Pareto front, highlighting the importance of MOC's identification of numerous calibration solutions. We devise a method for detecting overfitting in a multi-objective context, not previously possible, used to save computational effort by terminating MOC when no improved solutions will be found. MOC can significantly impact biological simulation, adding rigour to and speeding up an otherwise time-consuming calibration process and highlighting inappropriate biological capture by simulations that cannot be well calibrated.
As such, it produces more accurate simulations that generate more informative biological predictions.
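The core of MOC, maintaining a non-dominated archive of parameter scores and stopping once no improved solutions appear, can be sketched as follows. The one-parameter "simulation" and the patience-based stopping rule are illustrative assumptions, not the authors' immunological model or their overfitting test:

```python
import random

# Toy sketch of multi-objective calibration (MOC) with early stopping:
# random-search calibration of one parameter against two error metrics,
# terminating once the non-dominated archive stops improving for
# `patience` consecutive samples. The "simulation" is an invented
# stand-in, not the authors' immunological model.

def errors(theta):
    # distance of two simulated metrics from their biological targets
    return (abs(2.0 * theta - 1.0), abs(theta ** 2 - 0.5))

def dominates(a, b):
    """True if error vector a is at least as good as b and not equal."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def calibrate(patience=20, seed=1):
    random.seed(seed)
    archive, stagnant, rounds = [], 0, 0
    while stagnant < patience:
        rounds += 1
        cand = errors(random.uniform(0.0, 1.0))
        if any(dominates(a, cand) for a in archive):
            stagnant += 1                  # no improvement this round
            continue
        archive = [a for a in archive if not dominates(cand, a)]
        archive.append(cand)
        stagnant = 0                       # archive improved: keep going
    return archive, rounds

front, rounds = calibrate()
```

The returned archive approximates the Pareto front of calibration errors; the stagnation counter is a crude stand-in for the paper's more principled termination criterion.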


Author(s):  
G. Kalyani ◽  
K. Krishna Jyothi ◽  
T. Pratyusha

Most real-life optimization problems involve multiple objective functions. Finding a solution that satisfies the decision-maker is very difficult owing to conflict between the objectives. Furthermore, the solution depends on the decision-maker’s preference. Metaheuristic solution methods have become common tools to solve these problems. The task of obtaining solutions that take account of a decision-maker’s preference is at the forefront of current research. It is also possible to have multiple decision-makers with different preferences and with different decision-making powers. It may not be easy to express a preference using crisp numbers. In this study, the preferences of multiple decision-makers were simulated and a solution based on a genetic algorithm was developed to solve multi-objective optimization problems. The preferences were collected as fuzzy conditional trade-offs and they were updated while running the algorithm interactively with the decision-makers. The proposed method was tested using well-known benchmark problems. The solutions were found to converge around the Pareto front of the problems.
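A minimal sketch of the fuzzy-preference idea: a preference is encoded as a fuzzy membership function over each objective value, and the search maximizes the worst-case (minimum) degree of satisfaction, a common Bellman-Zadeh-style scalarization. The objectives, membership break-points, and plain random search (in place of the paper's interactive genetic algorithm) are all assumptions:

```python
import random

# Hedged sketch: a decision-maker's preference encoded as triangular
# fuzzy membership functions, with the overall satisfaction taken as
# the minimum membership across objectives (Bellman-Zadeh style).
# Objectives and break-points are invented; random search stands in
# for the paper's interactive genetic algorithm.

def triangular(x, lo, peak, hi):
    """Degree to which x satisfies a fuzzy 'about peak' preference."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def satisfaction(x):
    f1 = x ** 2              # objective 1 (cost-like)
    f2 = (x - 2.0) ** 2      # objective 2, conflicting with f1
    return min(triangular(f1, 0.0, 1.0, 2.0),
               triangular(f2, 0.0, 1.0, 2.0))

random.seed(0)
best = max((random.uniform(0.0, 2.0) for _ in range(2000)),
           key=satisfaction)
```

In the interactive setting, the decision-makers would reshape the membership functions between runs, steering the search toward their preferred region of the Pareto front.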


Energies ◽  
2018 ◽  
Vol 11 (9) ◽  
pp. 2190 ◽  
Author(s):  
Rafael Dawid ◽  
David McMillan ◽  
Matthew Revie

This paper is the first to capture the impact of uncertain maintenance action times on vessel routing for realistic offshore wind farm problems. A novel methodology is presented to incorporate uncertainties, e.g., in the expected maintenance duration, into the decision-making process. Users specify the extent to which these unknown elements impact the suggested vessel routing strategy. If uncertainties are present, the tool outputs multiple vessel routing policies with varying likelihoods of success. To demonstrate the tool’s capabilities, two case studies are presented. Firstly, simulations based on synthetic data illustrate that in a scenario with uncertainties, the cost-optimal solution is not necessarily the best choice for operators. Including uncertainties when calculating the vessel routing policy led to a 14% increase in the number of wind turbines maintained by the end of the day. Secondly, the tool was applied to a real-life scenario based on an offshore wind farm, in collaboration with a United Kingdom (UK) operator. The results showed that the assignment of vessels to turbines generated by the tool matched the policy chosen by wind farm operators. By producing a range of policies for consideration, the tool provides operators with a structured and transparent method to assess trade-offs and justify decisions.
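The effect of duration uncertainty on routing can be illustrated with a Monte-Carlo sketch. All numbers (shift length, travel time, duration distributions) are invented, and the two fixed visiting orders stand in for the tool's routing policies:

```python
import random

# Illustrative Monte-Carlo sketch (not the authors' tool): scoring two
# vessel routing policies when maintenance durations are uncertain.
# Shift length, travel time, and duration distributions are invented.

SHIFT_H = 12.0     # length of the working day at sea, hours
TRAVEL_H = 0.5     # assumed travel time between any two turbines

def turbines_completed(policy, durations):
    """Turbines finished within the shift for one draw of durations."""
    t, done = 0.0, 0
    for turbine in policy:
        t += TRAVEL_H + durations[turbine]
        if t > SHIFT_H:
            break
        done += 1
    return done

def expected_completions(policy, mean_h, sd_h, n=5000, seed=42):
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        draw = {k: max(0.5, rng.gauss(mean_h[k], sd_h[k])) for k in mean_h}
        total += turbines_completed(policy, draw)
    return total / n

mean_h = {"T1": 2.0, "T2": 2.0, "T3": 5.0, "T4": 2.0}
sd_h = {"T1": 0.2, "T2": 0.2, "T3": 2.5, "T4": 0.2}   # T3 is uncertain

risky = ["T3", "T1", "T2", "T4"]   # visit the uncertain job first
safe = ["T1", "T2", "T4", "T3"]    # defer the uncertain job
e_risky = expected_completions(risky, mean_h, sd_h)
e_safe = expected_completions(safe, mean_h, sd_h)
```

With these invented numbers, deferring the highly uncertain job completes more turbines per day on average than tackling it first, illustrating why the position of an uncertain task in the route matters to the policy's likelihood of success.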


2018 ◽  
Vol 54 (1) ◽  
pp. 3-17 ◽  
Author(s):  
Andrea Szalavetz

Abstract Despite a consensus view in the literature about the importance of cross-functional collaboration (CFC) for corporate environmental performance improvement, there is a dearth of studies that explain how exactly sustainability-oriented CFC can foster this objective. The purpose of this paper is to explain the role of CFC in corporate environmental performance improvement. We do this by undertaking two rounds of literature review, developing a proposition after the first round and collecting illuminative real-life examples that illustrate our arguments in the second. We propose and illustrate that CFC can effectively address two systemic properties of corporate environmental performance: trade-offs and interdependencies among different aspects of corporate environmental sustainability. If left unaddressed, these systemic specifics would result in organizational, managerial, and behavioral outcomes, such as inertia, opposition to change, and lack of information, which would turn into effective barriers to corporate environmental performance improvement. Put simply, CFC addresses these barriers through information sharing, knowledge building, and interest reconciliation.


Author(s):  
Tianxiang Liu ◽  
Li Mao ◽  
Mats-Erik Pistol ◽  
Craig Pryor

Abstract Calculating the electronic structure of systems involving very different length scales presents a challenge. Empirical atomistic descriptions such as pseudopotentials or tight-binding models allow one to calculate the effects of atomic placements, but the computational burden increases rapidly with the size of the system, limiting the ability to treat weakly bound extended electronic states. Here we propose a new method to connect atomistic and quasi-continuous models, thus speeding up tight-binding calculations for large systems. We divide a structure into blocks consisting of several unit cells which we diagonalize individually. We then construct a tight-binding Hamiltonian for the full structure using a truncated basis for the blocks, ignoring states having large energy eigenvalues and retaining states with an energy close to the band edge energies. A numerical test using a GaAs/AlAs quantum well shows the computation time can be decreased to less than 5% of the full calculation with errors of less than 1%. We give data for the trade-offs between computing time and loss of accuracy. We also tested calculations of the density of states for a GaAs/AlAs quantum well and find a ten times speedup without much loss in accuracy.
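The block-truncation idea can be reproduced on a toy problem. The four-site nearest-neighbour chain below (hopping t = −1, one retained state per two-site block) is an invented stand-in for the paper's GaAs/AlAs calculation:

```python
import math

# Minimal numerical sketch of the truncated-block method on a 4-site
# nearest-neighbour tight-binding chain (hopping T = -1, onsite 0).
# This toy stands in for the authors' GaAs/AlAs quantum-well case.

T = -1.0  # hopping amplitude

def eig2(a, b, c):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, c]]."""
    m, d = (a + c) / 2.0, math.hypot((a - c) / 2.0, b)
    return m - d, m + d

# Step 1: diagonalize each 2-site block [[0, T], [T, 0]] and keep only
# the low-energy state, (1, 1)/sqrt(2) with energy T (for T < 0).
low_energy = T
basis_amp = 1.0 / math.sqrt(2.0)

# Step 2: project the full Hamiltonian onto the two retained block
# states; the only off-diagonal term is the inter-block hopping
# between sites 1 and 2, weighted by the basis amplitudes.
h11 = low_energy
h12 = basis_amp * T * basis_amp
approx_ground, _ = eig2(h11, h12, h11)

# Exact open-chain spectrum: E_k = 2*T*cos(k*pi/5), k = 1..4.
exact_ground = 2.0 * T * math.cos(math.pi / 5.0)
rel_err = abs(approx_ground - exact_ground) / abs(exact_ground)
```

Keeping one of the two states per block halves the basis yet recovers the ground-state energy to within roughly 7% on this toy chain; the paper's larger blocks and band-edge state selection achieve far smaller errors.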


2021 ◽  
Vol 50 (2) ◽  
pp. 16-37 ◽
Author(s):  
Valentin Todorov

In a number of recent articles, Riani, Cerioli, Atkinson and others advocate the technique of monitoring robust estimates computed over a range of key parameter values. Through this approach, the diagnostic tools of choice can be tuned so as to obtain highly robust estimators that are as efficient as possible. The approach is applicable to various robust multivariate estimates like S- and MM-estimates, MVE and MCD, as well as to the Forward Search, in which monitoring is part of the robust method. A key tool for the detection of multivariate outliers and for the monitoring of robust estimates is the Mahalanobis distance and the statistics related to it. However, the results obtained with this tool in the case of compositional data might be unrealistic, since compositional data contain relative rather than absolute information and need to be transformed to the usual Euclidean geometry before standard statistical tools can be applied. Various transformations of compositional data have been introduced in the literature, and theoretical results exist on the equivalence of the additive, the centered, and the isometric log-ratio transformations in the context of outlier identification. To illustrate the problem of monitoring compositional data and to demonstrate the usefulness of monitoring in this case, we start with a simple example and then analyze a real-life data set presenting the technological structure of manufactured exports. The analysis is conducted with the R package fsdaR, which makes the analytical and graphical tools provided in the MATLAB FSDA library available to R users.
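The need for a log-ratio transformation before distance-based outlier detection can be shown with a small sketch. For brevity this uses the additive log-ratio (alr) and the classical, non-robust covariance, whereas the article monitors robust estimates (S, MM, MVE, MCD); the data are simulated:

```python
import math
import random

# Simplified sketch: compositional data must be log-ratio transformed
# before Euclidean tools such as the Mahalanobis distance apply. The
# additive log-ratio (alr) maps 3-part compositions to R^2; the
# classical covariance is used here for brevity, where the article's
# monitoring approach would use robust estimates instead.

def alr(x):
    return [math.log(x[0] / x[2]), math.log(x[1] / x[2])]

def mahalanobis_sq(data):
    """Squared Mahalanobis distances for 2-D rows, classical estimates."""
    n, d = len(data), 2
    mean = [sum(r[j] for r in data) / n for j in range(d)]
    c = [[sum((r[i] - mean[i]) * (r[j] - mean[j]) for r in data) / (n - 1)
          for j in range(d)] for i in range(d)]
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    out = []
    for r in data:
        u = [r[0] - mean[0], r[1] - mean[1]]
        out.append(u[0] * (inv[0][0] * u[0] + inv[0][1] * u[1]) +
                   u[1] * (inv[1][0] * u[0] + inv[1][1] * u[1]))
    return out

random.seed(3)
comps = []
for _ in range(50):
    a, b, c = (random.lognormvariate(0, 0.1) for _ in range(3))
    s = a + b + c
    comps.append((a / s, b / s, c / s))   # rows sum to one
comps.append((0.90, 0.05, 0.05))          # a compositional outlier

d2 = mahalanobis_sq([alr(x) for x in comps])
```

After the alr transformation the planted outlier stands out clearly in the Mahalanobis distances; a robust covariance estimate would separate it even more sharply, which is what the monitoring approach exploits.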


Author(s):  
Yoshifumi Mori ◽  
Takashi Saito ◽  
Yu Mizobe

We focus on the vibration characteristics of reciprocating compressors. We constructed a mathematical model to calculate the natural frequencies and modes as functions of crank angle, and proposed a method to estimate the degree and the likely location of a failure from differences between parameter values obtained from in-operation measurement data and from the mathematical model. In this paper, following the proposed method, a case study is carried out using field data acquired before and after failures occurred in the connecting parts of the connecting rod, to examine the difference between the parameter values for the two operating states. Inspecting the resonant characteristics in the frequency-response data relating to the natural frequencies of the bending modes of the piston rod, we determined two resonant frequencies corresponding to the first and second bending modes of the piston rod. To match the natural frequencies calculated from an eigenvalue analysis of the proposed model with these resonant frequencies, we define an error function for the identification problem, i.e., an optimization problem. From the identification results, we find that some parameter values differ considerably between the two operating states, and that the corresponding failure could occur around the connecting rod. The proposed method thus shows the possibility of detecting both the change in parameter values and the deteriorated parts across the two operating states.
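The identification step, choosing model parameters so that calculated natural frequencies match the measured resonances by minimizing an error function, can be sketched as follows. The one-parameter rod model, modal coefficients, and frequency values are invented, and a grid search stands in for the paper's eigenvalue-analysis-based optimization:

```python
import math

# Hedged sketch of the identification step: pick the stiffness k of a
# simple two-mode rod model so that the model's natural frequencies
# match the two measured resonances. All numbers are invented; the
# real model comes from the paper's eigenvalue analysis.

C1, C2 = 1.0, 2.8      # assumed modal coefficients, f_n = C_n * sqrt(k)

def error(k, measured):
    """Squared mismatch between measured and calculated frequencies."""
    f = (C1 * math.sqrt(k), C2 * math.sqrt(k))
    return sum((fm - fc) ** 2 for fm, fc in zip(measured, f))

def identify(measured, k_grid):
    return min(k_grid, key=lambda k: error(k, measured))

k_grid = [1.0 + 0.01 * i for i in range(1001)]   # 1.0 .. 11.0
k_healthy = identify((3.0, 8.4), k_grid)         # before failure
k_faulty = identify((2.7, 7.6), k_grid)          # after failure
```

The identified stiffness drops between the two operating states; a parameter shift of this kind is what the method uses to estimate the degree of failure and localize the deteriorated part.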


2021 ◽  
Author(s):  
Carlo Cristiano Stabile ◽  
Marco Barbiero ◽  
Giorgio Fighera ◽  
Laura Dovera

Abstract Optimizing well locations for a green field is critical to mitigating development risks. Performing such workflows with reservoir simulations is very challenging due to the huge computational cost. Proxy models can instead provide accurate estimates at a fraction of the computing time. This study presents an application of new-generation functional proxies to optimize the well locations in a real oil field with respect to the actualized oil production over all the different geological realizations. Proxies are built with Universal Trace Kriging and are functional in time, allowing oil flows to be actualized over the asset lifetime. Proxies are trained on reservoir simulations using randomly sampled well locations. Two proxies are created, for a pessimistic model (P10) and a mid-case model (P50), to capture the geological uncertainties. The optimization step uses the Non-dominated Sorting Genetic Algorithm, with the discounted oil productions of the two proxies as objective functions. An adaptive approach was employed: optimized points found in a first optimization were used to re-train the proxy models, and a second run of optimization was performed. The methodology was applied to a real oil reservoir to optimize the location of four vertical production wells and compared against reference locations. 111 geological realizations were available, in which one relevant uncertainty is the presence of possible compartments. The decision space, represented by the horizontal translation vectors for each well, was sampled using Plackett-Burman and Latin-Hypercube designs. A first application produced a proxy with poor predictive quality. Redrawing the areas to avoid overlaps and to confine the decision space of each well to one compartment improved the quality. This suggests that the proxy's predictive ability deteriorates in the presence of highly non-linear responses caused by sealing faults or by wells interchanging positions.
We then followed a two-step adaptive approach: a first optimization was performed and the resulting Pareto front was validated with reservoir simulations; to further improve the proxy quality in this region of the decision space, the validated Pareto-front points were added to the initial dataset to retrain the proxy and rerun the optimization. The final well locations were validated on all 111 realizations with reservoir simulations and resulted in an overall increase of the discounted production of about 5% compared to the reference development strategy. The adaptive approach, combined with functional proxies, proved successful in improving the workflow by purposefully enriching the training set with data points able to enhance the effectiveness of the optimization step. Each optimization run relies on about 1 million proxy evaluations, which required negligible computational time. The same workflow carried out with standard reservoir simulations would have been practically unfeasible.
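The adaptive train-optimize-validate-retrain loop can be sketched in one dimension. The "reservoir response" and quadratic least-squares proxy below are stand-ins for the paper's reservoir simulator and Universal Trace Kriging proxies:

```python
# Hedged 1-D sketch of the adaptive proxy loop: fit a cheap polynomial
# proxy to a few expensive "simulations", optimize on the proxy,
# validate the proxy optimum against the true model, append the
# validated point to the training set, and refit. The toy response and
# quadratic fit stand in for the reservoir simulator and the paper's
# Universal Trace Kriging proxies.

def true_model(x):                       # expensive simulator stand-in
    u = x - 0.62
    return u * u + 0.3 * u ** 3

def fit_quadratic(pts):
    """Least-squares fit of c0 + c1*x + c2*x^2 via normal equations."""
    s = [sum(x ** k for x, _ in pts) for k in range(5)]
    t = [sum(y * x ** k for x, y in pts) for k in range(3)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    M = [row[:] + [t[i]] for i, row in enumerate(A)]
    for col in range(3):                 # elimination, partial pivoting
        p = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    c = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                  # back substitution
        c[r] = (M[r][3] - sum(M[r][k] * c[k]
                              for k in range(r + 1, 3))) / M[r][r]
    return c

def proxy_argmin(c):
    return -c[1] / (2.0 * c[2])          # vertex of the fitted parabola

data = [(x / 4.0, true_model(x / 4.0)) for x in range(5)]   # initial DoE
for _ in range(3):                       # adaptive refinement rounds
    x_star = proxy_argmin(fit_quadratic(data))
    data.append((x_star, true_model(x_star)))   # validate and retrain

x_final = proxy_argmin(fit_quadratic(data))
```

Each round spends one "expensive" evaluation exactly where the proxy claims the optimum lies, which is the same rationale as validating the Pareto-front points with reservoir simulations before retraining.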


Author(s):  
Mari Aino Hukkalainen ◽  
Krzysztof Klobut ◽  
Tarja Mäkeläinen ◽  
Vanda Dimitriou ◽  
Dariusz Leszczyński

Practical guidelines are presented for an improved process for the design and retrofitting of energy-efficient buildings, with the aim of integrating buildings better with the neighbourhood energy system, among other means through energy matching. The chapter describes the role of energy simulations in an integrated building retrofitting process and how to select technologies for retrofitting toward the nearly zero-energy building level. The feasibility of a holistic analysis of retrofitting options can be increased through the integration of BIM, well-populated and linked databases, and a multi-criteria decision-making approach. Multiple-criteria decision-making methods help take into account a number of building energy performance and user-preference-related criteria, and the trade-offs between the different criteria for each retrofitting option. The real-life viewpoints and benefits of using the developed methods and processes are discussed, especially from an Eastern European perspective.
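A minimal illustration of the multi-criteria step: retrofit options scored against weighted, normalized criteria. The options, criterion values, and weights are invented, and a simple weighted sum stands in for whichever multiple-criteria method the chapter's workflow adopts; real values would come from the BIM-linked databases described above:

```python
# Illustrative weighted-sum multi-criteria ranking of retrofit options.
# Options, criterion values, and weights are invented examples; a real
# workflow would draw them from BIM-linked databases and user input.

# criteria per option:
# (annual energy saving kWh/m2, investment cost eur/m2, comfort 1-5)
options = {
    "facade insulation": (45.0, 120.0, 4.0),
    "heat-pump retrofit": (60.0, 180.0, 3.8),
    "window replacement": (30.0, 80.0, 4.5),
}
weights = (0.4, 0.35, 0.25)        # user preference among criteria
benefit = (True, False, True)      # cost is a criterion to minimize

def score(values):
    total = 0.0
    for j, v in enumerate(values):
        col = [o[j] for o in options.values()]
        lo, hi = min(col), max(col)
        norm = (v - lo) / (hi - lo)           # min-max normalization
        if not benefit[j]:
            norm = 1.0 - norm                 # invert cost-type criteria
        total += weights[j] * norm
    return total

ranking = sorted(options, key=lambda k: score(options[k]), reverse=True)
```

Changing the weights shifts the ranking, which is exactly the trade-off between criteria that the decision-making step makes explicit for each retrofitting option.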


2018 ◽  
Vol 10 (11) ◽  
pp. 4001 ◽  
Author(s):  
Johanna Jacobi ◽  
Aymara Llanque

Our global food system is characterized by an increasing concentration and imbalance of power, with trade-offs between hunger, inequality, unsustainable production and consumption, and profit. A systematic analysis of power imbalances in food systems is required if we are to meet the 2030 Agenda vision of promoting sustainable production and consumption patterns and ending hunger and poverty. Such an analysis, with a view to a transformation to more sustainable and just food systems, requires tools to be developed and tested in real-life case studies of food systems. To better understand the structures and mechanisms around power in food systems, this study applies a political ecology lens. We adapted the “power cube” analysis framework that was proposed by the Institute of Development Studies for the analysis of spaces, forms, and levels of power. We apply the analysis of these three dimensions of power to two food systems in the tropical lowlands of Bolivia: one agroindustrial and one indigenous. After identifying food system actors, the food system spaces in which they interact, and what forms of power they use at what levels, we discuss some implications for an emerging scientific culture of power analyses in critical sustainability assessments. Mechanisms of hidden power undermine visible legislative power in both case studies, but in our example of an indigenous food system of the Guaraní people, visible power stays with a local community through their legally recognized and communally owned and governed territory, with important implications for the realization of the right to food.

