Variable Fidelity Modeling as Applied to Trajectory Optimization for a Hydraulic Backhoe

Author(s):  
Roxanne A. Moore ◽  
Christiaan J. J. Paredis

Modeling, simulation, and optimization play vital roles throughout the engineering design process; however, in many design disciplines the cost of simulation is high, and designers are faced with a tradeoff between the number of alternatives that can be evaluated and the accuracy with which they are evaluated. In this paper, a methodology is presented for using models of various levels of fidelity during the optimization process. The intent is to use inexpensive, low-fidelity models with limited accuracy to recognize poor design alternatives and reserve the high-fidelity, accurate, but also expensive models only to characterize the best alternatives. Specifically, by setting a user-defined performance threshold, the optimizer can explore the design space using a low-fidelity model by default, and switch to a higher fidelity model only if the performance threshold is attained. In this manner, the high fidelity model is used only to discern the best solution from the set of good solutions, so computational resources are conserved until the optimizer is close to the solution. This makes the optimization process more efficient without sacrificing the quality of the solution. The method is illustrated by optimizing the trajectory of a hydraulic backhoe. To characterize the robustness and efficiency of the method, a design space exploration is performed using both the low and high fidelity models, and the optimization problem is solved multiple times using the variable fidelity framework.
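
The threshold-switching logic described in this abstract can be sketched as follows. This is a minimal, hypothetical illustration: the function names and the cheap analytic stand-ins for the backhoe trajectory models are assumptions, not the authors' code.

```python
def low_fidelity(x):
    # Cheap, biased stand-in for the low-fidelity trajectory model.
    return (x - 2.0) ** 2 + 0.5   # systematic error of +0.5

def high_fidelity(x):
    # Expensive, accurate stand-in; assumed truth in this sketch.
    return (x - 2.0) ** 2

def evaluate(x, threshold, counter):
    """Evaluate a candidate with the low-fidelity model by default and
    re-evaluate with the high-fidelity model only when the candidate
    beats the user-defined performance threshold."""
    f_lo = low_fidelity(x)
    counter["lo"] += 1
    if f_lo <= threshold:        # promising design: verify accurately
        counter["hi"] += 1
        return high_fidelity(x)
    return f_lo                  # poor design: the cheap estimate suffices

counter = {"lo": 0, "hi": 0}
candidates = [0.0, 1.0, 1.9, 2.05, 3.5]
scores = [evaluate(x, threshold=1.0, counter=counter) for x in candidates]
best_score, best_x = min(zip(scores, candidates))
```

Only the two near-optimal candidates trigger the expensive model; the rest are screened out by the cheap one, which is the efficiency claim of the paper in miniature.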

2014 ◽  
Vol 27 (2) ◽  
pp. 235-249 ◽  
Author(s):  
Anirban Sengupta ◽  
Reza Sedaghat ◽  
Vipul Mishra

Design space exploration is an indispensable segment of High Level Synthesis (HLS) design of hardware accelerators. This paper presents a novel technique for area-execution time tradeoff using residual load decoding heuristics in genetic algorithms (GAs) for integrated design space exploration (DSE) of scheduling and allocation. The approach also addresses issues encountered during DSE of data paths for hardware accelerators, such as the accuracy of the solution found and the total exploration time. The integrated solution found by the proposed approach satisfies the user-specified constraints on hardware area and total execution time (not just latency), while offering a unified, chaining-based schedule and allocation. The cost function proposed in the genetic algorithm takes into account the functional units, multiplexers, and demultiplexers needed during implementation. The proposed exploration system (ExpSys) was tested on a large number of benchmarks drawn from the literature to assess its efficiency. Results indicate an average improvement in Quality of Results (QoR) greater than 26% when compared to a recent, well-known GA-based exploration method.
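
A cost function of the kind described, trading hardware area against total execution time, might look like the following sketch. The unit areas, budgets, and weight are illustrative assumptions, not the paper's values.

```python
def cost(n_fu, n_mux, n_demux, exec_time, a_max, t_max, w=0.5):
    """Normalized area/execution-time cost of a candidate schedule and
    allocation; lower is better. Unit areas (10 per functional unit,
    2 per mux/demux) and the weight w are illustrative assumptions."""
    area = 10 * n_fu + 2 * n_mux + 2 * n_demux
    return w * (area / a_max) + (1 - w) * (exec_time / t_max)

# Two hypothetical design points: more hardware but faster, versus
# less hardware but slower, under the same area/time budgets.
fast_big   = cost(n_fu=4, n_mux=6, n_demux=6, exec_time=20, a_max=100, t_max=100)
slow_small = cost(n_fu=2, n_mux=3, n_demux=3, exec_time=60, a_max=100, t_max=100)
```

In a GA-based DSE, a scalar like this would rank chromosomes encoding schedule-and-allocation candidates; shifting w moves the search along the area-time tradeoff curve.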


Author(s):  
Matthew A. Williams ◽  
Andrew G. Alleyne

In the early stages of control system development, designers often require multiple iterations to validate control designs in simulation. This can make high-fidelity models undesirable due to the computational complexity and time required for simulation. As a solution, lower-fidelity or simplified models are used for initial designs before controllers are tested on higher-fidelity models. In the event that unmodeled dynamics cause the controller to fail when applied to a higher-fidelity model, an iterative design-and-validation process may be required. In this paper, a switched-fidelity modeling formulation for closed-loop dynamical systems is proposed to reduce computational effort while maintaining accuracy in system outputs and control inputs. The effects on computational effort and accuracy are investigated by applying the formulation to a traditional vapor compression system with high- and low-fidelity models of the evaporator and condenser. In this sample case, the switched-fidelity framework closely matched the outputs and inputs of the high-fidelity model while decreasing computational cost by 32% relative to the high-fidelity model. For contrast, the low-fidelity model decreases computational cost by 48% relative to the high-fidelity model.
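
The switched-fidelity idea can be sketched on a toy closed loop: run the cheap plant model by default and switch to the detailed one only during transients that excite the unmodeled dynamics. The first-order plant, the cubic "parasitic" term, and the switching rule below are assumptions for illustration; the paper's plant is a vapor compression system.

```python
def step_high(x, u, dt):
    # "High-fidelity" plant: includes a nonlinear term standing in for
    # fast dynamics that the simplified model omits.
    return x + dt * (-x + u - 0.1 * x ** 3)

def step_low(x, u, dt):
    # "Low-fidelity" plant: linear approximation, cheaper per step.
    return x + dt * (-x + u)

def simulate(x0, setpoint, steps, dt, switch_err):
    """Run the low-fidelity model by default and switch to the
    high-fidelity model whenever the tracking error is large, i.e.
    during transients that excite the unmodeled dynamics."""
    x, hi_steps = x0, 0
    for _ in range(steps):
        u = 2.0 * (setpoint - x)              # simple proportional controller
        if abs(setpoint - x) > switch_err:
            x, hi_steps = step_high(x, u, dt), hi_steps + 1
        else:
            x = step_low(x, u, dt)
    return x, hi_steps

x_final, hi_steps = simulate(x0=0.0, setpoint=1.0, steps=200, dt=0.05, switch_err=0.5)
```

Only the initial transient runs on the expensive model; once the loop settles, the cheap model carries the rest of the simulation, which is the source of the computational savings the abstract reports.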


Author(s):  
Gilberto Mejía-Rodríguez ◽  
John E. Renaud ◽  
Vikas Tomar

Research applications involving design tool development for multiphase material design are at an early stage of development. The computational requirements of advanced numerical tools for simulating material behavior, such as the finite element method (FEM) and the molecular dynamics (MD) method, can prohibit direct integration of these tools in a design optimization procedure where multiple iterations are required. The complexity of multiphase material behavior at multiple scales restricts the development of a comprehensive meta-model that can replace the multiscale analysis. One, therefore, requires a design approach that can incorporate multiple simulations (multiphysics) of varying fidelity, such as FEM and MD, in an iterative model management framework that can significantly reduce design cycle times. In this research a material design tool based on a variable fidelity model management framework is presented. In the variable fidelity material design tool, complex “high-fidelity” FEM analyses are performed only to guide the analytic “low-fidelity” model toward the optimal material design. The tool is applied to obtain the optimal distribution of a second phase, consisting of silicon carbide (SiC) fibers, in a silicon-nitride (Si3N4) matrix to obtain continuous fiber SiC-Si3N4 ceramic composites (CFCCs) with optimal fracture toughness. Using the variable fidelity material design tool on one test problem, a reduction in design cycle time of around 80% is achieved compared to a conventional design optimization approach that exclusively calls the high-fidelity FEM.


Author(s):  
Xin Zhao ◽  
Smruti Sahoo ◽  
Konstantinos Kyprianidis ◽  
Sharmila Sumsurooah ◽  
Giorgio Valente ◽  
...  

Abstract To achieve the goals of substantial improvements in efficiency and emissions set by Flightpath 2050, fundamentally different concepts are required. As one of the most promising solutions, electrification of the aircraft's primary propulsion is currently a prime focus of research and development. Unconventional propulsion sub-systems, mainly the electrical power system, the associated thermal management system, and the transmission system, provide a variety of options for integration into existing propulsion systems. Different combinations of the gas turbine and the unconventional propulsion sub-systems introduce different configurations and operation control strategies. The trade-off between the use of the two energy sources, jet fuel and electrical energy, is primarily a result of the trade-offs between the efficiencies and sizing characteristics of these sub-systems. The aircraft structure and performance are the final carrier of these trade-offs. Hence, full design space exploration of various hybrid derivatives requires a global investigation of the entire aircraft that considers these key propulsion sub-systems, the aircraft structure and performance, and their interactions. This paper presents a recent contribution to the development of a physics-based simulation and optimization platform for hybrid electric aircraft conceptual design. Modeling of each sub-system and of the aircraft structure is described, as well as the aircraft performance modeling and integration technique. With a focus on the key propulsion sub-systems, the aircraft structure, and performance, and with interfaces to existing conceptual design frameworks, this platform aims at full design space exploration of various hybrid concepts at low TRL.


2008 ◽  
Vol 130 (9) ◽  
Author(s):  
Gilberto Mejía-Rodríguez ◽  
John E. Renaud ◽  
Vikas Tomar

Research applications involving design tool development for multiphase material design are at an early stage of development. The computational requirements of advanced numerical tools for simulating material behavior, such as the finite element method (FEM) and the molecular dynamics (MD) method, can prohibit direct integration of these tools in a design optimization procedure where multiple iterations are required. One, therefore, requires a design approach that can incorporate multiple simulations (multiphysics) of varying fidelity such as FEM and MD in an iterative model management framework that can significantly reduce design cycle times. In this research a material design tool based on a variable fidelity model management framework is presented. In the variable fidelity material design tool, complex “high-fidelity” FEM analyses are performed only to guide the analytic “low-fidelity” model toward the optimal material design. The tool is applied to obtain the optimal distribution of a second phase, consisting of silicon carbide (SiC) fibers, in a silicon-nitride (Si3N4) matrix to obtain continuous fiber SiC–Si3N4 ceramic composites with optimal fracture toughness. Using the variable fidelity material design tool in application to two test problems, a reduction in design cycle times of between 40% and 80% is achieved as compared to using a conventional design optimization approach that exclusively calls the high-fidelity FEM. The optimal design obtained using the variable fidelity approach is the same as that obtained using the conventional procedure. The variable fidelity material design tool is extensible to multiscale multiphase material design by using MD based material performance analyses as the high-fidelity analyses in order to guide low-fidelity continuum level numerical tools such as the FEM or finite-difference method with significant savings in the computational time.
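
One common way to let a high-fidelity analysis "guide" a low-fidelity model, broadly in the spirit of the framework described above, is a first-order additive correction: shift and tilt the cheap model so it matches the expensive model's value and slope at the current iterate, then optimize the corrected model. The quadratic toy functions below are stand-ins, not the paper's FEM and fracture-toughness models.

```python
def f_hi(x):
    # Expensive "high-fidelity" model (stand-in for an FEM analysis).
    return (x - 3.0) ** 2 + 1.0

def f_lo(x):
    # Cheap "low-fidelity" model whose optimum is offset from the truth.
    return (x - 2.5) ** 2 + 0.5

def corrected_minimize(x0, iters=3, h=1e-6):
    """First-order additive correction: shift and tilt the low-fidelity
    model so it matches the high-fidelity value and slope at the current
    iterate, then minimize the corrected model (grid search stands in
    for the inner optimizer)."""
    x, hi_calls = x0, 0
    for _ in range(iters):
        d = f_hi(x) - f_lo(x)                                        # value mismatch
        g = ((f_hi(x + h) - f_hi(x)) - (f_lo(x + h) - f_lo(x))) / h  # slope mismatch
        hi_calls += 2                       # two high-fidelity calls per iteration
        grid = [x + 0.001 * k for k in range(-3000, 3001)]
        x = min(grid, key=lambda y: f_lo(y) + d + g * (y - x))
    return x, hi_calls

x_opt, hi_calls = corrected_minimize(x0=1.0)
```

Even though the cheap model's own optimum sits at 2.5, the corrected search lands on the high-fidelity optimum at 3.0 while spending only a handful of expensive evaluations per iteration, which is the design-cycle-time saving the abstract quantifies.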


Author(s):  
Matthew A. Prior ◽  
Ian C. Stults ◽  
Matthew J. Daskilewicz ◽  
Scott J. Duncan ◽  
Brian J. German ◽  
...  

The demand for greater efficiency, lower emissions, and higher reliability in combined cycle power plants has driven industry to use higher-fidelity plant component models in conceptual design. Normally used later in preliminary component design, physics-based models can also be used in conceptual design as the building blocks of a plant-level modeling and simulation (M&S) environment. Although better designs can be discovered using such environments, the linking of multiple high-fidelity models can create intractably large design variable sets, long overall execution times, and model convergence limitations. As a result, an M&S environment comprising multiple linked high-fidelity models can be prohibitively large and/or slow to evaluate, discouraging design optimization and design space exploration. This paper describes a design space exploration methodology that addresses the aforementioned challenges. Specifically, the proposed methodology includes techniques for the reduction of total model run-time, reduction of design space dimensionality, effect visualization, and identification of Pareto-optimal power plant designs. An overview of the methodology’s main steps is given, leading to a description of the benefit and implementation of each step. Major steps in the process include design variable screening, efficient design space sampling, and surrogate modeling, all of which can be used as precursors to traditional optimization techniques. As an alternative to optimization, a Monte Carlo-based method for design space exploration is explained conceptually. Selected steps from the methodology are applied to a fictional but representative example problem of combined cycle power plant design. The objective is to minimize cost of electricity (COE), subject to constraints on base load power and acquisition cost. This example problem is used to show the relative run-time savings from using the methodology’s techniques compared to performing optimization without them. The example additionally provides a context for explaining the design space visualization techniques that are part of the methodology.
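
The Monte Carlo design space exploration step mentioned above can be sketched as a sample-filter-rank loop. The COE, power, and acquisition-cost functions and all numeric budgets below are illustrative stand-ins for the paper's plant models, not its actual formulation.

```python
import random

def coe(x):        # objective: cost of electricity (illustrative model)
    return 50.0 + 10.0 * (x["gt_size"] - 0.6) ** 2 + 5.0 * x["st_size"]

def power(x):      # constraint: base load power must be >= 400
    return 700.0 * x["gt_size"] + 300.0 * x["st_size"]

def acq_cost(x):   # constraint: acquisition cost must be <= 900
    return 800.0 * x["gt_size"] + 600.0 * x["st_size"]

random.seed(0)     # deterministic sampling for reproducibility
samples = [{"gt_size": random.uniform(0.3, 1.0),
            "st_size": random.uniform(0.1, 0.8)} for _ in range(5000)]

# Filter to the feasible region, then pick the cheapest feasible design.
feasible = [x for x in samples if power(x) >= 400.0 and acq_cost(x) <= 900.0]
best = min(feasible, key=coe)
```

In the full methodology, the sampled designs would be evaluated through surrogate models fitted after variable screening rather than through the true plant models, which is what keeps this brute-force exploration affordable.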


Author(s):  
Yong Hoon Lee ◽  
R. E. Corman ◽  
Randy H. Ewoldt ◽  
James T. Allison

A novel multiobjective adaptive surrogate modeling-based optimization (MO-ASMO) framework is proposed to use a minimal number of training samples efficiently for sequential model updates. All sample points are enforced to be feasible and to provide coverage of sparsely explored design regions through a new optimization subproblem. The MO-ASMO method evaluates high-fidelity functions only at feasible sample points. During the exploitation sampling phase, samples are selected to enhance solution accuracy rather than global exploration. Sampling tasks are especially challenging for multiobjective optimization: for an n-dimensional design space, a strategy is required for generating model update sample points near an (n − 1)-dimensional hypersurface corresponding to the Pareto set in the design space. This is addressed here using a force-directed layout algorithm, adapted from graph visualization strategies, to distribute feasible sample points evenly near the estimated Pareto set. Model validation samples are chosen uniformly on the Pareto set hypersurface, and surrogate model estimates at these points are compared to high-fidelity model responses. All high-fidelity model evaluations are stored for later use to train an updated surrogate model. The MO-ASMO algorithm, along with the set of new sampling strategies, is tested using two mathematical problems and one realistic engineering problem. The second mathematical test problem is specifically designed to test the limits of this algorithm in coping with very narrow, non-convex feasible domains. It involves oscillatory objective functions, giving rise to a discontinuous set of Pareto-optimal solutions. The third test problem demonstrates that the MO-ASMO algorithm can handle a practical engineering problem with more than 10 design variables and black-box simulations. The efficiency of the MO-ASMO algorithm is demonstrated by comparing its results on the two mathematical problems to those of the NSGA-II algorithm in terms of the number of high-fidelity function evaluations; the MO-ASMO algorithm reduces total function evaluations by several orders of magnitude when converging to the same Pareto sets.
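
The force-directed layout idea can be illustrated in one dimension: parametrize the estimated Pareto curve by t in [0, 1] and let clustered sample points repel each other until the spacing evens out. This toy relaxation (anchored endpoints, spring-like forces toward neighbor midpoints) is an assumed simplification of the graph-layout scheme the abstract describes.

```python
def relax(ts, iters=5000, step=0.5):
    """Iteratively push each interior point toward the midpoint of its
    neighbors (a spring-like force); endpoints stay anchored. The spacing
    along the curve parameter evens out as the layout relaxes."""
    ts = sorted(ts)
    for _ in range(iters):
        forces = [0.0] * len(ts)
        for i in range(1, len(ts) - 1):
            forces[i] = (ts[i - 1] + ts[i + 1]) / 2.0 - ts[i]
        ts = [t + step * f for t, f in zip(ts, forces)]
    return ts

# Points clustered at one end of the estimated Pareto curve...
clustered = [0.0, 0.05, 0.1, 0.15, 0.9, 1.0]
spread = relax(clustered)   # ...relax toward evenly spaced samples
```

In n dimensions the same repulsion acts on points constrained near the (n − 1)-dimensional Pareto hypersurface, giving evenly distributed model-update samples where the surrogate most needs accuracy.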



Fluids ◽  
2020 ◽  
Vol 5 (1) ◽  
pp. 36 ◽  
Author(s):  
Joel Guerrero ◽  
Luca Mantelli ◽  
Sahrish B. Naqvi

In this manuscript, an automated framework dedicated to design space exploration and design optimization studies is presented. The framework integrates a set of numerical simulation, computer-aided design, numerical optimization, and data analytics tools using scripting capabilities. The tools used are open-source and freeware, and can be deployed on any platform. The main feature of the proposed methodology is the use of a cloud-based parametric computer-aided design application, which allows the user to change any parametric variable defined in the solid model. We demonstrate the capabilities and flexibility of the framework using computational fluid dynamics applications; however, the same workflow can be used with any numerical simulation tool (e.g., a structural solver or a spreadsheet) that is able to interact via a command-line interface or using scripting languages. We conduct design space exploration and design optimization studies using quantitative and qualitative metrics, and, to reduce the high computing times and computational resources intrinsic to these kinds of studies, concurrent simulations and surrogate-based optimization are used.
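
The scripted, concurrent parameter sweep at the heart of such a framework can be sketched as below. A cheap analytic function stands in for the CFD solver that the real workflow would drive through its command-line interface; the design variables and the drag-like metric are assumptions for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def run_case(params):
    """In the real workflow this step would regenerate the parametric CAD
    model and launch the solver through its command-line interface; here a
    cheap analytic drag-like metric stands in for the CFD run."""
    angle, chord = params
    return {"angle": angle, "chord": chord,
            "drag": 0.01 + 0.002 * angle ** 2 + 0.05 * (chord - 1.0) ** 2}

# Full-factorial sweep over two hypothetical design variables.
cases = [(a, c) for a in (0, 2, 4, 6) for c in (0.8, 1.0, 1.2)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_case, cases))

best = min(results, key=lambda r: r["drag"])
```

Threads suit this pattern because each case would spend its time waiting on an external solver process; the collected results then feed the data analytics and surrogate-based optimization stages of the framework.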

