Robust Bayesian Sequential Input Shaping for Optimal Li-Ion Battery Model Parameter Identifiability

Author(s):  
Michael J. Rothenberger ◽  
Hosam K. Fathy

This paper examines the challenge of shaping a battery’s input trajectory to (i) maximize its Fisher parameter identifiability while (ii) achieving robustness to parameter uncertainties. The paper is motivated by earlier research showing that the speed and accuracy with which battery parameters can be estimated both improve significantly when battery inputs are optimized for Fisher identifiability. Previous research performs this trajectory optimization for a known nominal parameter set. This creates a circular dependency, where accurate parameter estimates are a prerequisite for the very identifiability optimization intended to produce them. In contrast, this paper presents an iterative scheme that: (i) uses prior parameter probability distributions to create a weighted Fisher metric; (ii) optimizes the battery input trajectory for this metric using a genetic algorithm; (iii) applies the resulting input trajectory to the battery; (iv) estimates battery parameters using a Bayesian particle filter; (v) re-computes the weighted Fisher information metric using the resulting posterior parameter distribution; and (vi) repeats this process until convergence. This approach builds on well-established ideas from the estimation literature and applies them to the battery domain for the first time. Simulation studies highlight the ability of this iterative algorithm to converge quickly towards the correct battery parameter values, despite large initial parameter uncertainties.
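The weighted Fisher metric of step (i) can be sketched for a hypothetical one-parameter model. The first-order response, noise level, and prior below are illustrative assumptions, not the paper's actual battery model:

```python
import math
import random

def sensitivity(u, t, tau):
    # Finite-difference sensitivity df/dtau of a toy response
    # f(u, t; tau) = u * exp(-t / tau)  (hypothetical first-order dynamics)
    h = 1e-6 * tau
    f = lambda p: u * math.exp(-t / p)
    return (f(tau + h) - f(tau - h)) / (2 * h)

def fisher_info(inputs, tau, sigma=0.05):
    # Scalar Fisher information for tau under additive Gaussian sensor noise
    return sum(sensitivity(u, t, tau) ** 2
               for t, u in enumerate(inputs, start=1)) / sigma ** 2

def weighted_fisher(inputs, prior_samples, sigma=0.05):
    # Step (i): average the Fisher metric over the prior parameter
    # distribution, so the input is informative across the plausible range
    return sum(fisher_info(inputs, tau, sigma)
               for tau in prior_samples) / len(prior_samples)

random.seed(0)
prior = [random.gauss(10.0, 2.0) for _ in range(200)]   # prior belief about tau
candidate_input = [1.0, -1.0, 1.0, -1.0, 0.5]           # input profile to score
score = weighted_fisher(candidate_input, prior)
```

Steps (ii)-(vi) then wrap this scoring function in an optimizer and a Bayesian update, replacing `prior` with the posterior samples after each experiment.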

Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 853
Author(s):  
Philipp Frank ◽  
Reimar Leike ◽  
Torsten A. Enßlin

Efficiently accessing the information contained in non-linear and high dimensional probability distributions remains a core challenge in modern statistics. Traditionally, estimators that go beyond point estimates are either categorized as Variational Inference (VI) or Markov-Chain Monte-Carlo (MCMC) techniques. While MCMC methods that utilize the geometric properties of continuous probability distributions to increase their efficiency have been proposed, VI methods rarely use the geometry. This work aims to fill this gap and proposes geometric Variational Inference (geoVI), a method based on Riemannian geometry and the Fisher information metric. It is used to construct a coordinate transformation that relates the Riemannian manifold associated with the metric to Euclidean space. The distribution, expressed in the coordinate system induced by the transformation, takes a particularly simple form that allows for an accurate variational approximation by a normal distribution. Furthermore, the algorithmic structure allows for an efficient implementation of geoVI which is demonstrated on multiple examples, ranging from low-dimensional illustrative ones to non-linear, hierarchical Bayesian inverse problems in thousands of dimensions.
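The core idea, a coordinate transformation that flattens the Fisher metric, can be checked in one dimension. For a Poisson likelihood with rate λ the Fisher metric is 1/λ, and y(λ) = 2√λ pulls it back to the Euclidean metric; this is a textbook special case, not geoVI's general multivariate construction:

```python
import math

def fisher_metric(lam):
    # Fisher information metric of a Poisson likelihood with rate lam
    return 1.0 / lam

def transform(lam):
    # y(lam) = integral of sqrt(fisher_metric(lam)) d lam = 2 * sqrt(lam)
    return 2.0 * math.sqrt(lam)

def pullback_metric(lam, h=1e-6):
    # Metric expressed in the y coordinate: m(lam) / (dy/dlam)^2.
    # If the transform flattens the geometry, this is 1 for every lam.
    dy = (transform(lam + h) - transform(lam - h)) / (2 * h)
    return fisher_metric(lam) / dy ** 2
```

In the flattened coordinate a normal distribution is a natural variational family, which is the motivation the abstract describes.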


2010 ◽  
Vol 58 (1) ◽  
pp. 183-195 ◽  
Author(s):  
S. Amari ◽  
A. Cichocki

Information geometry of divergence functions

Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of a distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences. It is unique, sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
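The membership of the Kullback-Leibler divergence in both classes is easy to verify numerically: the f-divergence with f(t) = t log t and the Bregman divergence generated by negative entropy both reduce to KL. A minimal sketch for strictly positive probability vectors:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence between discrete distributions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def f_divergence(p, q, f):
    # D_f(p || q) = sum_i q_i * f(p_i / q_i)
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

def bregman_negentropy(p, q):
    # Bregman divergence B_phi(p, q) with generator phi(p) = sum_i p_i log p_i:
    # phi(p) - phi(q) - <grad phi(q), p - q>
    return sum(pi * math.log(pi) - qi * math.log(qi)
               - (math.log(qi) + 1.0) * (pi - qi)
               for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
d_kl = kl(p, q)
```

Because both vectors sum to one, the linear term of the Bregman divergence cancels and the two constructions agree exactly.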


Holzforschung ◽  
2020 ◽  
Vol 74 (11) ◽  
pp. 1011-1020
Author(s):  
Danyang Tong ◽  
Susan Alexis Brown ◽  
David Corr ◽  
Gianluca Cusatis

Abstract Rising global emissions have led to a renewed popularity of timber in building design, including timber-concrete tall buildings up to 18 stories. Despite this surge in wood construction, there remains a gap in the understanding of long-term structural behavior, particularly wood creep. Unlike those for concrete, code prescriptions for wood design lack robust estimates of structural shortening. Models for wood creep have become increasingly necessary due to the potential for unforeseen shortening, especially differential shortening, whose effects can have serious impacts as timber building heights continue to grow. This study lays the groundwork for wood compliance prediction models for use in timber design. A thorough review of wood creep studies was conducted, and viable experimental results were compiled into a database; studies were chosen based on how closely their experimental conditions matched a realistic building environment. An unbiased parameter identification method, originally applied to concrete prediction models, was used to fit multiple compliance functions to each data curve. Based on the individual curve fittings, statistical analysis was performed to determine the best-fit function and average parameter values for the collective database. The results confirmed a power-law trend in wood creep, with lognormally distributed parameters.
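A power-law compliance of the kind the study confirms can be fitted per curve by ordinary least squares in log-log space. The sketch below uses synthetic creep data; the parameter values and noise level are illustrative assumptions, not figures from the paper's database:

```python
import math
import random

def fit_power_law(times, creep):
    # Fit the creep compliance J_c(t) = A * t**n by linear regression
    # on log J_c = log A + n * log t
    xs = [math.log(t) for t in times]
    ys = [math.log(j) for j in creep]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n_hat = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - n_hat * mx), n_hat   # (A, n)

random.seed(1)
times = [1, 2, 5, 10, 20, 50, 100, 200]        # days under sustained load
A_true, n_true = 0.04, 0.25                    # assumed "true" parameters
creep = [A_true * t ** n_true * math.exp(random.gauss(0.0, 0.02))
         for t in times]                       # multiplicative (lognormal) noise
A_hat, n_hat = fit_power_law(times, creep)
```

Fitting each database curve this way and examining the spread of the fitted (A, n) pairs is how a lognormal parameter distribution would be observed.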


Symmetry ◽  
2018 ◽  
Vol 10 (7) ◽  
pp. 286
Author(s):  
V. García ◽  
M. Martel-Escobar ◽  
F. Vázquez-Polo

This paper describes a complementary tool for fitting probabilistic distributions in data analysis. First, we examine the well known bivariate index of skewness and the aggregate skewness function, and then introduce orderings of the skewness of probability distributions. Using an example, we highlight the advantages of this approach and then present results for these orderings in common uniparametric families of continuous distributions, showing that the orderings are well suited to the intuitive conception of skewness and, moreover, that the skewness can be controlled via the parameter values.
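The claim that skewness can be controlled via the parameter values is easy to check empirically for a uniparametric family: the gamma family Gamma(k) has moment skewness 2/√k, so a larger shape parameter gives strictly smaller skewness. This is a hedged illustration of parameter-controlled skewness, not the paper's orderings:

```python
import math
import random

def sample_skewness(xs):
    # Standardized third central moment (Pearson moment skewness)
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

random.seed(2)

def gamma_draws(shape, size=20000):
    return [random.gammavariate(shape, 1.0) for _ in range(size)]

skew_k2 = sample_skewness(gamma_draws(2.0))   # theory: 2/sqrt(2) ~ 1.41
skew_k8 = sample_skewness(gamma_draws(8.0))   # theory: 2/sqrt(8) ~ 0.71
```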


2020 ◽  
Vol 1 (4) ◽  
pp. 229-238
Author(s):  
Devi Munandar ◽  
Sudradjat Supian ◽  
Subiyanto Subiyanto

The influence of social media in disseminating information, especially during the COVID-19 pandemic, can be observed over time intervals, so that the probability of the number of tweets posted by netizens on social media can be modeled. The nonhomogeneous Poisson process (NHPP) is a Poisson process whose rate parameter depends on time, with exponentially distributed inter-arrival times that have unequal parameters and are independent of each other. The probability of no event occurring in the initial state is one, and the probability of an event occurring in the initial state is zero. This paper uses the NHPP to predict and count the number of tweet posts containing the keywords coronavirus and COVID-19 over set daily time intervals. Tweet posts in one interval do not affect those in the next, and the numbers of tweets are unequal. The dataset used in this study consists of COVID-19 tweets crawled three times a day for 20 minutes each over 13 days, giving 39 time intervals. The study produced predictions and probabilities for the number of tweets, reflecting the tendency of netizens to post about the COVID-19 pandemic.
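For an NHPP with intensity λ(t), the count over an interval (t₁, t₂] is Poisson with mean m(t₂) − m(t₁), where m is the integrated intensity. A minimal sketch with an assumed linear intensity (the rate values are illustrative, not fitted to the paper's tweet data):

```python
import math

def mean_function(t, a=5.0, b=0.8):
    # m(t) = integral from 0 to t of lambda(s) ds, with lambda(s) = a + b * s
    return a * t + 0.5 * b * t * t

def nhpp_pmf(k, t1, t2, a=5.0, b=0.8):
    # P{ N(t2) - N(t1) = k } for a nonhomogeneous Poisson process
    mu = mean_function(t2, a, b) - mean_function(t1, a, b)
    return math.exp(-mu) * mu ** k / math.factorial(k)

# The initial-state conditions from the abstract hold by construction:
# zero events at t = 0 has probability one, any event at t = 0 probability zero.
p_zero_at_start = nhpp_pmf(0, 0.0, 0.0)
```

Counts over disjoint intervals are independent, which matches the statement that posts in one interval do not affect the next.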


2015 ◽  
Vol 3 (1) ◽  
pp. 1
Author(s):  
Niklas Andersson ◽  
Per-Ola Larsson ◽  
Johan Åkesson ◽  
Niclas Carlsson ◽  
Staffan Skålén ◽  
...  

A polyethylene plant at Borealis AB is modelled in the Modelica language and considered for parameter estimation at grade transitions. Parameters have been estimated for both the steady-state and the dynamic case using the JModelica.org platform, which offers tools for steady-state parameter estimation and supports simulation with parameter sensitivities. The model contains 31 candidate parameters, giving a huge number of possible parameter combinations. The best parameter sets have been chosen using a parameter-selection algorithm that screens out parameter sets with poor numerical properties, reducing the number of parameter sets that must be explored. The steady-state case differs from the dynamic case with respect to parameter selection. Validation of the parameter estimates in the dynamic case shows a significant reduction, relative to the nominal reference with nominal parameter values, in the objective value used to evaluate solution quality.
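The idea behind such a parameter-selection step can be sketched with a collinearity measure on sensitivity columns: parameter pairs whose sensitivities are nearly linearly dependent are numerically poor choices. The parameter names and sensitivity values below are invented for illustration; the paper's actual algorithm and its 31 candidates are not reproduced:

```python
import itertools

# Hypothetical sensitivity columns d(output)/d(parameter), one per candidate
S = {
    "k_p":  [1.0, 0.9, 0.8, 0.7],
    "tau":  [0.5, 0.45, 0.4, 0.35],   # exactly collinear with k_p
    "E_a":  [0.1, -0.2, 0.3, -0.4],
    "c_in": [0.0, 0.3, -0.3, 0.6],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def identifiability(pair):
    # Normalized 2x2 Gram determinant, in [0, 1]:
    # near 0 means the pair is practically unidentifiable together
    a, b = S[pair[0]], S[pair[1]]
    return 1.0 - dot(a, b) ** 2 / (dot(a, a) * dot(b, b))

# Rank all parameter pairs, best-conditioned first
ranked = sorted(itertools.combinations(S, 2), key=identifiability, reverse=True)
```

A screening algorithm would discard pairs (and larger subsets) whose score falls below a threshold before any expensive estimation run.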


Aerospace ◽  
2020 ◽  
Vol 7 (10) ◽  
pp. 144 ◽  
Author(s):  
Martin Lindner ◽  
Judith Rosenow ◽  
Thomas Zeh ◽  
Hartmut Fricke

Today, each flight is filed as a static route no later than one hour before departure. From then on, changes to the lateral route initiated by the pilot are possible only with air traffic control clearance and remain the exception. Thus, the initially optimized trajectory of the flight plan is flown, although the optimization may already be based on outdated weather data at take-off. Global weather data such as those modeled by the Global Forecast System do, however, contain information about their own forecast uncertainty, which can be quantified by considering so-called ensemble forecast data. In this study, the variability in these weather parameter uncertainties is analyzed before the trajectory optimization model TOMATO is applied to single trajectories considering the previously quantified uncertainties. Based on the set of input data provided by the ensembles, TOMATO generates a 3D corridor encasing all resulting optimized trajectories. Assuming that this corridor is filed in addition to the initial flight plan, the optimum trajectory can be updated even during flight, as soon as updated weather forecasts are available. In return, and as a compromise, flights would have to stay within the corridor, providing planning stability for Air Traffic Management compared to fully free in-flight optimization. Although the corridor restricts the re-optimized trajectory, fuel savings of up to 1.1% compared to the initially filed flight could be shown.


Author(s):  
Anne L. Marsan ◽  
Yifan Chen ◽  
Paul Stewart

Abstract Direct surface manipulation (DSM) allows a designer to add a raised or indented feature to an existing NURBS or finite element surface. The user bounds the feature with a closed curve, and indicates an influence center that represents either the highest or lowest area of the feature. As we move radially outward from the influence center to the boundary curve, the magnitude of displacement is scaled gradually by a 1D parametric cubic basis function whose values range from 0 to 1. In this paper we present a new technique for assigning parameter values in the radial direction, i.e. u, to points within a DSM feature. The new technique poses parameter distribution as a Dirichlet problem and uses a finite element method to solve for u(x,y). The new method overcomes some stringent geometric conditions inherited from a fundamentally geometric-based reparameterization scheme and allows us to work with non-star-shaped and multiply-connected DSM features. Thus it allows us to apply this surface feature technique to a wider variety of surface applications.
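A minimal version of the Dirichlet problem the paper poses can be sketched with finite differences: Laplace's equation on a grid, with u = 1 on the feature boundary and u = 0 at the influence center, relaxed by Gauss-Seidel iteration. This is a structured-grid toy, not the paper's finite element solver on arbitrary feature shapes:

```python
# Gauss-Seidel relaxation of Laplace's equation on an N x N grid:
# u = 1 on the outer boundary (standing in for the feature boundary curve),
# u = 0 at the grid midpoint (standing in for the influence center).
N = 21
u = [[1.0 if i in (0, N - 1) or j in (0, N - 1) else 0.5
      for j in range(N)] for i in range(N)]
c = N // 2
for _ in range(2000):
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            if (i, j) == (c, c):
                u[i][j] = 0.0          # influence-center constraint
            else:
                # harmonic interior point: average of the four neighbours
                u[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                  + u[i][j - 1] + u[i][j + 1])
```

Each interior point then carries a radial parameter u in [0, 1] varying smoothly from center to boundary, which is what makes the approach indifferent to whether the region is star-shaped or multiply connected.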


Author(s):  
Yanjun Zhang ◽  
Tingting Xia ◽  
Mian Li

Abstract Various types of uncertainty, such as parameter uncertainty, model uncertainty, and metamodeling uncertainty, may lead to low robustness. Parameter uncertainty can be either epistemic or aleatory in physical systems, and the two kinds have been widely represented by intervals and probability distributions, respectively. Model uncertainty is formally defined as the difference between the true value of the real-world process and the code output of the simulation model at the same value of the inputs. Metamodeling uncertainty is additionally introduced by the use of metamodels. To reduce the effects of uncertainties, robust optimization (RO) algorithms have been developed to obtain solutions that are not only optimal but also less sensitive to uncertainties. Based on how parameter uncertainty is modeled, RO approaches fall into two categories: interval-based and probability-based. In real-world engineering problems, interval and probabilistic parameter uncertainties are likely to exist simultaneously in a single problem, yet few works have considered mixed interval and probabilistic parameter uncertainties together with other types of uncertainty. In this work, a general RO framework is proposed to deal with mixed interval and probabilistic parameter uncertainties, model uncertainty, and metamodeling uncertainty simultaneously in design optimization problems, using intervals-of-statistics approaches. Considering multiple types of uncertainty improves the robustness of optimal designs and reduces the risk of inappropriate decision-making and of low robustness and reliability in engineering design. Two test examples demonstrate the applicability and effectiveness of the proposed RO approach.
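The mixed treatment can be sketched in its simplest form: average the objective over samples of the probabilistic parameter and take the worst case over the endpoints of the interval parameter. The cost function and uncertainty models below are invented for illustration; the paper's intervals-of-statistics machinery is not reproduced:

```python
import random

random.seed(3)

def cost(x, p, q):
    # Toy performance function: p is a probabilistic parameter,
    # q an interval parameter
    return (x - p) ** 2 + q * x

def robust_objective(x, p_samples, q_lo, q_hi):
    # Expectation over the probabilistic uncertainty, worst case over the
    # interval uncertainty (the cost is linear in q, so the maximum is
    # attained at one of the two endpoints)
    return max(
        sum(cost(x, p, q) for p in p_samples) / len(p_samples)
        for q in (q_lo, q_hi)
    )

p_samples = [random.gauss(1.0, 0.2) for _ in range(500)]   # aleatory: N(1, 0.2)
xs = [i / 100 for i in range(-100, 301)]                   # crude design grid
x_star = min(xs, key=lambda x: robust_objective(x, p_samples, -0.5, 0.5))
```

The robust optimum shifts away from the nominal minimizer x = 1 to roughly x = 0.75, because the worst-case interval term penalizes large |x|: the solution trades some nominal performance for insensitivity.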


2013 ◽  
Vol 6 (1) ◽  
pp. 585-623 ◽  
Author(s):  
D. D. Lucas ◽  
R. Klein ◽  
J. Tannahill ◽  
D. Ivanova ◽  
S. Brandon ◽  
...  

Abstract. Simulations using IPCC-class climate models may fail or crash for a variety of reasons, and quantitative analysis of the failures can yield useful insights for understanding and improving the models. During uncertainty quantification (UQ) ensemble simulations assessing the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at certain combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis: combinations of 8 parameters related to ocean mixing and viscosity, drawn from three different POP2 parameterizations, are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
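The headline metric can be reproduced in a few lines: the ROC AUC equals the probability that a randomly chosen failed run scores above a randomly chosen successful one. Below is a rank-statistic sketch on synthetic data; the paper's SVM committee and the 18 POP2 parameters are not reproduced:

```python
import random

def roc_auc(labels, scores):
    # AUC as the Wilcoxon-Mann-Whitney rank statistic: the fraction of
    # (failure, success) pairs ranked correctly, with ties counting half
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

random.seed(4)
# Toy ensemble: runs "crash" (label 1) when a viscosity-like parameter
# exceeds a threshold, blurred by noise
params = [random.random() for _ in range(400)]
labels = [1 if p + random.gauss(0.0, 0.05) > 0.7 else 0 for p in params]
auc = roc_auc(labels, params)
```

An AUC near 1 on held-out runs, as in the paper's validation ensemble, indicates that the classifier's score separates failing from succeeding parameter combinations almost perfectly.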

