Theory of Functional Connections Applied to Linear ODEs Subject to Integral Constraints and Linear Ordinary Integro-Differential Equations

2021 ◽  
Vol 26 (3) ◽  
pp. 65
Author(s):  
Mario De Florio ◽  
Enrico Schiassi ◽  
Andrea D’Ambrosio ◽  
Daniele Mortari ◽  
Roberto Furfaro

This study shows how the Theory of Functional Connections (TFC) yields fast and highly accurate solutions to linear ODEs involving integrals. The integrals can appear as constraints and/or as terms of the differential equation (e.g., in ordinary integro-differential equations). The study first summarizes TFC, a mathematical procedure for obtaining constrained expressions: functionals representing all functions that satisfy a set of linear constraints. These functionals contain a free function, g(x), representing the unknown function to optimize. Two numerical approaches for estimating g(x) are presented. The first models g(x) as a linear combination of basis functions, such as Chebyshev or Legendre orthogonal polynomials, while the second models g(x) as a neural network. Representative problems are provided; in all of them, the proposed method produces very fast and accurate solutions.
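The mechanism behind a constrained expression for an integral constraint can be sketched directly. The following minimal illustration is ours, not taken from the paper (the additive form shown and the helper name `constrained_expression` are for exposition): for any free function g(x), adding the constant c − ∫₀¹ g dt produces a function whose integral over [0, 1] equals c.

```python
import numpy as np

def trapezoid(f, t):
    """Composite trapezoidal rule for samples f on nodes t."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

def constrained_expression(g, c, n=2001):
    """y(x) = g(x) + (c - ∫_0^1 g dt) satisfies ∫_0^1 y dx = c for ANY g."""
    t = np.linspace(0.0, 1.0, n)
    integral_g = trapezoid(g(t), t)
    return lambda x: g(x) + (c - integral_g)

# The constraint holds regardless of the choice of free function:
x = np.linspace(0.0, 1.0, 2001)
integrals = []
for g in (np.sin, np.exp, lambda s: s**3 - 5*s):
    y = constrained_expression(g, c=2.0)
    integrals.append(trapezoid(y(x), x))
print(integrals)   # each ≈ 2.0
```

Because the constraint is satisfied identically, an optimizer is free to search over g(x) (polynomial coefficients or network weights) using only the differential-equation residual.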

2020 ◽  
Vol 2 (1) ◽  
pp. 37-55 ◽  
Author(s):  
Carl Leake ◽  
Daniele Mortari

This article presents a new methodology called Deep Theory of Functional Connections (TFC) that estimates the solutions of partial differential equations (PDEs) by combining neural networks with the TFC. The TFC is used to transform PDEs into unconstrained optimization problems by analytically embedding the PDE’s constraints into a “constrained expression” containing a free function. In this research, the free function is chosen to be a neural network, which is used to solve the resulting unconstrained optimization problem. The optimization consists of minimizing a loss function chosen to be the square of the residual of the PDE, and the neural network is trained in an unsupervised manner to minimize this loss. This methodology differs in two major ways from popular methods used to estimate the solutions of PDEs. First, it does not need to discretize the domain into a grid; instead, it can randomly sample points from the domain during the training phase. Second, after training, it produces an accurate analytical approximation of the solution throughout the entire training domain. Because the methodology produces an analytical solution, it is straightforward to evaluate the solution at any point within the domain and to perform further manipulation, such as differentiation, if needed. In contrast, other popular methods require extra numerical techniques if the estimated solution is desired at points that do not lie on the discretized grid, or if further manipulation of the estimated solution must be performed.
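The constraint-embedding step can be illustrated in the simplest case, zero Dirichlet data on the unit square (the multiplier x(1−x)y(1−y) is one elementary choice for this geometry, not the paper's general construction): the expression vanishes on the whole boundary for any free function, so a neural-network g can be trained on the PDE residual alone, with no boundary penalty term.

```python
import numpy as np

# For zero Dirichlet conditions on the unit square, the constrained expression
#   u(x, y) = x(1-x) * y(1-y) * g(x, y)
# is zero on all four edges for ANY free function g.

def constrained_expression(g):
    return lambda x, y: x*(1-x) * y*(1-y) * g(x, y)

# Check with an arbitrary "free function" standing in for a neural network:
g = lambda x, y: np.exp(x) * np.cos(3*y) + x*y
u = constrained_expression(g)

s = np.linspace(0.0, 1.0, 101)
edges = [u(s, 0*s), u(s, 0*s + 1), u(0*s, s), u(0*s + 1, s)]
for edge in edges:
    print(np.max(np.abs(edge)))            # 0.0 on every edge
```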


Mathematics ◽  
2020 ◽  
Vol 8 (8) ◽  
pp. 1303 ◽  
Author(s):  
Carl Leake ◽  
Hunter Johnston ◽  
Daniele Mortari

This article presents a reformulation of the Theory of Functional Connections: a general methodology for functional interpolation that can embed a set of user-specified linear constraints. The reformulation exploits the underlying functional structure presented in the seminal paper on the Theory of Functional Connections to ease the derivation of these interpolating functionals, called constrained expressions, and provides rigorous terminology that lends itself to straightforward derivations of mathematical proofs regarding the properties of these constrained expressions. Furthermore, the technique, along with its proofs, extends immediately to n dimensions through a recursive application of the univariate formulation. The results of this reformulation are compared to prior work to highlight the novelty and mathematical convenience of the approach. Finally, the methodology is applied to two partial differential equations with different boundary conditions, and, where data are available, the results are compared to state-of-the-art methods.
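In the reformulation's terms, a constrained expression pairs each constraint with a switching function. A small univariate sketch (our own example; the constraint values and the choice of switching functions 1 and x are illustrative) for the mixed constraints y(0) = 1 and y'(2) = 3:

```python
import numpy as np

# Constraints: y(0) = 1 and y'(2) = 3. One constrained expression is
#   y(x) = g(x) + s1(x)*(1 - g(0)) + s2(x)*(3 - g'(2))
# with switching functions s1(x) = 1 and s2(x) = x, chosen so that
#   s1(0) = 1, s1'(2) = 0   and   s2(0) = 0, s2'(2) = 1.

g  = np.sin          # arbitrary free function
dg = np.cos          # its derivative

def y(x):
    return g(x) + 1.0*(1.0 - g(0.0)) + x*(3.0 - dg(2.0))

def dy(x):
    return dg(x) + (3.0 - dg(2.0))

print(y(0.0))    # 1.0  (value constraint met exactly)
print(dy(2.0))   # 3.0  (derivative constraint met exactly)
```

Swapping in a different g changes the interior behavior but never the constraint values, which is what makes the subsequent optimization unconstrained.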


Mathematics ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 397 ◽  
Author(s):  
Hunter Johnston ◽  
Carl Leake ◽  
Daniele Mortari

This paper shows how to obtain highly accurate solutions of eighth-order boundary-value problems for linear and nonlinear ordinary differential equations. The presented method is based on the Theory of Functional Connections and proceeds in two steps. First, the Theory of Functional Connections analytically embeds the differential equation constraints into a candidate function (called a constrained expression) containing a function that the user is free to choose. This expression satisfies the constraints no matter what the free function is. Second, the free function is expanded as a linear combination of orthogonal basis functions with unknown coefficients. The constrained expression and its derivatives are then substituted into the eighth-order differential equation, transforming the problem into an unconstrained optimization problem in which the coefficients of the linear combination are the optimization parameters. These parameters are then found by linear/nonlinear least squares. The solution obtained from this method is a highly accurate analytical approximation of the true solution. Comparisons with alternative methods appearing in the literature validate the proposed approach.
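The two-step procedure scales down to lower orders unchanged, so a second-order sketch conveys the mechanics (this is our simplification: the eighth-order case carries more constraint terms and higher derivatives). The test problem y'' = −y with y(0) = 0, y(1) = sin 1, whose exact solution is sin x, is our own choice.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Constrained expression: y(x) = g(x) + (1-x)(y0 - g(0)) + x(y1 - g(1)),
# which meets both boundary values for any free function g and has y'' = g''.
n, m = 12, 60                       # basis functions, collocation points
x = np.linspace(0.0, 1.0, m)
z = 2*x - 1                         # map [0,1] onto the Chebyshev domain [-1,1]
y0, y1 = 0.0, np.sin(1.0)

# The residual y'' + y is linear in the Chebyshev coefficients of g: build A a = b.
A = np.zeros((m, n))
for k in range(n):
    c = np.zeros(n); c[k] = 1.0
    Tk   = C.chebval(z, c)
    Tkpp = C.chebval(z, C.chebder(c, 2))            # d^2/dz^2
    Tk0, Tk1 = C.chebval(-1.0, c), C.chebval(1.0, c)
    A[:, k] = 4.0*Tkpp + Tk - (1-x)*Tk0 - x*Tk1     # 4 = (dz/dx)^2
b = -((1-x)*y0 + x*y1)
a, *_ = np.linalg.lstsq(A, b, rcond=None)

def y(xq):
    xq = np.asarray(xq)
    g, g0, g1 = C.chebval(2*xq - 1, a), C.chebval(-1.0, a), C.chebval(1.0, a)
    return g + (1-xq)*(y0 - g0) + xq*(y1 - g1)

err = np.max(np.abs(y(x) - np.sin(x)))
print(err)        # near machine precision for this smooth problem
```

Note that the boundary values are met exactly by construction, independent of the least-squares accuracy.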


2021 ◽  
Vol 5 (4) ◽  
pp. 219
Author(s):  
Somayeh Nemati ◽  
Pedro M. Lima ◽  
Delfim F. M. Torres

We introduce a new numerical method, based on Bernoulli polynomials, for solving multiterm variable-order fractional differential equations. The variable-order fractional derivative is considered in the Caputo sense, while the Riemann–Liouville integral operator is used to obtain approximations of the unknown function and its variable-order derivatives. An operational matrix of variable-order fractional integration is introduced for the Bernoulli functions. Assuming that the solution of the problem is sufficiently smooth, we approximate a given order of its derivative using Bernoulli polynomials and then use the operational matrix to approximate the unknown function and its derivatives. With these approximations and a set of collocation points, the problem reduces to a system of nonlinear algebraic equations. An error estimate is given for the approximate solution obtained by the proposed method. Finally, five illustrative examples demonstrate the applicability and high accuracy of the technique; comparisons with existing methods in the literature make the novelty of the work clear. The numerical results show that the new method is efficient, giving high-accuracy approximate solutions even with a small number of basis functions and even when the solution of the problem is not infinitely differentiable, providing better results with fewer basis functions than state-of-the-art methods.
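The operational-matrix idea can be shown in its classical integer-order form (the paper's variable-order fractional matrix generalizes this construction; the code below is our sketch, built from the standard properties B_n' = nB_{n−1} and ∫₀¹ B_n dx = 0):

```python
import numpy as np
from numpy.polynomial import polynomial as P

def bernoulli_polys(N):
    """Coefficient arrays of B_0..B_{N-1}, via B_n' = n*B_{n-1}, ∫_0^1 B_n dx = 0."""
    polys = [np.array([1.0])]
    for n in range(1, N):
        p = n * P.polyint(polys[-1])                 # antiderivative, constant = 0
        q = P.polyint(p)
        shift = P.polyval(1.0, q) - P.polyval(0.0, q)
        polys.append(P.polyadd(p, [-shift]))         # fix constant so ∫_0^1 B_n = 0
    return polys

def integration_matrix(N):
    """Matrix M with ∫_0^x B_n(t) dt = (M @ B(x))_n for n < N-1 (last row truncated).

    Uses ∫_0^x B_n dt = (B_{n+1}(x) - B_{n+1}(0)) / (n+1) and B_0 = 1.
    """
    polys = bernoulli_polys(N + 1)
    M = np.zeros((N, N))
    for n in range(N - 1):
        M[n, n + 1] = 1.0 / (n + 1)
        M[n, 0] = -P.polyval(0.0, polys[n + 1]) / (n + 1)
    return M

# Spot check at x = 0.3: compare M @ B(x) with the exact antiderivatives.
N, x0 = 5, 0.3
polys = bernoulli_polys(N)
Bx = np.array([P.polyval(x0, p) for p in polys])
approx = integration_matrix(N) @ Bx
exact = np.array([P.polyval(x0, P.polyint(p)) for p in polys])
print(np.max(np.abs(approx[:N-1] - exact[:N-1])))   # ~0 (exact up to rounding)
```

Such a matrix turns integration of a truncated Bernoulli expansion into a matrix-vector product, which is what reduces the differential equation to algebraic equations at the collocation points.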


2015 ◽  
Vol 2015 ◽  
pp. 1-12 ◽  
Author(s):  
Haidong Qu ◽  
Xuan Liu

We present a new method for solving initial value problems for fractional differential equations using neural networks constructed from cosine basis functions with adjustable parameters. By training the neural networks repeatedly, numerical solutions of the fractional differential equations are obtained. Moreover, the technique remains applicable to coupled differential equations of fractional order. Plots and numerical solutions show that the proposed method is very effective.
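A feel for the approach can be had in the simplest integer-order setting. This is our analog, not the paper's algorithm: the paper treats fractional derivatives and trains iteratively, whereas here the trial solution is linear in the weights, so the fit reduces to one least-squares solve.

```python
import numpy as np

# Solve y' = -y, y(0) = 1 on [0, 1] with the trial solution
#   y(x) = 1 + x * N(x),  N(x) = sum_k w_k cos(k x),
# a one-layer cosine-basis "network". The initial condition holds for any
# weights; the residual y' + y is linear in w.

K = 8
x = np.linspace(0.0, 1.0, 200)
k = np.arange(K)

cos_kx = np.cos(np.outer(x, k))
sin_kx = np.sin(np.outer(x, k))
# residual columns: d/dx[x cos(kx)] + x cos(kx)
A = cos_kx - k * x[:, None] * sin_kx + x[:, None] * cos_kx
w, *_ = np.linalg.lstsq(A, -np.ones_like(x), rcond=None)

y = 1.0 + x * (cos_kx @ w)
print(np.max(np.abs(y - np.exp(-x))))     # small uniform error vs. exact e^{-x}
```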


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Idris Kharroubi ◽  
Thomas Lim ◽  
Xavier Warin

We study the approximation of backward stochastic differential equations (BSDEs for short) with a constraint on the gains process. We first discretize the constraint by applying a so-called facelift operator at the times of a grid. We show that this discretely constrained BSDE converges to the continuously constrained one as the mesh of the grid goes to zero. We then focus on the approximation of the discretely constrained BSDE, for which we adopt a machine learning approach. We show that the facelift can be approximated by an optimization problem over a class of neural networks under constraints on the neural network and its derivative. We then derive an algorithm converging to the discretely constrained BSDE as the number of neurons goes to infinity. We end with numerical experiments.


2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
N. H. Sweilam ◽  
M. M. Khader ◽  
W. Y. Kota

A numerical method for solving fourth-order integro-differential equations is presented. The method is based on replacing the unknown function by a truncated series of shifted Chebyshev polynomials. An approximate formula for the integer-order derivative is introduced. By means of collocation points, the method converts the proposed equation into a system of algebraic equations in the shifted Chebyshev coefficients; solving this system yields those coefficients. Special attention is given to the convergence analysis, and an upper bound on the error of the presented approximate formula is derived. Numerical experiments illustrate the usefulness, efficiency, and accuracy of the present work.
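The collocation mechanics can be shown on a stripped-down problem (our simplified example, not the paper's fourth-order equation): the second-kind Volterra equation y(x) − ∫₀ˣ y(t) dt = 1 on [0, 1], whose exact solution is eˣ, expanded in shifted Chebyshev polynomials T*_k(x) = T_k(2x − 1).

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Solve  y(x) - ∫_0^x y(t) dt = 1  by shifted-Chebyshev collocation.
n, m = 12, 40                               # basis size, collocation points
x = np.linspace(0.0, 1.0, m)
z = 2*x - 1                                 # map [0, 1] -> [-1, 1]

A = np.zeros((m, n))
for k in range(n):
    c = np.zeros(n); c[k] = 1.0
    ci = C.chebint(c)                       # antiderivative in z; dt = dz/2
    A[:, k] = C.chebval(z, c) - 0.5*(C.chebval(z, ci) - C.chebval(-1.0, ci))
coef, *_ = np.linalg.lstsq(A, np.ones(m), rcond=None)

y = C.chebval(z, coef)
print(np.max(np.abs(y - np.exp(x))))        # spectral accuracy for smooth solutions
```

The same pattern, with derivative and integral operators both expressed on the Chebyshev coefficients, extends to integro-differential equations of higher order.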

