High Performance Computing for the Reduced Basis Method. Application to Natural Convection

2013 ◽  
Vol 43 ◽  
pp. 255-273 ◽  
Author(s):  
E. Schenone ◽  
S. Veys ◽  
C. Prud’homme

Author(s):  
G. Rozza ◽  
C. N. Nguyen ◽  
A. T. Patera ◽  
S. Deparis

This paper focuses on the parametric study of steady and unsteady forced and natural convection problems by the certified reduced basis method. These problems are characterized by an input-output relationship in which, given an input parameter vector — material properties, boundary conditions and sources, and geometry — we would like to compute certain outputs of engineering interest — heat fluxes and average temperatures. The certified reduced basis method provides both (i) a very inexpensive yet accurate output prediction, and (ii) a rigorous bound for the error in the reduced basis prediction relative to an underlying expensive high-fidelity finite element discretization. The feasibility and efficiency of the method are demonstrated for three convection model problems: a scalar steady forced convection problem in a rectangular channel is characterized by two parameters — the Péclet number and the aspect ratio of the channel — and an output — the average temperature over the domain; a steady natural convection problem in a laterally heated cavity is characterized by three parameters — the Grashof and Prandtl numbers and the aspect ratio of the cavity — and an output — the inverse of the Nusselt number; and an unsteady natural convection problem in a laterally heated cavity is characterized by two parameters — the Grashof and Prandtl numbers — and a time-dependent output — the average of the horizontal velocity over a specified area of the cavity.
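The offline/online structure behind the certified reduced basis method — compress high-fidelity snapshots into a small basis offline, then run cheap online solves with a rigorous residual-based error bound — can be sketched on a toy parametrized problem. Everything below is a hypothetical NumPy illustration, not the paper's solver: the matrices, the training parameters, and the coercivity lower bound are assumptions chosen for simplicity.

```python
import numpy as np

# Toy "truth" discretization: 1-D diffusion with a parametrized reaction term,
#   A(mu) = A0 + mu * A1   (affine in the parameter mu >= 0),
# and a scalar output s(mu) = L^T u(mu). Purely illustrative.
n = 50
h = 1.0 / (n + 1)
A0 = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
      - np.diag(np.ones(n - 1), -1)) / h          # stiffness (diffusion)
A1 = h * np.eye(n)                                # lumped mass (reaction)
f = h * np.ones(n)                                # constant source
L = f.copy()                                      # "compliant" output functional

def truth_solve(mu):
    """Expensive high-fidelity solve: the reference the RB error is measured against."""
    return np.linalg.solve(A0 + mu * A1, f)

# --- Offline stage: snapshots at training parameters -> orthonormal basis V ---
train_mus = [0.1, 1.0, 10.0]
snapshots = np.column_stack([truth_solve(mu) for mu in train_mus])
V, _ = np.linalg.qr(snapshots)                    # N = 3 reduced basis functions

# Project the affine pieces once; online solves then cost O(N^3), independent of n.
A0N, A1N = V.T @ A0 @ V, V.T @ A1 @ V
fN, LN = V.T @ f, V.T @ L
alpha_lb = np.linalg.eigvalsh(A0).min()           # coercivity lower bound for mu >= 0

def rb_solve(mu):
    """Online stage: reduced solve plus a rigorous residual-based error bound."""
    uN = np.linalg.solve(A0N + mu * A1N, fN)
    residual = f - (A0 + mu * A1) @ (V @ uN)      # (offline/online splitting omitted)
    bound = np.linalg.norm(residual) / alpha_lb   # certifies ||u - V uN|| <= bound
    return uN, LN @ uN, bound

mu_test = 3.0
uN, s_rb, bound = rb_solve(mu_test)
u = truth_solve(mu_test)
print(f"output (RB)    : {s_rb:.6e}")
print(f"output (truth) : {L @ u:.6e}")
print(f"true error {np.linalg.norm(u - V @ uN):.2e} <= certified bound {bound:.2e}")
```

Because A(mu) is symmetric positive definite for mu >= 0, the Euclidean-norm bound ||u − V uN|| ≤ ||r(mu)|| / λ_min(A0) is rigorous here; the actual method works in problem-dependent energy norms and evaluates the residual norm through an offline/online decomposition so that no online operation ever touches the truth dimension n.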


MRS Bulletin ◽  
1997 ◽  
Vol 22 (10) ◽  
pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have left scientists and the general public concerned about a crisis, or a lack of leadership, in the field. That concern is understandable considering the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s—Thinking Machines and Kendall Square Research—were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later.

During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten if they were to run on the next-generation, highly parallel architecture. Scientists who are not yet involved in high-performance computing are understandably hesitant about committing their time and energy to such an apparently unstable enterprise.

However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, leading to five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.

