High-Performance Integrated Hydrodynamic Modelling of Storm-Induced Floods at a Catchment Scale

10.29007/tt1x ◽  
2018 ◽  
Author(s):  
Xilin Xia ◽  
Qiuhua Liang ◽  
Xiaodong Ming

Flooding is one of the most common types of natural hazards. The current practice of large-scale fluvial flood modelling relies on hydrological models to predict upstream discharge hydrographs that drive inundation modelling downstream. However, the oversimplified representation of both catchment topography and hydraulics makes hydrological models rely heavily on parameterisation and calibration. This makes the strategy unsuitable for predicting extreme events featuring highly transient hydraulic processes, for which the high-quality hydrological data needed to support parameterisation and calibration are commonly missing. In this paper, the High-Performance Integrated hydrodynamic Modelling System (HiPIMS) has been adapted and applied to the whole 2500 km² Eden catchment in the UK to reproduce the flood event caused by Storm Desmond in December 2015. Without intensive calibration or the use of hydrographs as boundary conditions, satisfactory results have been obtained for both inundation extent and in-channel water level time series, in comparison with observations. Accelerated by multiple modern graphics processing units (GPUs), the model runs more than 20 times faster than real time when simulating the whole catchment at 20 m resolution. The results demonstrate that HiPIMS is a promising tool for real-time flood forecasting and flood risk assessment.
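HiPIMS integrates the two-dimensional shallow-water equations with an explicit GPU scheme, so its time step is bounded by a CFL-type stability condition. The sketch below is a generic illustration of that constraint, not HiPIMS code; the wave-speed estimate |u| + |v| + √(gh) and the Courant number of 0.5 are illustrative assumptions.

```python
import math

def cfl_timestep(h, u, v, dx, courant=0.5, g=9.81):
    """Largest stable explicit time step on a uniform grid of spacing dx,
    limited by the fastest gravity-wave signal in any wet cell."""
    max_speed = max(abs(ui) + abs(vi) + math.sqrt(g * hi)
                    for hi, ui, vi in zip(h, u, v) if hi > 0.0)
    return courant * dx / max_speed

# three wet cells on a 20 m grid (depths in m, velocities in m/s)
dt = cfl_timestep(h=[0.5, 2.0, 1.2], u=[0.3, 1.5, 0.8],
                  v=[0.1, 0.5, 0.2], dx=20.0)
```

At 20 m resolution such a limit yields time steps of the order of a second, which is one reason GPU acceleration matters for a 2500 km² domain.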

2018 ◽  
Vol 7 (12) ◽  
pp. 467 ◽  
Author(s):  
Mengyu Ma ◽  
Ye Wu ◽  
Wenze Luo ◽  
Luo Chen ◽  
Jun Li ◽  
...  

Buffer analysis, a fundamental function in geographic information systems (GIS), identifies the area within a given distance of selected geographic features. Real-time buffer analysis for large-scale spatial data remains challenging because the computational cost of conventional data-oriented methods grows rapidly with data volume. In this paper, we introduce HiBuffer, a visualization-oriented model for real-time buffer analysis. An efficient buffer generation method is proposed that introduces spatial indexes and a corresponding query strategy. Buffer results are organized into a tile-pyramid structure to enable stepless zooming. Moreover, a fully optimized hybrid parallel processing architecture is proposed for real-time buffer analysis of large-scale spatial data. Experiments using real-world datasets show that our approach reduces computation time by up to several orders of magnitude while preserving high-quality visualization. Additional experiments analyzing the influence of spatial data density, buffer radius, and request rate on HiBuffer performance demonstrate its adaptability and stability, and parallel scalability tests show that HiBuffer achieves high parallel acceleration. Experimental results verify that HiBuffer is capable of handling 10-million-scale data.
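As a toy stand-in for the index-plus-query idea (the uniform grid bucket below is an illustrative substitute for HiBuffer's actual spatial indexes), a visualization-oriented buffer can answer "is this location inside the buffer?" per query point instead of constructing buffer polygons:

```python
import math
from collections import defaultdict

def build_grid_index(points, cell):
    """Bucket point features into a uniform grid for fast radius queries."""
    index = defaultdict(list)
    for x, y in points:
        index[(int(x // cell), int(y // cell))].append((x, y))
    return index

def in_buffer(index, cell, x, y, radius):
    """Visualization-oriented buffer test: is (x, y) within `radius`
    of any indexed feature?  Only nearby grid cells are scanned."""
    reach = int(math.ceil(radius / cell))
    cx, cy = int(x // cell), int(y // cell)
    for i in range(cx - reach, cx + reach + 1):
        for j in range(cy - reach, cy + reach + 1):
            for px, py in index.get((i, j), ()):
                if (px - x) ** 2 + (py - y) ** 2 <= radius ** 2:
                    return True
    return False

points = [(10.0, 10.0), (50.0, 50.0)]
idx = build_grid_index(points, cell=8.0)
```

Only grid cells within `radius` of the query location are scanned, so the per-query cost stays roughly independent of the total number of features.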


2020 ◽  
Author(s):  
Markus Wiedemann ◽  
Bernhard S.A. Schuberth ◽  
Lorenzo Colli ◽  
Hans-Peter Bunge ◽  
Dieter Kranzlmüller

Precise knowledge of the forces acting at the base of tectonic plates is of fundamental importance, yet models of mantle dynamics are still often qualitative in nature. One particular problem is that we cannot access the deep interior of our planet and therefore cannot make direct in situ measurements of the relevant physical parameters. Fortunately, modern software and powerful high-performance computing infrastructures allow us to generate complex three-dimensional models of the time evolution of mantle flow through large-scale numerical simulations.

In this project, we aim to visualize the resulting convective patterns that occur thousands of kilometres below our feet and to make them "accessible" using high-end virtual reality techniques.

Models with several hundred million grid cells are nowadays possible on modern supercomputing facilities, such as those available at the Leibniz Supercomputing Centre. These models provide quantitative estimates of inaccessible parameters, such as buoyancy and temperature, as well as predictions of the associated gravity field and seismic wavefield that can be tested against Earth observations.

3-D visualizations of the computed physical parameters allow us to inspect the models as if one were actually travelling down into the Earth. In this way, convective processes occurring thousands of kilometres below our feet become virtually accessible by combining the simulations with high-end VR techniques.

The large dataset used here poses severe challenges for real-time visualization: it cannot fit into graphics memory, yet rendering must meet strict deadlines. This raises the necessity of balancing the amount of displayed data against the time needed to render it.

As a solution, we introduce a rendering framework and describe a workflow that allows us to visualize this geoscientific dataset. Our example exceeds 16 TB in size, which is beyond the capabilities of most visualization tools. To display this dataset in real time, we reduce and declutter it through isosurfacing and mesh optimization techniques.

Our rendering framework relies on multithreading and data decoupling mechanisms that allow us to upload data to graphics memory while maintaining high frame rates. The final visualization application can be executed in a CAVE installation as well as on head-mounted displays such as the HTC Vive or Oculus Rift. The latter devices will allow our example to be viewed on-site at the EGU conference.
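The multithreaded decoupling described above can be sketched, in a CPU-only toy, as a background loader feeding a render loop that never blocks on data (queued chunks stand in for graphics-memory uploads; all names and timings are illustrative):

```python
import queue
import threading
import time

def loader(chunks, out_q):
    """Background thread: stream data chunks toward the render loop."""
    for chunk in chunks:
        time.sleep(0.001)            # stand-in for disk or network latency
        out_q.put(chunk)
    out_q.put(None)                  # sentinel: nothing more to load

def render_loop(out_q):
    """Frame loop: poll for new data without ever blocking a frame."""
    resident, frames = [], 0
    while True:
        try:
            chunk = out_q.get_nowait()
            if chunk is None:
                break
            resident.append(chunk)   # stand-in for a graphics-memory upload
        except queue.Empty:
            pass
        frames += 1                  # draw with whatever is resident so far
    return resident, frames

chunks = list(range(5))
q = queue.Queue()
t = threading.Thread(target=loader, args=(chunks, q))
t.start()
resident, frames = render_loop(q)
t.join()
```

The key design choice is the non-blocking `get_nowait`: a slow upload never stalls a frame, it only delays when the new data becomes visible.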


2017 ◽  
Author(s):  
Jannis M. Hoch ◽  
Jeffrey C. Neal ◽  
Fedor Baart ◽  
Rens van Beek ◽  
Hessel C. Winsemius ◽  
...  

Abstract. To increase the representation of physical processes in inundation modelling, current research approaches aim to integrate hydrological and hydrodynamic models. A previous study by Hoch et al. (2017) showed that spatially explicit coupling can outperform stand-alone runs of single-purpose models, as it combines the spatially distributed forcing of hydrological models with the more sophisticated routing schemes of hydrodynamic models. We here present GLOFRIM, a globally applicable computational framework for integrated hydrological-hydrodynamic modelling, designed to facilitate such coupling and to cater for an ensemble of models to be coupled. It currently allows coupling the global hydrological model PCR-GLOBWB with either Delft3D Flexible Mesh (DFM), which solves the full shallow-water equations and allows spatially flexible meshing, or LISFLOOD-FP (LFP), which solves the local inertia equations on regular grids. The main advantages of the framework are its open and free access, global applicability, versatility, and extensibility with other hydrological or hydrodynamic models. Before applying GLOFRIM to an actual test case, we benchmarked both DFM and LFP on a synthetic test case. Results show that for sub-critical flow conditions, the discharge response to the same input signal is near identical for both models, in agreement with previous studies. We subsequently applied the framework to the Amazon River basin to test it thoroughly and, in addition, to perform a first-ever benchmark of flexible and regular grids at the large scale. Both DFM and LFP produce comparable results in terms of simulated discharge, with LFP exhibiting slightly higher accuracy as expressed by a Kling-Gupta efficiency of 0.82 compared to 0.76 for DFM. However, benchmarking inundation extent between DFM and LFP over the entire study area yields a critical success index of 0.46, indicating that the models disagree as often as they agree. Differences between the models in both simulated discharge and inundation extent are to a large extent attributable to the gridding techniques employed. In fact, the results show that the gridding technique can contribute as strongly to deviations in simulated inundation extent as the numerical scheme of the inundation model, since, unlike the global flood model inter-comparison by Trigg et al. (2016), we control for model forcing and boundary conditions. This study shows that the presented computational framework is robust and widely applicable. GLOFRIM is open access and designed to be easily extendable, and we hope that other large-scale hydrological and hydrodynamic models will be added, eventually capturing more locally relevant processes as well as allowing for more robust model inter-comparison, benchmarking, and ensemble simulations of flood hazard at the large scale.
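The two skill scores quoted above are standard; for reference, a minimal pure-Python sketch of both (paired discharge series for the Kling-Gupta efficiency, paired wet/dry maps for the critical success index):

```python
import math
from statistics import mean, pstdev

def kge(sim, obs):
    """Kling-Gupta efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    ms, mo = mean(sim), mean(obs)
    ss, so = pstdev(sim), pstdev(obs)
    r = sum((s - ms) * (o - mo) for s, o in zip(sim, obs)) / (len(sim) * ss * so)
    alpha, beta = ss / so, ms / mo      # variability ratio, bias ratio
    return 1 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def csi(sim_wet, obs_wet):
    """Critical success index: hits / (hits + misses + false alarms)."""
    hits = sum(s and o for s, o in zip(sim_wet, obs_wet))
    misses = sum(o and not s for s, o in zip(sim_wet, obs_wet))
    false_alarms = sum(s and not o for s, o in zip(sim_wet, obs_wet))
    return hits / (hits + misses + false_alarms)
```

A CSI near 0.5 means the hits are roughly balanced by misses plus false alarms, which is the sense in which the reported 0.46 indicates the models disagree about as often as they agree.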


2005 ◽  
Vol 44 (02) ◽  
pp. 211-214 ◽  
Author(s):  
T. Tweed ◽  
S. Miguet ◽  
K. Hassan

Summary Objectives: Hospitals and medical centers are producing more and more data that need to be processed. These data are confidential, heterogeneous, and limited to the geographic site where they were produced. Unless properly anonymized, they cannot be distributed over wide area networks. Methods: Grid technologies allow the globalization of storage and processing resources and enable large-scale experiments on distributed data. They constitute a promising tool for treating these data and analyzing the knowledge they contain, while offering secured access and high-performance computing capacity to the different users. Our aim is to evaluate the possibilities of grid technologies for handling medical data. Results and Conclusions: In this paper, we focus on a breast cancer diagnosis assistance tool based on distributed, incremental knowledge construction and a content-based image retrieval system. We analyze different usage scenarios for such a tool. We further propose an algorithm that indexes mammographic images for content-based queries. The algorithm is tested on images at several resolutions in order to reduce indexing time, and we analyze its performance with experiments on the grid.


2019 ◽  
Vol 271 ◽  
pp. 06007 ◽  
Author(s):  
Millard McElwee ◽  
Bingyu Zhao ◽  
Kenichi Soga

The primary focus of this research is to develop and implement an agent-based model (ABM) to analyze the New Orleans metropolitan transportation network in near real time. ABMs have grown in popularity because of their ability to analyze multifaceted community-scale resilience across networks with hundreds of thousands of links and millions of agents. Road closures and capacity reductions, for example, change edge weights or remove edges entirely, which can affect the travel time, speed, and route of agents in the transportation model. Recent advances in high-performance computing (HPC) have made modeling networks at the city scale much less computationally intensive. We introduce an open-source ABM that uses parallel distributed computing to enable faster convergence on large-scale problems. We simulate 50,000 agents on the entire southeastern Louisiana road network, including part of Mississippi. This demonstrates the capability to simulate both city- and regional-scale transportation networks in near real time.
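A toy illustration of the edge-weight/edge-removal mechanism described above (the four-node network, link times, and closure are invented for illustration; the paper's model operates on hundreds of thousands of links):

```python
import heapq

def shortest_time(graph, src, dst):
    """Dijkstra over travel times; graph maps node -> {neighbour: minutes}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(pq, (nd, nbr))
    return float("inf")

# hypothetical four-node network (minutes per link)
roads = {"A": {"B": 5, "C": 12}, "B": {"D": 6}, "C": {"D": 4}}
before = shortest_time(roads, "A", "D")   # fastest route runs via B
del roads["A"]["B"]                       # a road closure removes that edge
after = shortest_time(roads, "A", "D")    # agents are rerouted via C
```

Each closure changes the travel time, speed, and route of every agent whose shortest path crossed the removed edge, which is why re-solving routing at scale benefits from HPC.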


2019 ◽  
Vol 5 (3) ◽  
pp. eaav6019 ◽  
Author(s):  
Abouzar Kaboudian ◽  
Elizabeth M. Cherry ◽  
Flavio H. Fenton

Cardiac dynamics modeling has been useful for studying and treating arrhythmias. However, it is a multiscale problem requiring the solution of billions of differential equations describing the complex electrophysiology of interconnected cells. Therefore, large-scale cardiac modeling has been limited to groups with access to supercomputers and clusters. Many areas of computational science face similar problems in which computational costs are too high for personal computers, so that supercomputers or clusters are currently necessary. Here, we introduce a new approach that makes high-performance simulation of cardiac dynamics and other large-scale systems, such as fluid flow and crystal growth, accessible to virtually anyone with a modest computer. For cardiac dynamics, this approach will allow not only scientists and students but also physicians to use physiologically accurate modeling and simulation tools that are interactive in real time, thereby making diagnostics, research, and education available to a broader audience and pushing the boundaries of cardiac science.
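As a taste of what runs easily on a modest computer, here is an explicit-Euler sketch of the two-variable FitzHugh-Nagumo model, a classic minimal stand-in for cardiac excitability (the parameters are textbook values, not those of the paper's physiologically detailed models, which couple billions of such equations across tissue):

```python
def fitzhugh_nagumo(steps=4000, dt=0.05, a=0.7, b=0.8, eps=0.08, stim=0.5):
    """Explicit-Euler integration of the FitzHugh-Nagumo excitable-cell
    model under a constant stimulus current; returns the voltage trace."""
    v, w = -1.0, 1.0          # fast voltage-like and slow recovery variables
    trace = []
    for _ in range(steps):
        dv = v - v ** 3 / 3 - w + stim
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
        trace.append(v)
    return trace

trace = fitzhugh_nagumo()     # sustained spiking under this stimulus
```

With these parameters the resting state is unstable, so the cell spikes repeatedly; coupling many such cells through diffusion gives the tissue-scale wave dynamics the paper simulates.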


2010 ◽  
Vol 2010 ◽  
pp. 1-5 ◽  
Author(s):  
A. Pomarico ◽  
A. Morea ◽  
P. Flora ◽  
G. Roselli ◽  
E. Lasalandra

MEMS resonators are today widely investigated as a desirable alternative to quartz resonators in real-time clock applications because of their low cost and capacity for integration. Nevertheless, MEMS resonator performance is still not competitive, especially in terms of frequency stability and device equivalent resistance (and, hence, power consumption). We propose a new structure for a MEMS resonator, with a vertical-like transduction mechanism, which exhibits promising features. The vertical resonator can be fabricated with the low-cost, high-performance THELMA technology and is designed to be efficiently frequency-tunable. With respect to the commonly investigated lateral resonators, it is expected to have lower equivalent resistance and improved large-scale repeatability.


2013 ◽  
Vol 45 (1) ◽  
pp. 148-164 ◽  
Author(s):  
Flemming Finsen ◽  
Christian Milzow ◽  
Richard Smith ◽  
Philippa Berry ◽  
Peter Bauer-Gottwein

Measurements of river and lake water levels from space-borne radar altimeters (past missions include ERS, Envisat, Jason, and Topex) are useful for calibrating and validating large-scale hydrological models in poorly gauged river basins. Altimetry data availability over the downstream reaches of the Brahmaputra is excellent (17 high-quality virtual stations from ERS-2, 6 from Topex, and 10 from Envisat are available). In this study, altimetry data are used to update a large-scale Budyko-type hydrological model of the Brahmaputra river basin in real time. Altimetry measurements are converted to discharge using rating curves of simulated discharge versus observed altimetry. This approach makes it possible to use altimetry data from river cross sections where neither in-situ rating curves nor accurate cross-section geometry are available. Model updating based on radar altimetry improved model performance considerably: the Nash–Sutcliffe model efficiency increased from 0.77 to 0.83. Real-time river basin modelling using radar altimetry has the potential to improve the predictive capability of large-scale hydrological models elsewhere on the planet.
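The rating-curve step can be sketched with the common power-law form Q = a(h − h0)^b, fitted by linear least squares in log space (the synthetic stage-discharge pairs and the known datum h0 = 0 are illustrative assumptions; in the study the curves relate simulated discharge to observed altimetry):

```python
import math

def fit_rating_curve(levels, discharges, h0=0.0):
    """Fit Q = a * (h - h0)**b by linear least squares in log space."""
    xs = [math.log(h - h0) for h in levels]
    ys = [math.log(q) for q in discharges]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# synthetic stage-discharge pairs generated from Q = 2 * h**1.5
levels = [1.0, 2.0, 3.0, 4.0]
discharges = [2.0 * h ** 1.5 for h in levels]
a, b = fit_rating_curve(levels, discharges)
```

Because the synthetic pairs follow the power law exactly, the fit recovers a = 2 and b = 1.5 to floating-point accuracy; real curves would be fitted to noisy pairs of observed altimetry and simulated discharge.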

