RESERVOIR SIMULATION—UPSCALING, STREAMLINES AND PARALLEL COMPUTING

2007 ◽  
Vol 47 (1) ◽  
pp. 199
Author(s):  
M. Asadullah ◽  
P. Behrenbruch ◽  
S. Pham

Simulation of petroleum reservoirs is becoming increasingly complex due to the growing need to model reservoir heterogeneity for accurate performance prediction. With high oil prices and less easy oil, accurate reservoir management tools such as simulation models are in greater demand than ever before. The aim is to capture and preserve reservoir heterogeneity when changing over from a detailed geocellular model to a flow simulation model, minimising errors when upscaling and preventing excessive numerical dispersion by employing variable and innovative grids, as well as improved computational algorithms. For accurate and efficient simulation of large-scale models there are essentially three choices: upscaling, which involves averaging of parameters over several blocks, resulting in a coarser model that executes faster; the use of streamline simulation, which uses a more optimal grid combined with a different computational algorithm for increased efficiency; and the use of parallel computing techniques, which use superior hardware configurations for efficiency gains. With uncertainty screening of multiple geostatistical realisations and investigation of alternative development scenarios now commonplace for determining reservoir performance, computational efficiency and accuracy in modelling are paramount. This paper summarises the main techniques and methodologies involved in preparing geocellular models for flow simulation of reservoirs, commenting on the advantages and disadvantages of the various possibilities. Starting with some historic comments, the three modes of simulation are reviewed and examples are given for illustrative purposes, including a case history for the Bayu-Undan Field, Timor Sea.
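The upscaling option described above amounts to averaging fine-grid properties over coarse blocks. A minimal sketch in Python, assuming a 1D permeability array and harmonic averaging (appropriate for flow in series along the coarsening direction); the function name and grid layout are illustrative, not from the paper:

```python
import numpy as np

def upscale_permeability(k_fine, factor):
    """Upscale a 1D array of fine-grid permeabilities by harmonic
    averaging, grouping `factor` fine cells into each coarse block.
    The harmonic mean is the exact effective value for 1D flow in series."""
    k_fine = np.asarray(k_fine, dtype=float)
    blocks = k_fine.reshape(-1, factor)            # one row per coarse block
    return factor / np.sum(1.0 / blocks, axis=1)   # harmonic mean per block

# Four fine cells collapse into two coarse blocks
coarse = upscale_permeability([100.0, 100.0, 10.0, 10.0], 2)
```

A low-permeability fine cell dominates its coarse block under the harmonic mean, which is why arithmetic averaging would overestimate flow across layered heterogeneity.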

Author(s):  
Sedki Akram

Groundwater management problems are typically large scale, involving complex nonlinear objective functions and constraints that are commonly evaluated through numerical simulation models. Given these complexities, metaheuristic optimization algorithms have recently become a popular choice for solving such problems, which are difficult to solve by traditional methods. However, the practical application of metaheuristics is severely challenged by the large number of function evaluations required to achieve convergence. To overcome this shortcoming, many new metaheuristics and variants of existing ones have been proposed in recent years. In this study, a recently developed algorithm called the flower pollination algorithm (FPA) is investigated for optimal groundwater management. The FPA is improved, coupled with the widely used groundwater flow simulation model MODFLOW, and applied to solve two groundwater management problems. The proposed algorithm, denoted IFPA, is first tested on a hypothetical aquifer system to minimize the total pumping required to contain contaminated groundwater within a capture zone. IFPA is then applied to maximize the total annual pumping from existing wells in the Rhis-Nekor unconfined coastal aquifer in northern Morocco. The results indicate that IFPA is a promising method for solving groundwater management problems, as it outperforms the standard FPA and other algorithms applied to the case studies considered, both in terms of convergence rate and solution quality.
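As a rough illustration of the standard FPA that IFPA builds on, the sketch below implements its two moves: global pollination toward the current best flower via a heavy-tailed step, and local pollination mixing two random flowers. In a real management problem each objective evaluation would wrap a MODFLOW run; here the objective is any Python callable. The step scale, the Cauchy surrogate for the Lévy flight, and all parameter values are assumptions for illustration, not details of the paper's improved variant:

```python
import numpy as np

def fpa(objective, dim, bounds, n_flowers=20, p_switch=0.8, iters=200, seed=0):
    """Minimal sketch of the standard flower pollination algorithm (FPA),
    minimizing `objective` over a box [lo, hi]^dim."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(n_flowers, dim))
    fit = np.array([objective(x) for x in pop])
    best = pop[fit.argmin()].copy()
    for _ in range(iters):
        for i in range(n_flowers):
            if rng.random() < p_switch:
                # global pollination: heavy-tailed step toward the best flower
                step = 0.01 * rng.standard_cauchy(dim)
                cand = pop[i] + step * (best - pop[i])
            else:
                # local pollination: mix two randomly chosen flowers
                j, k = rng.choice(n_flowers, 2, replace=False)
                cand = pop[i] + rng.random() * (pop[j] - pop[k])
            cand = np.clip(cand, lo, hi)
            f = objective(cand)
            if f < fit[i]:          # greedy replacement
                pop[i], fit[i] = cand, f
        best = pop[fit.argmin()].copy()
    return best, fit.min()

# Toy usage: minimize the sphere function in 2D
x_best, f_best = fpa(lambda x: float(np.sum(x**2)), 2, (-5.0, 5.0))
```

The cost pattern is visible even in the toy: n_flowers × iters objective calls, which is exactly why surrogate or improved variants matter when each call is a full simulation.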


Author(s):  
Kelly Cristinne Leite Angelim ◽  
Túlio Cavalcante ◽  
Jonathan Teixeira ◽  
Paulo Roberto Maciel Lyra ◽  
Darlan Karlo Elisiário de Carvalho

2011 ◽  
Vol 34 (4) ◽  
pp. 717-728
Author(s):  
Zu-Ying LUO ◽  
Yin-He HAN ◽  
Guo-Xing ZHAO ◽  
Xian-Chuan YU ◽  
Ming-Quan ZHOU

Author(s):  
Stefano Vassanelli

Establishing direct communication with the brain through physical interfaces is a fundamental strategy for investigating brain function. Starting with the patch-clamp technique in the 1970s, neuroscience has moved from detailed characterization of ionic channels to the analysis of single neurons and, more recently, microcircuits in brain neuronal networks. Development of new biohybrid probes with electrodes for recording and stimulating neurons in the living animal is a natural consequence of this trend. The recent introduction of optogenetic stimulation and advanced high-resolution, large-scale electrical recording approaches demonstrates this need. Brain implants for real-time neurophysiology are also opening new avenues for neuroprosthetics to restore brain function after injury or in neurological disorders. This chapter provides an overview of existing and emerging neurophysiology technologies, with particular focus on those intended to interface with neuronal microcircuits in vivo. Chemical, electrical, and optogenetic-based interfaces are presented, with an analysis of the advantages and disadvantages of the different technical approaches.


2003 ◽  
Vol 79 (1) ◽  
pp. 132-146 ◽  
Author(s):  
Dennis Yemshanov ◽  
Ajith H Perera

We reviewed the published knowledge on forest succession in the North American boreal biome for its applicability in modelling forest cover change over large extents. At broader scales, forest succession can be viewed as forest cover change over time. Quantitative case studies of forest succession in peer-reviewed literature are reliable sources of information about changes in forest canopy composition. We reviewed the following aspects of forest succession in the literature: disturbances; pathways of post-disturbance forest cover change; timing of successional steps; probabilities of post-disturbance forest cover change; and effects of geographic location and ecological site conditions on forest cover change. The results from studies in the literature, which were mostly based on sample plot observations, appeared to be sufficient to describe boreal forest cover change as a generalized discrete-state transition process, with the discrete states denoted by tree species dominance. In this paper, we outline an approach for incorporating published knowledge on forest succession into stochastic simulation models of boreal forest cover change in a standardized manner. We found that the lack of detail in the literature on long-term forest succession, particularly on the influence of pre-disturbance forest cover composition, may be a limiting factor in parameterizing simulation models. We suggest that simulation models based on published information can provide a good foundation as null models, which can be further calibrated as detailed quantitative information on forest cover change becomes available. Key words: probabilistic model, transition matrix, boreal biome, landscape ecology
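The generalized discrete-state transition process described above can be sketched as a simple Markov simulation over dominance states. The states and transition probabilities below are hypothetical placeholders for illustration, not values calibrated from the reviewed studies:

```python
import numpy as np

def simulate_succession(P, states, start, steps, seed=0):
    """Sketch of a discrete-state forest-cover transition process.
    P[i, j] is the per-interval probability of moving from dominance
    state i to dominance state j; each row of P must sum to 1."""
    rng = np.random.default_rng(seed)
    idx = states.index(start)
    path = [start]
    for _ in range(steps):
        idx = rng.choice(len(states), p=P[idx])  # draw the next state
        path.append(states[idx])
    return path

# Hypothetical 3-state example of post-fire boreal succession
states = ["aspen", "mixedwood", "spruce"]
P = np.array([[0.80, 0.20, 0.00],
              [0.00, 0.70, 0.30],
              [0.05, 0.00, 0.95]])  # small spruce->aspen rate mimics disturbance
path = simulate_succession(P, states, "aspen", 10)
```

Calibrating such a model from the literature reduces to filling in P from reported post-disturbance transition frequencies, which is where the noted lack of long-term detail bites.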


Author(s):  
Clemens M. Lechner ◽  
Nivedita Bhaktha ◽  
Katharina Groskurth ◽  
Matthias Bluemke

Abstract
Measures of cognitive or socio-emotional skills from large-scale assessment surveys (LSAS) are often based on advanced statistical models and scoring techniques unfamiliar to applied researchers. Consequently, applied researchers working with data from LSAS may be uncertain about the assumptions and computational details of these statistical models and scoring techniques and about how to best incorporate the resulting skill measures in secondary analyses. The present paper is intended as a primer for applied researchers. After a brief introduction to the key properties of skill assessments, we give an overview of the three principal methods with which secondary analysts can incorporate skill measures from LSAS in their analyses: (1) as test scores (i.e., point estimates of individual ability), (2) through structural equation modeling (SEM), and (3) in the form of plausible values (PVs). We discuss the advantages and disadvantages of each method based on three criteria: fallibility (i.e., control for measurement error and unbiasedness), usability (i.e., ease of use in secondary analyses), and immutability (i.e., consistency of test scores, PVs, or measurement model parameters across different analyses and analysts). We show that although none of the methods are optimal under all criteria, methods that result in a single point estimate of each respondent's ability (i.e., all types of "test scores") are rarely optimal for research purposes. Instead, approaches that avoid or correct for measurement error, especially PV methodology, stand out as the method of choice. We conclude with practical recommendations for secondary analysts and data-producing organizations.
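In practice, working with PVs means running the secondary analysis once per plausible value and pooling the results with Rubin's multiple-imputation rules. A minimal sketch of the pooling step; the example numbers are invented for illustration, not taken from any LSAS:

```python
import numpy as np

def pool_pv_estimates(estimates, variances):
    """Combine M per-PV analysis results via Rubin's rules:
    pooled estimate = mean of the M point estimates;
    total variance = within-imputation variance
                     + (1 + 1/M) * between-imputation variance."""
    est = np.asarray(estimates, dtype=float)
    var = np.asarray(variances, dtype=float)
    m = len(est)
    q_bar = est.mean()             # pooled point estimate
    u_bar = var.mean()             # within-imputation variance
    b = est.var(ddof=1)            # between-imputation variance
    total_var = u_bar + (1.0 + 1.0 / m) * b
    return q_bar, np.sqrt(total_var)

# e.g. a regression coefficient estimated separately on 5 PVs
est, se = pool_pv_estimates([0.42, 0.45, 0.40, 0.44, 0.41],
                            [0.010, 0.011, 0.009, 0.010, 0.010])
```

The between-imputation term is what propagates measurement uncertainty into the standard error; averaging the PVs first and analyzing once would discard it, which is the classic pitfall the paper warns against.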


npj Vaccines ◽  
2021 ◽  
Vol 6 (1) ◽  
Author(s):  
Nikolaos C. Kyriakidis ◽  
Andrés López-Cortés ◽  
Eduardo Vásconez González ◽  
Alejandra Barreto Grimaldos ◽  
Esteban Ortiz Prado

Abstract
The new SARS-CoV-2 virus is an RNA virus that belongs to the Coronaviridae family and causes COVID-19 disease. The newly sequenced virus appears to have originated in China and rapidly spread throughout the world, becoming a pandemic that, as of January 5th, 2021, had caused more than 1,866,000 deaths. Hence, laboratories worldwide are developing an effective vaccine against this disease, which will be essential to reduce morbidity and mortality. Currently, there are more than 64 vaccine candidates, most of them aiming to induce neutralizing antibodies against the spike protein (S). These antibodies prevent uptake through the human ACE-2 receptor, thereby limiting viral entry. Different vaccine platforms are being used for vaccine development, each one presenting several advantages and disadvantages. Thus far, thirteen vaccine candidates are being tested in Phase 3 clinical trials and are therefore closer to receiving approval or authorization for large-scale immunization.


Author(s):  
Mahdi Esmaily Moghadam ◽  
Yuri Bazilevs ◽  
Tain-Yen Hsia ◽  
Alison Marsden

A closed-loop lumped parameter network (LPN) coupled to a 3D domain is a powerful tool that can be used to model the global dynamics of the circulatory system. Coupling a 0D LPN to a 3D CFD domain is a numerically challenging problem, often associated with instabilities, extra computational cost, and loss of modularity. A computationally efficient finite element framework has recently been proposed that achieves numerical stability without sacrificing modularity [1]. This type of coupling introduces new challenges in the linear algebraic equation solver (LS), producing a strong coupling between flow and pressure that leads to an ill-conditioned tangent matrix. In this paper we exploit this strong coupling to obtain a novel and efficient LS algorithm. We illustrate the efficiency of this method on several large-scale cardiovascular blood flow simulation problems.
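As a minimal illustration of the 0D side of such a coupling, the sketch below integrates a 3-element (RCR) Windkessel, a common LPN outlet block, with forward Euler: given a flow waveform q(t), it advances dPc/dt = (q - (Pc - Pd)/Rd)/C and returns the outlet pressure P = Pc + Rp·q. This is a generic standalone sketch, not the stabilized finite element coupling of the paper, and all parameter values are illustrative:

```python
import numpy as np

def rcr_windkessel(q, dt, Rp, C, Rd, p_distal=0.0):
    """Forward-Euler integration of a 3-element (RCR) Windkessel.
    q: flow waveform sampled at step dt; Rp/Rd: proximal/distal
    resistances; C: compliance. Returns the outlet pressure trace."""
    pc = p_distal                     # capacitor (distal) pressure state
    p_out = np.empty_like(q)
    for n, qn in enumerate(q):
        pc += dt * (qn - (pc - p_distal) / Rd) / C  # charge the compliance
        p_out[n] = pc + Rp * qn                     # add proximal drop
    return p_out

# Constant inflow: pressure relaxes toward (Rp + Rd) * q with time constant Rd*C
t = np.arange(0.0, 5.0, 0.001)
q = np.full_like(t, 5.0)
p = rcr_windkessel(q, 0.001, Rp=0.05, C=1.0, Rd=1.0)
```

In a coupled 3D simulation, q would come from integrating velocity over the outlet face at each time step and p would be fed back as the outlet traction, which is precisely where the flow-pressure coupling the paper exploits arises.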

