Feasible stock trajectories: a flexible and efficient sequential estimator for use in fisheries management procedures

2012 ◽  
Vol 69 (1) ◽  
pp. 161-177 ◽  
Author(s):  
Nokome Bentley ◽  
Adam D. Langley

We describe a sequential estimation approach designed for use as part of a fisheries management procedure; it is computationally efficient and can be applied to data of varying types and extents. The estimator maintains a pool of stock trajectories, each with a unique combination of model parameters (e.g., stock–recruitment steepness) sampled from prior probability distributions. Each year, for each trajectory, the values of variables (e.g., current biomass) are updated and tested against specified constraints. Constraints determine the feasibility of the trajectories by defining likelihood functions for model variables, or combinations of variables, in particular years. Trajectories that fail one or more of the constraints are discarded from the pool and replaced by new trajectories. Each year, stochastic forward projections of the trajectories in the pool are used to determine an optimal catch level. The flexibility and accuracy of the estimator are evaluated using the fishery for snapper, Pagrus auratus, off northern New Zealand as a case study. The sequential nature of the algorithm suggests alternative methods of presentation for understanding and explaining the fisheries estimation process. We provide recommendations for both the evaluation and operation of management procedures that employ the estimator.
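The pool-based feasibility scheme described above can be sketched in a few lines. Everything below (the surplus-production dynamics, the priors, the catchability of 1e-4, and the fixed tolerance standing in for a likelihood-based constraint) is an illustrative placeholder, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)
POOL_SIZE = 500
HISTORY = [(500.0, 2.0), (600.0, 1.8), (550.0, 1.6)]  # (catch, abundance index) per year

def sample_params():
    # Illustrative priors; names and ranges are placeholders, not the paper's.
    return {"h": rng.uniform(0.4, 0.9),      # productivity ("steepness"-like)
            "K": rng.uniform(5e3, 5e4)}      # carrying capacity

def run_trajectory(params, history):
    """Replay the catch history with a toy surplus-production model,
    testing the trajectory against a constraint in every year."""
    b = params["K"]  # assume the stock starts at carrying capacity
    for catch, index_obs in history:
        growth = 0.5 * params["h"] * b * (1.0 - b / params["K"])
        b = max(b + growth - catch, 0.0)
        # Feasibility constraint: predicted index (hypothetical catchability
        # 1e-4) must lie within a fixed tolerance of the observed index.
        if abs(1e-4 * b - index_obs) > 1.0:
            return None  # infeasible: discard this trajectory
    return {"params": params, "biomass": b}

# Maintain a pool of feasible trajectories: discarded candidates are
# replaced by freshly sampled ones until the pool is full.
pool = []
while len(pool) < POOL_SIZE:
    traj = run_trajectory(sample_params(), HISTORY)
    if traj is not None:
        pool.append(traj)

# The pool approximates the joint distribution of feasible parameters and
# current biomass, from which forward projections would then be run.
biomasses = np.array([t["biomass"] for t in pool])
print(len(pool), round(float(biomasses.mean()), 1))
```

In the paper's setting the replay is incremental (one new year per update) rather than a full re-run, but the discard-and-replace logic is the same.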

2004 ◽  
Vol 61 (8) ◽  
pp. 1331-1343 ◽  
Author(s):  
Randall M. Peterman

Abstract The purpose of this paper is to review recent work on four key challenges in fisheries science and management: (1) dealing with pervasive uncertainties and risks; (2) estimating probabilities for uncertain quantities; (3) evaluating the performance of proposed management actions; and (4) communicating technical issues. These challenges are exacerbated in fisheries that harvest multiple stocks, and various methods provide partial solutions to them: (i) risk assessments and decision analyses take uncertainties into account by permitting several alternative hypotheses to be considered at once; (ii) hierarchical models applied to multi-stock data sets can improve estimates of probability distributions for model parameters compared with those derived through single-stock analyses; (iii) operating models of complete fishery systems provide comprehensive platforms for testing management procedures; and (iv) results from research in other disciplines, such as cognitive psychology, can facilitate better communication about uncertainties and risks among scientists, managers, and stakeholders.


Author(s):  
Marcello Pericoli ◽  
Marco Taboga

Abstract We propose a general method for the Bayesian estimation of a very broad class of non-linear no-arbitrage term-structure models. The main innovation we introduce is a computationally efficient method, based on deep learning techniques, for approximating no-arbitrage model-implied bond yields to any desired degree of accuracy. Once the pricing function is approximated, the posterior distribution of model parameters and unobservable state variables can be estimated by standard Markov Chain Monte Carlo methods. As an illustrative example, we apply the proposed techniques to the estimation of a shadow-rate model with a time-varying lower bound and unspanned macroeconomic factors.
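The two-stage pattern (fit a cheap surrogate to the expensive model-implied pricing function once, then run standard MCMC against the surrogate) can be illustrated generically. The one-parameter "model", the polynomial surrogate standing in for the paper's deep network, and the Gaussian observation model below are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive model-implied pricing function y(theta); in the
# paper this is a no-arbitrage bond-yield mapping and the surrogate is a
# deep neural network. Both are replaced here by toy counterparts.
def expensive_model(theta):
    return np.sin(theta) + 0.1 * theta ** 2

# Step 1: fit a cheap surrogate once, on a grid of parameter values.
grid = np.linspace(-3.0, 3.0, 50)
surrogate = np.poly1d(np.polyfit(grid, expensive_model(grid), deg=8))

# Step 2: run standard random-walk Metropolis, evaluating only the
# surrogate inside the likelihood (a Gaussian observation model is assumed).
y_obs, sigma = 0.8, 0.2

def log_post(th):
    return (-0.5 * ((y_obs - surrogate(th)) / sigma) ** 2   # likelihood
            - 0.5 * (th / 2.0) ** 2)                        # N(0, 2) prior

theta, samples = 0.0, []
lp = log_post(theta)
for _ in range(5000):
    prop = theta + 0.5 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

print(round(float(np.mean(samples[1000:])), 2))
```

The point of the design is that the expensive model is evaluated only during surrogate fitting, never inside the MCMC loop.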


Author(s):  
Christopher J. Arthurs ◽  
Nan Xiao ◽  
Philippe Moireau ◽  
Tobias Schaeffter ◽  
C. Alberto Figueroa

Abstract A major challenge in constructing three-dimensional patient-specific hemodynamic models is the calibration of model parameters to match patient data on flow, pressure, wall motion, etc., acquired in the clinic. Current workflows are manual and time-consuming. This work presents a flexible computational framework for model parameter estimation in cardiovascular flows that relies on the following fundamental contributions: (i) a Reduced-Order Unscented Kalman Filter (ROUKF) for assimilating data into wall material parameters and simple lumped parameter network (LPN) boundary-condition parameters; (ii) a constrained least-squares augmentation (ROUKF-CLS) for more complex LPNs; and (iii) a "Netlist" implementation supporting easy filtering of parameters in such complex LPNs. The ROUKF algorithm is demonstrated using non-invasive patient-specific data on anatomy, flow, and pressure from a healthy volunteer. The ROUKF-CLS algorithm is demonstrated using synthetic data on a coronary LPN. The methods described in this paper have been implemented as part of the CRIMSON hemodynamics software package.


2020 ◽  
Vol 70 (1) ◽  
pp. 145-161 ◽  
Author(s):  
Marnus Stoltz ◽  
Boris Baeumer ◽  
Remco Bouckaert ◽  
Colin Fox ◽  
Gordon Hiscott ◽  
...  

Abstract We describe a new and computationally efficient Bayesian methodology for inferring species trees and demographics from unlinked binary markers. Likelihood calculations are carried out using diffusion models of allele frequency dynamics combined with novel numerical algorithms. The diffusion approach allows for the analysis of data sets containing hundreds or thousands of individuals. The method, which we call Snapper, has been implemented as part of the BEAST2 package. We conducted simulation experiments to assess numerical error, computational requirements, and accuracy in recovering known model parameters. A reanalysis of soybean SNP data demonstrates that the models implemented in Snapp and Snapper can be difficult to distinguish in practice, a characteristic which we tested with further simulations. We demonstrate the scale of analysis possible using a SNP data set sampled from 399 freshwater turtles in 41 populations. [Bayesian inference; diffusion models; multi-species coalescent; SNP data; species trees; spectral methods.]


2013 ◽  
Vol 135 (12) ◽  
Author(s):  
Arun V. Kolanjiyil ◽  
Clement Kleinstreuer

This is the second article of a two-part paper, combining high-resolution computer simulation results of inhaled nanoparticle deposition in a human airway model (Kolanjiyil and Kleinstreuer, 2013, “Nanoparticle Mass Transfer From Lung Airways to Systemic Regions—Part I: Whole-Lung Aerosol Dynamics,” ASME J. Biomech. Eng., 135(12), p. 121003) with a new multicompartmental model for insoluble nanoparticle barrier mass transfer into systemic regions. Specifically, it allows for the prediction of temporal nanoparticle accumulation in the blood and lymphatic systems and in organs. The multicompartmental model parameters were determined from experimental retention and clearance data in rat lungs, and the validated model was then applied to humans based on pharmacokinetic cross-species extrapolation. This hybrid simulator is a computationally efficient tool for predicting nanoparticle kinetics in the human body. The study provides critical insight into nanomaterial deposition and distribution from the lungs to systemic regions. The quantitative results are useful in diverse fields such as toxicology, for exposure-risk analysis of ubiquitous nanomaterials, and pharmacology, for nanodrug development and targeting.
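A multicompartmental model of this kind is, at its core, a set of coupled first-order transfer equations integrated forward in time. The minimal sketch below uses three compartments and forward Euler; the rate constants are illustrative placeholders, not the values fitted to the rat retention and clearance data:

```python
# Toy three-compartment mass-transfer model (lung -> blood -> organ, plus
# clearance from blood), integrated with forward Euler. All rate constants
# are illustrative placeholders.
k_lb, k_bo, k_clear = 0.10, 0.05, 0.02   # transfer/clearance rates (1/h)
lung, blood, organ = 1.0, 0.0, 0.0       # initial deposited mass fractions
dt, hours = 0.01, 240.0

for _ in range(int(hours / dt)):
    d_lung = -k_lb * lung                             # leaves the lung
    d_blood = k_lb * lung - (k_bo + k_clear) * blood  # enters/leaves blood
    d_organ = k_bo * blood                            # accumulates in organ
    lung, blood, organ = (lung + d_lung * dt,
                          blood + d_blood * dt,
                          organ + d_organ * dt)

print(round(lung, 4), round(blood, 4), round(organ, 4))
```

Total tracked mass can only decrease (via the clearance term), which is a useful sanity check when fitting such a model to retention data.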


2014 ◽  
Vol 7 (1) ◽  
pp. 1535-1600
Author(s):  
M. Scherstjanoi ◽  
J. O. Kaplan ◽  
H. Lischke

Abstract. To simulate climate change effects on forest dynamics over the whole of Switzerland, we adapted the second-generation DGVM LPJ-GUESS to the Alpine environment. We modified model functions, tuned model parameters, and implemented new tree species to represent the potential natural vegetation of Alpine landscapes. Furthermore, we increased the computational efficiency of the model to enable area-covering simulations at a resolution (1 km) fine enough for the complex topography of the Alps, which resulted in more than 32 000 simulation grid cells. To this end, we applied the recently developed method GAPPARD (Scherstjanoi et al., 2013) to LPJ-GUESS. GAPPARD derives mean output values from a combination of simulation runs without disturbances and a patch age distribution defined by the disturbance frequency. With this computationally efficient method, which increased the model's speed by approximately a factor of 8, we were able to detect shortcomings of LPJ-GUESS functions and parameters more quickly. We used the adapted LPJ-GUESS together with GAPPARD to assess the influence of one climate change scenario on the dynamics of tree species composition and biomass throughout the 21st century in Switzerland. To allow comparison with the original model, we additionally simulated forest dynamics along a north–south transect through Switzerland. The results from this transect confirmed the high value of the GAPPARD method, despite some limitations in representing extreme climatic events. For the first time, it allowed us to obtain area-wide, detailed, high-resolution LPJ-GUESS simulation results for a large part of the Alpine region.
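The GAPPARD averaging step can be sketched as follows, assuming (as the description above suggests) an exponential patch-age distribution for a constant disturbance frequency; the age-biomass curve is a made-up stand-in for an undisturbed LPJ-GUESS run:

```python
import numpy as np

# GAPPARD-style averaging sketch: the landscape-mean output is the
# expectation of a disturbance-free simulation over a patch-age
# distribution set by the disturbance frequency (assumed exponential here).
disturbance_freq = 1.0 / 100.0            # one stand-replacing event per 100 yr
ages = np.arange(0, 500)                  # patch ages considered (years)

p_age = disturbance_freq * np.exp(-disturbance_freq * ages)
p_age /= p_age.sum()                      # normalise over the truncated range

# Stand-in for an undisturbed simulation output as a function of stand age:
# biomass saturating toward an asymptote (values purely illustrative).
biomass_undisturbed = 300.0 * (1.0 - np.exp(-ages / 80.0))

landscape_mean = float(np.sum(p_age * biomass_undisturbed))
print(round(landscape_mean, 1))
```

The speed-up comes from replacing many stochastic patch replicates with a single undisturbed run plus this weighted average, which is also why individual extreme events are hard to represent.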


2015 ◽  
Author(s):  
Randall Peterman

An informal review of the history of new quantitative methods in environmental science, including environmental risk assessment, shows about a 10- to 20-year lag in wide acceptance of such methods by management agencies. To reduce that lag time as innovative methods continue to emerge, environmental scientists will need to work much more intensively with communications specialists on better ways to explain risk analyses and decision-making strategies to non-technical decision makers and the public. Four key uncertainties make such communication difficult: (1) natural variability in both physical and biological processes, (2) imperfect data arising from observation error (i.e., measurement error), (3) incomplete understanding of an environmental system's structure and dynamics, and (4) outcome uncertainty (deviations between realized outcomes and management targets). These uncertainties create risks -- risks to natural populations as well as to people who use them. Examples of these four sources of uncertainty are presented here for Pacific salmon (Oncorhynchus spp.). One promising framework for explicitly taking such uncertainties into account was initially developed in the early 1990s by scientific advisors to the International Whaling Commission. They built stochastic models, which essentially were comprehensive formal decision analyses, to derive management procedures (i.e., sampling designs for collecting data, methods to analyze those data, and state-dependent harvest-control rules for use by managers) that were robust to all the uncertainties considered. This method of "Management Strategy Evaluation" or "Management Procedure Evaluation" is now considered the "gold standard" for conducting risk assessments and making risk-management decisions in marine fisheries.


2019 ◽  
Vol 3 (1) ◽  
pp. 1-11
Author(s):  
Vivi Pancasari Kusumawardani

This study aims to describe the procedures for managing financial statements at the Kapuas District Office of Youth, Sports, Culture and Tourism. The data used in this study are documentary data; the data sources are secondary data collected through documentation, namely written material obtained from the Finance Department of the Office of Youth, Sports, Culture and Tourism of Kapuas for 2015 and 2016. The data were analyzed using qualitative techniques. The results show that: (1) the Kapuas District Office of Youth, Sports, Culture and Tourism has financial management procedures that serve as a guideline governing the financial management process, covering all financial aspects managed by the office; and (2) these financial management procedures comply with the government regulations stipulated in Minister of Home Affairs Regulation Number 13 of 2006.


Author(s):  
Y Chen ◽  
C Muratov ◽  
V Matveev

Abstract We consider the stationary solution for the Ca2+ concentration near a point Ca2+ source describing a single-channel Ca2+ nanodomain, in the presence of a single mobile Ca2+ buffer with one-to-one Ca2+ binding. We present computationally efficient approximants that estimate stationary single-channel Ca2+ nanodomains with great accuracy in broad regions of parameter space. The presented approximants have a functional form that combines rational and exponential functions, similar to that of the well-known Excess Buffer Approximation and the linear approximation, but with parameters estimated using two novel (to our knowledge) methods. One of the methods involves interpolation between the short-range Taylor series of the buffer concentration and its long-range asymptotic series in inverse powers of distance from the channel. Although this method has already been used to find Padé (rational-function) approximants to single-channel Ca2+ and buffer concentrations, extending it to interpolants combining exponential and rational functions improves accuracy in a significant fraction of the relevant parameter space. The second method is based on a variational approach and involves a global minimization of an appropriate functional with respect to the parameters of the chosen approximations. An extensive parameter sensitivity analysis is presented, comparing these two methods with previously developed approximants. Apart from increased accuracy, the strength of these approximants is that they can be extended to more realistic buffers with multiple binding sites characterized by cooperative Ca2+ binding, such as calmodulin and calretinin.

Statement of Significance: Mathematical and computational modeling plays an important role in the study of local Ca2+ signals underlying vesicle exocytosis, muscle contraction, and other fundamental physiological processes. Closed-form approximations describing the steady-state distribution of Ca2+ in the vicinity of an open Ca2+ channel have proved particularly useful for the qualitative modeling of local Ca2+ signals. We present simple and efficient approximants for the Ca2+ concentration in the presence of a mobile Ca2+ buffer, which achieve great accuracy over a wide range of model parameters. Such approximations provide an efficient method for estimating Ca2+ and buffer concentrations without resorting to numerical simulations, and allow one to study the qualitative dependence of the nanodomain Ca2+ distribution on the buffer's Ca2+ binding properties and diffusivity.
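The interpolation idea, matching a short-range (Taylor) expansion and a long-range asymptote with a single rational function, can be illustrated on a generic target function. This sketch is not the paper's Ca2+ approximant, just the simplest two-point construction of the same flavor:

```python
import numpy as np

# Construct a rational approximant r(x) = (a + b*x) / (1 + c*x) matching a
# target function's value and slope at x = 0 (short-range Taylor data) and
# its limit as x -> infinity (long-range asymptote).
def two_point_rational(f0, f1, f_inf):
    # r(0) = f0 gives a = f0; r(inf) = f_inf gives b = f_inf * c;
    # r'(0) = b - f0 * c = f1 then fixes c.
    c = f1 / (f_inf - f0)
    return lambda x: (f0 + f_inf * c * x) / (1.0 + c * x)

# Target: f(x) = 1 - exp(-x), with f(0) = 0, f'(0) = 1, f(inf) = 1.
r = two_point_rational(0.0, 1.0, 1.0)     # yields r(x) = x / (1 + x)
x = np.linspace(0.0, 10.0, 101)
err = float(np.max(np.abs(r(x) - (1.0 - np.exp(-x)))))
print(round(err, 3))
```

Adding exponential terms to the ansatz, as the paper does, is what lets the interpolant capture targets of exactly this saturating shape far more accurately than a purely rational form.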


2016 ◽  
Vol 12 (S325) ◽  
pp. 39-45 ◽  
Author(s):  
Maria Süveges ◽  
Sotiria Fotopoulou ◽  
Jean Coupon ◽  
Stéphane Paltani ◽  
Laurent Eyer ◽  
...  

Abstract Throughout the processing and analysis of survey data, a ubiquitous issue nowadays is that we are spoilt for choice when we need to select a methodology for some of its steps. The alternative methods usually fail and excel in different data regions and have various advantages and drawbacks, so a combination that unites the strengths of all while suppressing the weaknesses is desirable. We propose to use a two-level hierarchy of learners. Its first level consists of training and applying the possible base methods on the first part of a known set. At the second level, we feed the output probability distributions from all base methods to a second learner trained on the remaining known objects. Using the classification of variable stars and photometric redshift estimation as examples, we show that the hierarchical combination is capable of achieving general improvement over averaging-type combination methods, can correct systematics present in all base methods, and is easy to train and apply; thus, it is a promising tool in the astronomical “Big Data” era.
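A minimal numerical sketch of the two-level hierarchy: two deliberately partial base methods, each reliable only in part of feature space, are combined by a second-level logistic learner trained on their output probabilities, and compared against simple averaging. All data and learners below are synthetic stand-ins for the survey classifiers discussed above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class problem in which each base method is reliable only in
# part of feature space (everything here is a made-up stand-in).
n = 3000
x = rng.uniform(-1.0, 1.0, size=(n, 2))
y = (x[:, 0] + 0.5 * x[:, 1] > 0.0).astype(float)

def base_a(x):  # base method 1: only looks at feature 0
    return 1.0 / (1.0 + np.exp(-8.0 * x[:, 0]))

def base_b(x):  # base method 2: only looks at feature 1
    return 1.0 / (1.0 + np.exp(-8.0 * x[:, 1]))

# Level 1: apply the base methods to the first half of the known set and
# collect their output probabilities as features for the second level.
half = n // 2
meta_X = np.column_stack([base_a(x[:half]), base_b(x[:half]), np.ones(half)])

# Level 2: a logistic-regression combiner trained by full-batch gradient
# descent on the base-method outputs (any second-level learner would do).
w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-meta_X @ w))
    w -= 0.5 * meta_X.T @ (p - y[:half]) / half

# Compare the hierarchical combination with plain averaging on held-out data.
test_X = np.column_stack([base_a(x[half:]), base_b(x[half:]), np.ones(n - half)])
pred_stack = (test_X @ w) > 0.0
pred_avg = 0.5 * (base_a(x[half:]) + base_b(x[half:])) > 0.5
acc_stack = float(np.mean(pred_stack == (y[half:] > 0.5)))
acc_avg = float(np.mean(pred_avg == (y[half:] > 0.5)))
print(round(acc_stack, 3), round(acc_avg, 3))
```

Because the second-level learner sees where each base method's probabilities are informative, it can weight them unevenly, which is exactly what uniform averaging cannot do.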

