Adaptational lags during periods of environmental change

2019 ◽  
Author(s):  
Tom J. M. Van Dooren

Abstract Effects of climate change can be handled by means of mitigation and adaptation. In the biological sciences, adaptations are evolved solutions to engineering problems, where organisms must meet an ecological challenge. Based on Adaptive Dynamics theory, a definition of adapted states and adaptational lags is proposed that is applicable during periods of environmental change of any speed and to any character. Adaptation can thus be studied even when it emerges from complex eco-evolutionary processes and the targets of adaptation are not defined or known a priori. The approach is exemplified with a model for delayed germination (germination probability) in an annual plant, the classic life-history example of adaptation to uncertain environments. Plasticity and maternal effects are added to the model to investigate lags in these modes of trait determination, which are often presumed to be adaptive. In the example, adaptational lags do not converge to an equilibrium and change sign. For the model version with plasticity and maternal-effect weights, the presence of a lag in these trait components can temporarily change the direction of selection on the genotypic weight. Adaptational lag is related to the establishment probability of mutants in the model example and could therefore have practical relevance. A first general classification is proposed of model structures that include both adaptive control and evolutionary adaptation.
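The delayed-germination example referenced here is the classic bet-hedging model of Cohen (1966). The abstract does not give the model's equations, but a minimal Cohen-style version can be simulated; all parameter values below (probability of a good year, per-seed yield, seed-bank survival) are assumptions chosen for illustration, not values from the paper.

```python
import math
import random

def log_growth(g, p_good=0.6, yield_good=10.0, seed_survival=0.9,
               years=10000, seed=0):
    """Average log annual growth rate of a seed bank in which a fraction g
    of seeds germinates each year: germinating seeds yield new seeds only
    in good years, while dormant seeds survive in the soil with
    probability seed_survival."""
    rng = random.Random(seed)  # same environment sequence for every g
    total = 0.0
    for _ in range(years):
        per_seed_yield = yield_good if rng.random() < p_good else 0.0
        growth = g * per_seed_yield + (1.0 - g) * seed_survival
        total += math.log(growth)
    return total / years

# Grid search for the germination probability that maximizes long-run
# (geometric-mean) growth; the optimum is interior, i.e. partial dormancy.
best_g = max((g / 100 for g in range(1, 100)), key=log_growth)
```

Because bad years wipe out germinated seeds, the growth-maximizing germination probability is strictly below one: some seeds are held back as a hedge, which is what makes this the standard example of adaptation to uncertain environments.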

2018 ◽  
Vol 115 (13) ◽  
pp. 3210-3218 ◽  
Author(s):  
John Haldon ◽  
Lee Mordechai ◽  
Timothy P. Newfield ◽  
Arlen F. Chase ◽  
Adam Izdebski ◽  
...  

History and archaeology have a well-established engagement with issues of premodern societal development and the interaction between physical and cultural environments; together, they offer a holistic view that can generate insights into the nature of cultural resilience and adaptation, as well as responses to catastrophe. Grasping the challenges that climate change presents and evolving appropriate policies that promote and support mitigation and adaptation requires not only an understanding of the science and the contemporary politics, but also an understanding of the history of the societies affected and in particular of their cultural logic. But whereas archaeologists have developed productive links with the paleosciences, historians have, on the whole, remained muted voices in the debate until recently. Here, we suggest several ways in which a consilience between the historical sciences and the natural sciences, including attention to even distant historical pasts, can deepen contemporary understanding of environmental change and its effects on human societies.


2018 ◽  
Vol 66 (3) ◽  
pp. 303-315 ◽  
Author(s):  
Alberto Viglione ◽  
Magdalena Rogger ◽  
Herbert Pirkl ◽  
Juraj Parajka ◽  
Günter Blöschl

Abstract Since the beginning of hydrological research, hydrologists have developed models that reflect their perception of how catchments work and that make use of the available information as efficiently as possible. In this paper we develop hydrologic models based on field-mapped runoff generation mechanisms as identified by a geologist. For four different catchments in Austria, we identify four different lumped model structures and constrain their parameters based on the field-mapped information. In order to understand the usefulness of geologic information, we test the models' capability to predict river discharge in different cases: (i) without calibration and (ii) using the standard split-sample calibration/validation procedure. All models are compared against each other. Results show that, when no calibration is involved, using the right model structure for the catchment of interest is valuable. A priori information on model parameters does not always improve the results but allows for more realistic model parameters. When all parameters are calibrated to the discharge data, the different model structures do not matter, i.e., the differences can largely be compensated by the choice of parameters. When parameters are constrained based on field-mapped runoff generation mechanisms, the results are not better but are more consistent between different calibration periods. Models selected by runoff generation mechanisms are expected to be more robust and more suitable for extrapolation to conditions outside the calibration range than models that are purely based on parameter calibration to runoff data.
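As a toy illustration of the calibration step described above (not the authors' lumped models, which are built from field-mapped mechanisms), a single-parameter linear-reservoir model can be calibrated to discharge on one period and the chosen parameter then carried to a validation period:

```python
def linear_reservoir(precip, k, s0=0.0):
    """Toy lumped runoff model: storage S gains precipitation each time
    step and releases discharge Q = k * S."""
    s, q = s0, []
    for p in precip:
        s += p
        out = k * s
        s -= out
        q.append(out)
    return q

def calibrate(precip, q_obs, candidates):
    """Split-sample style calibration: pick the recession parameter k
    minimizing the sum of squared errors against observed discharge on
    the calibration period; validation then reuses the chosen k."""
    def sse(k):
        sim = linear_reservoir(precip, k)
        return sum((qs - qo) ** 2 for qs, qo in zip(sim, q_obs))
    return min(candidates, key=sse)
```

With only discharge to fit, many parameter values can compensate for each other; constraining `k` a priori (e.g. to a geologically plausible range of candidates) is what keeps the calibrated value realistic, which is the point the abstract makes.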


2020 ◽  
Vol 69 ◽  
pp. 297-342
Author(s):  
Jacopo Banfi ◽  
Vikram Shree ◽  
Mark Campbell

This paper introduces and studies a graph-based variant of the path planning problem arising in hostile environments. We consider a setting where an agent (e.g., a robot) must reach a given destination while avoiding interception by probabilistic entities that exist in the graph with a given probability and move according to a probabilistic motion pattern known a priori. Given a goal vertex and a deadline to reach it, the agent must compute the path to the goal that maximizes its chances of survival. We study the computational complexity of the problem and present two algorithms for computing high-quality solutions in the general case: an exact algorithm based on Mixed-Integer Nonlinear Programming, which works well on instances of moderate size, and a pseudo-polynomial-time heuristic algorithm that solves large-scale problems in reasonable time. We also consider the two limit cases where the agent can survive with probability 0 or 1, and provide specialized algorithms to detect these kinds of situations more efficiently.
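The abstract does not reproduce the MINLP formulation. Under the much stronger assumption that interception events at each vertex and time step are independent with known probabilities (a simplification of the paper's correlated probabilistic-entity model), a survival-maximizing walk can be found by dynamic programming over (vertex, time) pairs:

```python
def max_survival(adj, p_hit, start, goal, deadline):
    """best[v][t]: max probability of surviving through time step t while
    at vertex v, assuming interceptions at distinct time steps are
    independent with probability p_hit[v][t]. Each step the agent may
    wait in place or move along an edge; returns the best chance of
    being at the goal by the deadline."""
    n = len(adj)
    best = [[0.0] * (deadline + 1) for _ in range(n)]
    best[start][0] = 1.0 - p_hit[start][0]
    for t in range(deadline):
        for v in range(n):
            if best[v][t] == 0.0:
                continue
            for u in adj[v] + [v]:  # move along an edge, or wait
                s = best[v][t] * (1.0 - p_hit[u][t + 1])
                if s > best[u][t + 1]:
                    best[u][t + 1] = s
    return max(best[goal])
```

This table has O(|V| * deadline) entries, which is why the deadline makes pseudo-polynomial approaches natural; the hard part the paper actually addresses is that real interception probabilities are coupled through the entities' existence and motion, which this sketch deliberately ignores.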


Sensors ◽  
2021 ◽  
Vol 21 (16) ◽  
pp. 5350
Author(s):  
Carlos Crespo-Cadenas ◽  
María José Madero-Ayora ◽  
Juan A. Becerra

This work presents a strategy to upgrade models for power amplifier (PA) behavioral modeling and digital predistortion (DPD). Incomplete model structures are the consequence of truncating the nonlinear order and memory depth in order to reduce the demand on the limited computational resources available in standard processors. On the other hand, the alternative use of model structures pruned a priori does not guarantee that every significant term is included. To improve the limited performance of an incomplete model, a general procedure to augment its structure by incorporating significant terms is demonstrated. The sparse nature of the problem allows a successive search that incorporates additional terms of higher nonlinear order and memory depth. This approach is investigated in the modeling and linearization of a commercial class-AB PA operating at about 6 dB of compression and a class-J PA operating near saturation. Results highlight the ability of this upgrading procedure to improve the linearization performance of DPDs.
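The abstract does not name the model family, but a common choice for truncated PA behavioral models is the memory polynomial, whose regressors are x[n-m]·|x[n-m]|^(k-1). The successive-search idea can then be sketched as a greedy, residual-driven term addition; this is an illustrative stand-in, not the authors' algorithm, and the term pool and starting structure below are assumptions.

```python
import numpy as np

def mp_regressors(x, k_max, m_max):
    """Memory-polynomial basis: one column x[n-m] * |x[n-m]|**(k-1) per
    nonlinearity order k = 1..k_max and memory tap m = 0..m_max."""
    n = len(x)
    cols, labels = [], []
    for m in range(m_max + 1):
        xd = np.concatenate([np.zeros(m, dtype=x.dtype), x[:n - m]])
        for k in range(1, k_max + 1):
            cols.append(xd * np.abs(xd) ** (k - 1))
            labels.append((k, m))
    return np.column_stack(cols), labels

def greedy_augment(x, y, base_terms, candidates, n_add):
    """Start from a pruned model (base_terms, a list of (k, m) pairs) and
    successively add the candidate term whose inclusion most reduces the
    least-squares residual against the measured output y."""
    k_max = max(k for k, _ in base_terms + candidates)
    m_max = max(m for _, m in base_terms + candidates)
    full, labels = mp_regressors(x, k_max, m_max)
    chosen, pool = list(base_terms), list(candidates)
    for _ in range(n_add):
        def resid(term):
            cols = [labels.index(t) for t in chosen + [term]]
            coef, *_ = np.linalg.lstsq(full[:, cols], y, rcond=None)
            return np.linalg.norm(y - full[:, cols] @ coef)
        best = min(pool, key=resid)
        pool.remove(best)
        chosen.append(best)
    return chosen
```

Because only a few terms carry most of the nonlinear energy (the sparsity the abstract mentions), a handful of greedy additions typically recovers the significant high-order and memory terms that a priori pruning left out.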


2018 ◽  
Vol 38 (2-3) ◽  
pp. 146-161 ◽  
Author(s):  
Ryan A MacDonald ◽  
Stephen L Smith

This paper addresses path planning with real-time reaction to environmental uncertainty. The environment is represented as a robotic roadmap, or graph, and is uncertain in that the edges of the graph are unknown to the robot a priori. Instead, the robot's prior information consists of a distribution over candidate edge sets, modeling the likelihood of certain obstacles in the environment. The robot can locally sense the environment and, at a vertex, can determine the presence or absence of some subset of edges. Within this model, the reactive planning problem provides the robot with a start location and a goal location and asks it to compute a policy that minimizes the expected travel and observation cost. In contrast to computing paths that maximize the probability of success, we focus on complete policies (i.e., policies that are guaranteed to navigate the robot to the goal or to determine that no such path exists). We prove that the problem is NP-hard and provide a suboptimal but computationally efficient solution. This solution, based on mutual information, returns a complete policy and a bound on the gap between the policy's expected cost and the optimal cost. We test the performance of the policy and the lower bound against those of the optimal policy and explore the effects of errors in the robot's prior information on performance. Simulations are run on a flexible factory scenario to demonstrate the scalability of the proposed approach. Finally, we present a method to extend this solution to robots with faulty sensors.
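One small piece of machinery any such reactive policy needs is maintaining the distribution over candidate edge sets as edges are observed. A minimal Bayesian update over that distribution (illustrative scaffolding, not the paper's mutual-information algorithm) looks like:

```python
def update_belief(belief, edge, present):
    """Condition a distribution over candidate edge sets on observing
    that `edge` is present (True) or absent (False). Hypotheses
    inconsistent with the observation get zero mass; the remaining
    hypotheses are renormalized."""
    consistent = {es: p for es, p in belief.items()
                  if (edge in es) == present}
    z = sum(consistent.values())
    if z == 0.0:
        raise ValueError("observation has zero probability under the prior")
    return {es: p / z for es, p in consistent.items()}
```

Each observation at a vertex prunes inconsistent worlds, and an informative observation (one that splits the prior mass) is exactly what a mutual-information criterion would steer the robot toward.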


2014 ◽  
Vol 37 (2) ◽  
pp. 141-143
Author(s):  
A. Martínez-Abrain ◽  
D. Conesa ◽  
A. Forte ◽  
...  

Here we address the handling of prior information when performing statistical inference in ecology, both in model specification and selection and in parameter estimation. We compare the frequentist and Bayesian perspectives on this problem, including objective and subjective Bayesians. We show that making use of previous information and making a priori decisions is a reality not only for Bayesians but also for frequentists. However, the latter tend to overlook this because of the common difficulty of having prior information available on the magnitude of the effect that is thought to be biologically relevant. This prior information should be fed into a priori power tests when determining the sample sizes necessary to align statistical and biological significance. Ecologists should make a greater effort to use available prior information, because this is their most legitimate contribution to the inferential process. Parameter estimation and model selection would benefit, allowing a more reliable accumulation of knowledge, and hence progress, in the biological sciences.
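The a priori power test the authors call for can be sketched with the standard normal approximation for a two-sample comparison of means; the effect size fed in is exactly the prior information about the smallest biologically relevant effect:

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.8):
    """A priori power analysis (normal approximation): sample size per
    group for a two-sample comparison of means to detect a standardized
    effect of `effect_size` (Cohen's d) at two-sided significance level
    alpha with the given power."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_power = z.inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)
```

For example, detecting a medium effect (d = 0.5) at alpha = 0.05 with 80% power requires 63 individuals per group under this approximation (an exact t-test calculation gives a marginally larger n); halving the biologically relevant effect size quadruples the required sample.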


Author(s):  
Harikumar Iyer ◽  
Xiao Tang ◽  
Sundar Krishnamurty

Abstract This paper deals with two major issues that are central to the development and implementation of decision-based approaches to engineering design, namely, the treatment of constraints and accurate preference representation. A decision-analysis-based constraint handling technique is introduced that is particularly useful in constrained engineering problems with multiple attributes. Recognizing the iterative nature of engineering design, where design alternatives may not always be known a priori (unlike in traditional engineering design), this paper introduces the concept of knowledge-influenced attribute model building as a means to update preference representations so that they accurately reflect the designer's intent throughout the evolving design process. These two concepts are then used to extend the TRED (Trade-off based Robust Engineering Design) framework, which was developed as a formal design strategy integrating utility-theory-based multiattribute models into a Taguchi-style design-of-experiments setup. To find robust optimal solutions to constrained engineering design problems, the paper presents a second-order design space reduction technique based on Response Surface Methodology (RSM). Its application to engineering problems is illustrated through a simple case study, and the results are discussed.
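The response-surface step can be sketched generically (a plain second-order fit, not the paper's TRED formulation): fit a full quadratic surface in two design variables by least squares and locate its stationary point, which is the candidate around which the design space would be reduced.

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of a full second-order response surface in two
    design variables:
    y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2,
                         x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def stationary_point(coef):
    """Candidate optimum of the fitted surface: solve grad = 0, i.e.
    H @ x = -[b1, b2] with H the Hessian of the quadratic part."""
    b0, b1, b2, b11, b22, b12 = coef
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    return np.linalg.solve(H, -np.array([b1, b2]))
```

Whether the stationary point is a maximum, minimum, or saddle follows from the eigenvalues of `H`, which is what decides whether the reduced design space should be centered on it or should move along a ridge.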


2020 ◽  
Author(s):  
Hazhir Rahmandad ◽  
Michael Shayne Gary

With so many possible choices, why do managers adopt the strategies they do? We identify delays between adopting a strategy and observing the full implications of that choice as a critical factor influencing strategic choices. Using a simulation of a service firm, we conduct two behavioral experiments to investigate how delays interact with outcome uncertainty to shape learning, strategy adaptation, and performance outcomes. Two mechanisms emerge from how different subject groups perceive, react to, and learn in the presence of delayed feedback and uncertainty. First, when multiple viable strategies exist, longer delays lead both general participants and experienced managers toward alternatives with rapid returns. When those alternatives are suboptimal, delays may strengthen convergence to inefficient strategies. Second, delays and uncertainty may also induce learners to persist with their a priori strategies. Managers show greater confidence in their priors and thus underperform general participants when the underlying task structure diverges from those priors. Both mechanisms can undermine performance. Moreover, delays and uncertainty may reduce heterogeneity in strategies and performance in more dynamic, uncertain environments, leading to convergence as tasks grow more complex and when decision makers hold similar priors.

