Generalist–specialist trade-off during thermal acclimation

2015 ◽  
Vol 2 (1) ◽  
pp. 140251 ◽  
Author(s):  
Frank Seebacher ◽  
Valérie Ducret ◽  
Alexander G. Little ◽  
Bart Adriaenssens

The shape of performance curves and their plasticity define how individuals and populations respond to environmental variability. In theory, maximum performance decreases with an increase in performance breadth. However, reversible acclimation may counteract this generalist–specialist trade-off, because performance optima track environmental conditions so that there is no benefit to generalist phenotypes. We tested this hypothesis by acclimating individual mosquitofish (Gambusia holbrooki) to cool and warm temperatures consecutively and measuring curves of swimming performance after each acclimation treatment. Individuals from the same population differed significantly in performance maxima, performance breadth and the capacity for acclimation. As predicted, acclimation shifted the temperature at which maximal performance occurred. Within acclimation treatments, there was a significant generalist–specialist trade-off in responses to acute temperature change. Surprisingly, however, there was also a trade-off across acclimation treatments: animals with greater capacity for cold acclimation had lower performance maxima under warm conditions. Hence, cold acclimation may be viewed as a generalist strategy that extends performance breadth during the colder seasons but comes at the cost of reduced performance at the warmer time of year. Acclimation therefore does not counteract the generalist–specialist trade-off and, at least in mosquitofish, the trade-off appears to be a system property that persists despite phenotypic plasticity.
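The trade-off described above can be sketched with a standard Gaussian thermal performance curve in which the area under the curve is held constant, so a wider performance breadth necessarily lowers the peak. This is an illustrative model only, not the curve fitted in the study; all parameter values are hypothetical.

```python
import math

def performance(temp, t_opt, breadth, area=1.0):
    """Gaussian thermal performance curve: with the area under the curve
    held constant, a wider breadth lowers the peak (the trade-off)."""
    peak = area / (breadth * math.sqrt(2 * math.pi))
    return peak * math.exp(-((temp - t_opt) ** 2) / (2 * breadth ** 2))

# A specialist (narrow breadth) vs a generalist (wide breadth), same area.
specialist_peak = performance(25, t_opt=25, breadth=3)
generalist_peak = performance(25, t_opt=25, breadth=6)

# Acclimation shifts t_opt toward the new temperature without changing breadth.
cold_acclimated = performance(18, t_opt=18, breadth=3)

assert specialist_peak > generalist_peak  # wider curve, lower maximum
assert math.isclose(cold_acclimated, specialist_peak)  # optimum tracks environment
```

Under this constant-area assumption, a perfect acclimator would always match the specialist's peak at the current temperature; the study's finding is that real fish pay an additional cross-treatment cost that this simple picture omits.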

2020 ◽  
Vol 4 (02) ◽  
pp. 34-45
Author(s):  
Naufal Dzikri Afifi ◽  
Ika Arum Puspita ◽  
Mohammad Deni Akbar

The Shift to The Front II Komplek Sukamukti Banjaran project is one of the projects implemented by a telecommunications company. Like every project, it has a time limit specified in the contract. Project scheduling plays an important role in predicting both the cost and the duration of a project, and every project should be completed on or before the contractual deadline. Delays can be anticipated by accelerating the completion duration using the crashing method combined with linear programming; without linear programming, the crashing iterations would have to be repeated manually. The objective function in this scheduling is to minimize cost. This study aims to find a trade-off between the costs and the minimum time expected to complete this project. The acceleration of the duration was evaluated by adding 4, 3, 2, or 1 hour(s) of overtime work. The normal duration of the project is 35 days with a service fee of Rp. 52,335,690. From the results of the crashing analysis, the chosen alternative adds 1 hour of overtime, shortening the schedule to 34 days at a total service cost of Rp. 52,375,492. This acceleration affects the entire project: Shift to The Front II covers 33 different locations, and if all of them can be accelerated, the completion of the whole project will be effective.
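The core arithmetic of crashing can be sketched as follows. Each critical activity has a cost slope, (crash cost − normal cost) / (normal days − crash days), and crashing the cheapest slope first mirrors the iteration that the linear program automates. The activity names, durations, and costs below are hypothetical placeholders, not the project's actual schedule data.

```python
# Hypothetical critical-path activities (not the actual project data):
# name: (normal_days, normal_cost, crash_days, crash_cost) in rupiah
activities = {
    "pole install":  (10,  9_000_000, 8,  9_600_000),
    "cable pulling": (15, 20_000_000, 13, 20_500_000),
    "splicing":      (10, 12_000_000, 9, 12_450_000),
}

def cost_slope(a):
    """Extra cost incurred per day of duration saved."""
    normal_days, normal_cost, crash_days, crash_cost = a
    return (crash_cost - normal_cost) / (normal_days - crash_days)

# Assume all three activities are serial on the critical path, for simplicity.
normal_duration = sum(a[0] for a in activities.values())           # 35 days
cheapest = min(activities, key=lambda k: cost_slope(activities[k]))
crashed_duration = normal_duration - 1                             # crash by one day
extra_cost = cost_slope(activities[cheapest]) * 1                  # cost of that day
```

In the full problem, a linear program selects how many days to crash each activity subject to precedence constraints, which is what removes the manual iteration the abstract mentions.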


2020 ◽  
Vol 12 (7) ◽  
pp. 2767 ◽  
Author(s):  
Víctor Yepes ◽  
José V. Martí ◽  
José García

The optimization of cost and CO2 emissions in earth-retaining walls is relevant because these structures are widely used in civil engineering. Optimizing costs is essential for the competitiveness of the construction company, while optimizing emissions reduces the environmental impact of construction. To address the optimization, a black hole metaheuristic was used, along with a discretization mechanism based on min–max normalization. The stability of the algorithm was evaluated with respect to the solutions obtained, and the steel and concrete quantities obtained in both optimizations were analyzed. Additionally, the geometric variables of the structure were compared, and the results were compared with those of another algorithm applied to the same problem. The results show that there is a trade-off between the use of steel and concrete: the solutions that minimize CO2 emissions favor concrete over steel compared with those that minimize cost. When comparing the geometric variables, most remain similar in both optimizations except for the distance between buttresses. The comparison with the other algorithm shows that the black hole algorithm performs well in this optimization.
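The discretization step can be sketched in isolation: a continuous value produced by the black-hole update is min–max normalized to [0, 1] and mapped onto a catalogue of discrete design options. This is an assumed form of the operator, not the authors' exact mechanism, and the candidate spacings are hypothetical.

```python
import random

random.seed(0)

options = [2.0, 2.5, 3.0, 3.5, 4.0]  # hypothetical discrete buttress spacings (m)

def discretize(value, lo, hi, options):
    """Min-max normalize value from [lo, hi] and pick the nearest option."""
    t = (value - lo) / (hi - lo)            # normalized to [0, 1]
    t = min(max(t, 0.0), 1.0)               # clamp in case of overshoot
    index = round(t * (len(options) - 1))   # map onto option indices
    return options[index]

# One black-hole-style move: a "star" drifts toward the best solution found.
best, star = 3.1, 2.2
new_star = star + random.random() * (best - star)
choice = discretize(new_star, lo=2.0, hi=4.0, options=options)
assert choice in options
```

Keeping the search continuous and discretizing only when evaluating a design lets a continuous metaheuristic like the black hole algorithm handle catalogue-constrained structural variables.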


Author(s):  
Vincent E. Castillo ◽  
John E. Bell ◽  
Diane A. Mollenkopf ◽  
Theodore P. Stank

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jeonghyuk Park ◽  
Yul Ri Chung ◽  
Seo Taek Kong ◽  
Yeong Won Kim ◽  
Hyunho Park ◽  
...  

There have been substantial efforts in using deep learning (DL) to diagnose cancer from digital images of pathology slides. Existing algorithms typically train deep neural networks either specialized in specific cohorts or on an aggregate of all cohorts when only a few images are available for the target cohort. A trade-off between decreasing the number of models and their cancer detection performance was evident in our experiments with The Cancer Genome Atlas dataset, with the cohort-specific approach achieving higher performance at the cost of having to acquire large datasets from the cohort of interest. Constructing annotated datasets for individual cohorts is extremely time-consuming, and the acquisition cost of such datasets grows linearly with the number of cohorts. Another issue with developing cohort-specific models is the difficulty of maintenance: all cohort-specific models may need to be adjusted when a new DL algorithm is adopted, where training even a single model may require a non-negligible amount of computation, or when more data is added to some cohorts. To resolve the sub-optimal behavior of a universal cancer detection model trained on an aggregate of cohorts, we investigated how cohorts can be grouped to augment a dataset without increasing the number of models linearly with the number of cohorts. This study introduces several metrics that measure the morphological similarities between cohort pairs and demonstrates how these metrics can be used to control the trade-off between performance and the number of models.
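The grouping idea can be sketched with a simple single-linkage rule: cohort pairs whose similarity exceeds a threshold share one model, and lowering the threshold trades detection performance for fewer models. The cohort names, similarity values, and the linkage rule are all illustrative assumptions; the paper's actual metrics are morphological.

```python
# Hypothetical pairwise similarity scores between cancer cohorts.
similarity = {
    ("lung", "colon"): 0.82,
    ("lung", "stomach"): 0.55,
    ("colon", "stomach"): 0.79,
    ("lung", "kidney"): 0.30,
    ("colon", "kidney"): 0.28,
    ("stomach", "kidney"): 0.25,
}
cohorts = ["lung", "colon", "stomach", "kidney"]

def group_cohorts(threshold):
    """Single-linkage grouping: merge any two groups joined by a cohort
    pair whose similarity meets the threshold. One model per group."""
    groups = [{c} for c in cohorts]
    for (a, b), s in similarity.items():
        if s >= threshold:
            ga = next(g for g in groups if a in g)
            gb = next(g for g in groups if b in g)
            if ga is not gb:
                ga |= gb
                groups.remove(gb)
    return groups

assert len(group_cohorts(0.9)) == 4  # strict: one model per cohort
assert len(group_cohorts(0.7)) == 2  # lung + colon + stomach share a model
assert len(group_cohorts(0.2)) == 1  # lax: one universal model
```

The threshold is the knob controlling the trade-off the abstract describes: the number of models falls as morphologically similar cohorts are pooled into shared training sets.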


2020 ◽  
Vol 15 (1) ◽  
pp. 4-17
Author(s):  
Jean-François Biasse ◽  
Xavier Bonnetain ◽  
Benjamin Pring ◽  
André Schrottenloher ◽  
William Youmans

We propose a heuristic algorithm to solve the underlying hard problem of the CSIDH cryptosystem (and other isogeny-based cryptosystems using elliptic curves with endomorphism ring isomorphic to an imaginary quadratic order 𝒪). Let Δ = Disc(𝒪) (in CSIDH, Δ = −4p for p the security parameter) and let 0 < α < 1/2. Our algorithm requires:

- a classical circuit of size $2^{\tilde{O}\left(\log(|\Delta|)^{1-\alpha}\right)}$;
- a quantum circuit of size $2^{\tilde{O}\left(\log(|\Delta|)^{\alpha}\right)}$;
- polynomial classical and quantum memory.

Essentially, we propose to reduce the size of the quantum circuit below the state-of-the-art complexity $2^{\tilde{O}\left(\log(|\Delta|)^{1/2}\right)}$ at the cost of increasing the required classical circuit size. The required classical circuit remains subexponential, which is a superpolynomial improvement over the classical state-of-the-art exponential solutions to these problems. Our method requires polynomial memory, both classical and quantum.
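The trade-off controlled by α can be illustrated numerically. Writing n = log|Δ|, the classical circuit has size roughly 2^(n^(1−α)) and the quantum circuit roughly 2^(n^α); the sketch below compares the exponents for two values of α. The value of n is an assumed example, and the Õ(·) hides polylog factors that this sketch ignores.

```python
import math

n = 512  # assumed log|Δ| for a CSIDH-512-sized discriminant (illustrative)

def exponents(alpha):
    """(classical, quantum) circuit-size exponents: 2^classical, 2^quantum."""
    return n ** (1 - alpha), n ** alpha

balanced = exponents(0.5)   # prior state of the art: both ~ 2^(n^(1/2))
skewed = exponents(0.25)    # smaller quantum circuit, larger classical one

assert math.isclose(balanced[0], balanced[1])  # n^0.5 on both sides
assert skewed[1] < balanced[1] < skewed[0]     # quantum shrinks, classical grows
```

Since n^(1−α) grows slower than n for every α > 0, the classical side stays subexponential in n even as work is shifted off the quantum circuit, which is the improvement the abstract highlights.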


Author(s):  
Henk Ernst Blok ◽  
Djoerd Hiemstra ◽  
Sunil Choenni ◽  
Franciska de Jong ◽  
Henk M. Blanken ◽  
...  

2016 ◽  
Vol 3 (10) ◽  
pp. 160406 ◽  
Author(s):  
Gil Iosilevskii ◽  
Yannis P. Papastamatiou

Sharks have a distinctive shape that has remained practically unchanged through hundreds of millions of years of evolution. Nonetheless, this shape varies between and within species. We attempt to explain these variations by examining the partial derivatives of the cost of transport of a generic shark with respect to buoyancy, span and chord of its pectoral fins, length, girth and body temperature. Our analysis predicts an intricate relation between these parameters, suggesting that ectothermic species residing in cooler temperatures must have longer pectoral fins, be more buoyant, or both, in order to maintain swimming performance. It also suggests that, in general, buoyancy must increase with size, and therefore there must be ontogenetic changes within a species, with individuals becoming more buoyant as they grow. Pelagic species seem to have near-optimally sized fins (which minimize the cost of transport), but the majority of reef sharks could have reduced their cost of transport by increasing the size of their fins. The fact that they do not implies negative selection, probably owing to decreased manoeuvrability in confined spaces (e.g. foraging on a reef).
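The sensitivity analysis itself, taking partial derivatives of a cost-of-transport function, can be sketched with central finite differences. The cost-of-transport model below is a stand-in placeholder with plausible signs, not the paper's hydrodynamic model, and all parameter values are hypothetical.

```python
def cost_of_transport(buoyancy, fin_span, temperature):
    """Placeholder model: cost falls with buoyancy and fin span (less dynamic
    lift needed) and rises for an ectotherm in colder water."""
    return 1.0 / (buoyancy * fin_span) + 0.02 * (28.0 - temperature)

def partial(f, args, i, h=1e-6):
    """Central-difference partial derivative of f w.r.t. argument i."""
    up = list(args); up[i] += h
    dn = list(args); dn[i] -= h
    return (f(*up) - f(*dn)) / (2 * h)

point = (1.0, 0.8, 22.0)  # hypothetical buoyancy, fin span (m), temperature (°C)
d_span = partial(cost_of_transport, point, 1)
d_temp = partial(cost_of_transport, point, 2)

assert d_span < 0  # longer fins reduce transport cost at this point
assert d_temp < 0  # warmer water reduces cost for the ectotherm
```

The signs of such partial derivatives at a given operating point are what let the authors argue, for example, that cooler-water ectotherms need longer fins or greater buoyancy to hold performance constant.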


Author(s):  
YAODONG NI ◽  
QIAONI SHI

In this paper, we study the problem of targeting a set of individuals to trigger a cascade of influence in a social network such that the influence diffuses to all individuals in the minimum time, given that the cost of initially influencing each individual is random and the available budget is limited. We adopt the incremental chance model to characterize the diffusion of influence, and propose three stochastic programming models that correspond to three different decision criteria. A modified greedy algorithm is presented to solve the proposed models, which can flexibly trade off between solution performance and computational complexity. Experiments on random graphs show that the presented algorithm is effective.
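A budgeted greedy seed selection of the kind the abstract describes can be sketched on a deterministic toy network: each candidate has an expected activation cost, and we greedily add the seed that most reduces the number of rounds needed for influence to reach every node, while the budget allows. The graph, costs, and deterministic round-based diffusion are all illustrative assumptions, not the paper's incremental chance model.

```python
from collections import deque

graph = {  # hypothetical undirected social network (adjacency lists)
    "a": ["b", "c"], "b": ["a", "d"], "c": ["a"],
    "d": ["b", "e"], "e": ["d", "f"], "f": ["e"],
}
expected_cost = {"a": 2.0, "b": 1.5, "c": 1.0, "d": 1.5, "e": 1.0, "f": 1.0}

def diffusion_time(seeds):
    """Rounds (BFS depth) until all nodes are influenced; inf if unreachable."""
    dist = {s: 0 for s in seeds}
    q = deque(seeds)
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values()) if len(dist) == len(graph) else float("inf")

def greedy_seeds(budget):
    """Add, while affordable, the seed that most reduces diffusion time."""
    seeds, spent = set(), 0.0
    while True:
        best = None
        best_time = diffusion_time(seeds) if seeds else float("inf")
        for v in graph:
            if v in seeds or spent + expected_cost[v] > budget:
                continue
            t = diffusion_time(seeds | {v})
            if t < best_time:
                best, best_time = v, t
        if best is None:
            return seeds, diffusion_time(seeds) if seeds else float("inf")
        seeds.add(best)
        spent += expected_cost[best]

seeds, time_taken = greedy_seeds(budget=3.0)
```

With a budget of 3.0 this toy instance selects the two central nodes, reaching the whole network in two rounds; under random costs, the stochastic programming models replace the fixed costs with criteria over their distributions.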

