Environment-coupled models of leaf metabolism

Author(s):  
Nadine Töpfer

The plant leaf is the main site of photosynthesis. This process converts light energy and inorganic nutrients into chemical energy and organic building blocks for the biosynthesis and maintenance of cellular components and for the growth of the rest of the plant. The leaf is also the site of gas–water exchange and, owing to its large surface, is particularly vulnerable to pathogen attack. The leaf's performance and metabolic modes are therefore inherently determined by its interaction with the environment. Mathematical models of plant metabolism have been successfully applied to study various aspects of photosynthesis and of carbon and nitrogen assimilation and metabolism; they have helped suggest metabolic intervention strategies for optimized leaf performance and have provided insights into the evolutionary drivers of plant metabolism in various environments. With increasing pressure to improve agricultural performance in current and future climates, these models have become important tools for understanding plant–environment interactions and for supporting plant breeders' efforts. This overview article reviews applications of large-scale metabolic models of leaf metabolism to the study of plant–environment interactions by means of flux-balance analysis. The presented studies are organized along two axes: by how the environmental interactions are modelled (via external constraints or data integration) and by the type of interaction studied (abiotic or biotic).
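Flux-balance analysis of the kind reviewed here reduces to a linear program: maximize an objective flux subject to steady-state mass balance and bounds that encode the environment. A minimal sketch with a hypothetical three-reaction toy network (not any model from the reviewed studies), where an uptake bound plays the role of an environmental constraint:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network, for illustration only:
#   R1: A_ext -> A   (uptake, capped by the environment)
#   R2: A -> B
#   R3: B -> biomass (objective flux)
# Rows: internal metabolites A, B; columns: reactions R1..R3.
S = np.array([
    [1, -1,  0],   # A: produced by R1, consumed by R2
    [0,  1, -1],   # B: produced by R2, consumed by R3
])

# Environmental interaction as an external constraint:
# uptake limited to 10 units (e.g. light or nutrient availability).
bounds = [(0, 10), (0, None), (0, None)]

# Maximize biomass flux v3; linprog minimizes, so negate the objective.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # optimal flux distribution at steady state
```

With the uptake cap at 10, mass balance forces all three fluxes to 10; tightening the bound models a poorer environment.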

2021 ◽  
Vol 22 (11) ◽  
pp. 5793
Author(s):  
Brianna M. Quinville ◽  
Natalie M. Deschenes ◽  
Alex E. Ryckman ◽  
Jagdeep S. Walia

Sphingolipids are a specialized group of lipids essential to the composition of the plasma membrane of many cell types; however, they are primarily localized within the nervous system. The amphipathic properties of sphingolipids enable their participation in a variety of intricate metabolic pathways. Sphingoid bases are the building blocks for all sphingolipid derivatives, comprising a complex class of lipids. The biosynthesis and catabolism of these lipids play an integral role in small- and large-scale body functions, including participation in membrane domains and signalling; cell proliferation, death, migration, and invasiveness; inflammation; and central nervous system development. Recently, sphingolipids have become the focus of several fields of research in the medical and biological sciences, as these bioactive lipids have been identified as potent signalling and messenger molecules. Sphingolipids are now being exploited as therapeutic targets for several pathologies. Here we present a comprehensive review of the structure and metabolism of sphingolipids and their many functional roles within the cell. In addition, we highlight the role of sphingolipids in several pathologies, including inflammatory disease, cystic fibrosis, cancer, Alzheimer’s and Parkinson’s disease, and lysosomal storage disorders.


2017 ◽  
Vol 21 (1) ◽  
pp. 117-132 ◽  
Author(s):  
Jannis M. Hoch ◽  
Arjen V. Haag ◽  
Arthur van Dam ◽  
Hessel C. Winsemius ◽  
Ludovicus P. H. van Beek ◽  
...  

Abstract. Large-scale flood events often show spatial correlation in neighbouring basins, and thus can affect adjacent basins simultaneously, as well as result in superposition of different flood peaks. Such flood events therefore need to be addressed with large-scale modelling approaches to capture these processes. Many approaches currently in place are based on either a hydrologic or a hydrodynamic model. However, the resulting lack of interaction between hydrology and hydrodynamics, for instance via groundwater infiltration on inundated floodplains, can hamper modelled inundation and discharge results where such interactions are important. In this study, the global hydrologic model PCR-GLOBWB at 30 arcmin spatial resolution was one-directionally and spatially coupled with the hydrodynamic model Delft 3D Flexible Mesh (FM) for the Amazon River basin on a grid-by-grid basis and at a daily time step. The use of a flexible unstructured mesh allows for fine-scale representation of channels and floodplains, while preserving a coarser spatial resolution for less flood-prone areas, thus not unnecessarily increasing computational costs. In addition, we assessed the difference between a 1-D channel/2-D floodplain and a 2-D schematization in Delft 3D FM. Validating modelled discharge results shows that coupling PCR-GLOBWB to a hydrodynamic routing scheme generally increases model performance compared to using a hydrodynamic or hydrologic model alone, for all validation parameters applied. Closer examination shows that the 1-D/2-D schematization outperforms 2-D for r2 and root mean square error (RMSE) whilst having a lower Kling–Gupta efficiency (KGE). We also found that spatial coupling has the significant advantage of a better representation of inundation at smaller streams throughout the model domain.
A validation of simulated inundation extent revealed that only those set-ups incorporating 1-D channels are capable of representing inundation for reaches below the spatial resolution of the 2-D mesh. Implementing 1-D channels is therefore particularly advantageous for large-scale inundation models, as such models are often built upon remotely sensed surface elevation data, which often contain a strong vertical bias that hampers downstream connectivity. Since only a one-directional coupling approach was tested, and important feedback processes are therefore not incorporated, simulated discharge and inundation extent are generally overpredicted for both coupled set-ups. The next step is hence to extend the approach to a two-directional coupling scheme to obtain a closed feedback loop between hydrologic and hydrodynamic processes. The current findings, demonstrating the potential of one-directionally and spatially coupled models to obtain improved discharge estimates, form an important step towards a large-scale inundation model with full dynamic coupling between hydrology and hydrodynamics.
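The validation metrics used above (r2, RMSE, KGE) are standard. For reference, the Kling–Gupta efficiency of Gupta et al. (2009) and the RMSE can be computed as follows; this is a generic implementation, not the authors' code:

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta efficiency (Gupta et al., 2009); 1 is a perfect fit."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]      # linear correlation
    alpha = sim.std() / obs.std()        # variability ratio
    beta = sim.mean() / obs.mean()       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def rmse(sim, obs):
    """Root mean square error between simulated and observed series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return np.sqrt(np.mean((sim - obs) ** 2))
```

Unlike RMSE, KGE separates correlation, variability, and bias errors, which is why the two metrics can rank the 1-D/2-D and 2-D schematizations differently.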


2006 ◽  
Vol 19 (17) ◽  
pp. 4344-4359 ◽  
Author(s):  
Markus Stowasser ◽  
Kevin Hamilton

Abstract The relations between local monthly mean shortwave cloud radiative forcing (SWCRF) and aspects of the resolved-scale meteorological fields are investigated in hindcast simulations performed with 12 of the global coupled models included in the model intercomparison conducted in preparation for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4). In particular, the connection of the cloud forcing over tropical and subtropical ocean areas with resolved midtropospheric vertical velocity and with lower-level relative humidity is investigated and compared among the models. The model results are also compared with observational determinations of the same relationships, using satellite data for the cloud forcing and global reanalysis products for the vertical velocity and humidity fields. In the analysis, the geographical variability in the long-term mean among all grid points and the interannual variability of the monthly mean at each grid point are considered separately. The SWCRF response plays a crucial role in determining the predicted response to large-scale climate forcing (such as from increased greenhouse gas concentrations), and it is thus important to test how the cloud representations in current climate models respond to unforced variability. Overall, there is considerable variation among the results for the various models, and all models show some substantial differences from the comparable observed results. The most notable deficiency is a weak representation of the cloud radiative response to variations in vertical velocity in cases of strong ascending or strong descending motion. While the models generally perform better in regimes with only modest upward or downward motion, even in these regimes there is considerable variation among the models in the dependence of SWCRF on vertical velocity.
The largest differences between models and observations, when SWCRF values are stratified by relative humidity, are found in either very moist or very dry regimes. The largest errors in the model simulations of cloud forcing are thus likely to occur in the western Pacific warm pool area, which is characterized by very moist, strongly ascending air, and in the rather dry regions where the flow is dominated by descending mean motion.
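The stratification described above is a regime-sorting (binning and compositing) analysis. A generic sketch with hypothetical array names, assuming flat arrays of monthly-mean grid-point values:

```python
import numpy as np

def composite_by_regime(swcrf, omega, bin_edges):
    """Mean SWCRF within each vertical-velocity (omega) bin.

    Hypothetical helper illustrating regime sorting: swcrf and omega
    are flat arrays of monthly-mean grid-point values; by convention,
    negative pressure velocity (omega) is ascent, positive is descent.
    """
    swcrf, omega = np.asarray(swcrf, float), np.asarray(omega, float)
    means = np.full(len(bin_edges) - 1, np.nan)
    for i in range(len(bin_edges) - 1):
        mask = (omega >= bin_edges[i]) & (omega < bin_edges[i + 1])
        if mask.any():
            means[i] = swcrf[mask].mean()
    return means
```

The same helper applied with relative-humidity edges instead of omega edges yields the humidity-stratified composites.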


2021 ◽  
Author(s):  
Kor de Jong ◽  
Marc van Kreveld ◽  
Debabrata Panja ◽  
Oliver Schmitz ◽  
Derek Karssenberg

Data availability at global scale is increasing exponentially. Although considerable challenges remain regarding the identification of model structure and parameters of continental-scale hydrological models, we will soon reach the situation that global-scale models can be defined at very high resolutions, close to 100 m or less. One of the key challenges is how to make simulations of these ultra-high-resolution models tractable [1].

Our research contributes the development of a model building framework that is specifically designed to distribute calculations over multiple cluster nodes. This framework enables domain experts like hydrologists to develop their own large-scale models, using a scripting language like Python, without the need to acquire the skills to develop low-level computer code for parallel and distributed computing.

We present the design and implementation of this software framework and illustrate its use with a prototype 100 m, 1 h continental-scale hydrological model. Our modelling framework ensures that any model built with it is parallelized. This is made possible by providing the model builder with a set of model building blocks, coded in such a manner that parallelization of calculations occurs within and across these building blocks, for any combination of building blocks. There is thus full flexibility on the side of the modeller, without losing performance.

This breakthrough is made possible by a novel approach to the implementation of the model building framework, called asynchronous many-tasks, provided by the HPX C++ software library [3]. The code in the model building framework expresses spatial operations as large collections of interdependent tasks that can be executed efficiently on individual laptops as well as computer clusters [2].

Our framework currently includes the most essential operations for building large-scale hydrological models, including those for simulating transport of material through a flow direction network. By combining these operations, we rebuilt an existing 100 m, 1 h resolution model, thus far used for simulations of small catchments; this required limited coding, as we only had to replace the computational back end of the existing model. Runs at continental scale on a computer cluster show acceptable strong and weak scaling, providing a strong indication that global simulations at this resolution will soon be possible, technically speaking.

Future work will focus on extending the set of modelling operations and adding scalable I/O, after which existing models that are currently limited in their ability to use the available computational resources can be ported to this new environment.

More information about our modelling framework is at https://lue.computationalgeography.org.

References

[1] M. Bierkens. Global hydrology 2015: State, trends, and directions. Water Resources Research, 51(7):4923–4947, 2015.
[2] K. de Jong, et al. An environmental modelling framework based on asynchronous many-tasks: scalability and usability. Submitted.
[3] H. Kaiser, et al. HPX - The C++ standard library for parallelism and concurrency. Journal of Open Source Software, 5(53):2352, 2020.
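One operation mentioned above, transport of material through a flow direction network, can be illustrated with a minimal serial flow-accumulation sketch. This is not the framework's implementation (which executes such operations as parallel task graphs via HPX); it only shows the computation the operation performs:

```python
from collections import deque

def flow_accumulation(downstream, material):
    """Accumulate material along a flow direction network.

    downstream[i] is the index of the cell that cell i drains into
    (-1 marks an outlet). Cells are processed in topological order
    using in-degrees, so each cell's total is final before it is
    passed downstream.
    """
    n = len(downstream)
    acc = list(material)
    indegree = [0] * n
    for d in downstream:
        if d >= 0:
            indegree[d] += 1
    queue = deque(i for i in range(n) if indegree[i] == 0)
    while queue:
        i = queue.popleft()
        d = downstream[i]
        if d >= 0:
            acc[d] += acc[i]       # pass accumulated material downstream
            indegree[d] -= 1
            if indegree[d] == 0:
                queue.append(d)
    return acc

# Three cells draining in a chain: 0 -> 1 -> 2 (outlet).
print(flow_accumulation([1, 2, -1], [1.0, 1.0, 1.0]))  # [1.0, 2.0, 3.0]
```

The dependency structure visible here (a cell waits only on its upstream neighbours) is what makes such operations amenable to the asynchronous many-tasks execution model.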


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Steve Noone ◽  
Alison Branch ◽  
Melissa Sherring

Purpose
Positive behavioural support (PBS) as a framework for delivering quality services is recognised in important policy documents (CQC, 2020; NICE, 2018), yet the literature says little about how it could be implemented on a large scale. The purpose of this paper is to describe a recent implementation of a workforce strategy to develop PBS across social care and health staff and family carers, within the footprint of a large integrated care system.

Design/methodology/approach
A logic model describes how an initial scoping exercise led to the production of a regional workforce strategy based on the PBS Competence Framework (2015). It shows how the creation of a regional steering group was able to coordinate important developmental stages and integrate multiple agencies into a single strategy to implement teaching and education in PBS. It describes the number of people who received teaching and education in PBS and the regional impact of the project in promoting cultural change within services.

Findings
This paper demonstrates a proof of concept: it is possible to translate the PBS Competence Framework (2015) into accredited courses. Initial scoping work highlighted the ineffectiveness of traditional training in PBS. Using blended learning and competency-based supervision and assessment, it was possible to create a new way to promote large-scale service developments in PBS, supported by the governance of a new organisational structure. This also included family training delivered by family trainers. This builds on the ideas of Denne et al. (2020) that many of the necessary building blocks of implementation already exist within a system.

Social implications
A co-ordinated teaching and education strategy in PBS may help a wide range of carers to become more effective in supporting the people they care for.

Originality/value
This is the first attempt to describe the implementation of a framework for PBS within a defined geographical location. It describes the collaboration of health and social care planners and a local university to create a suite of courses built around the PBS coalition competence framework.


Author(s):  
S. E. Ubi

The use of polystyrene beads in concrete applications has been limited due to their perceived low strength properties. The tensile strength test is an important test that determines the vulnerability of concrete to tensile cracking under structural load. Water, sand, coarse aggregates, expanded polystyrene beads, and ordinary Portland cement are the materials used in this study. All materials were batched by weight, except for polystyrene and coarse aggregates, which were batched by volume after being mixed together. The polystyrene partial replacement level was set at 12% of the coarse aggregate volume. The model equation adopted for this study was based on Scheffe's {4, 2} simplex lattice design for both the pseudo-component and component-proportion models. The actual model was developed from the 28th-day test results. MATLAB and Minitab 16 software were used to generate the actual mix ratios. The results showed that the pseudo-component and component-proportion models both produced an average split tensile strength of about 5.10 N/mm2, which varied between 18% and 19% of the corresponding compressive strength. This indicates that the materials and mix ratios optimized in this study are suitable as building blocks for low-rise residential buildings and as partition slabs for high-rise buildings. The lightweight property makes the material highly suitable for large-scale application in high-rise structures as internal partition slabs only.
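For reference, the design points of a Scheffe {q, m} simplex lattice (here {4, 2}, giving 10 candidate mixture ratios) can be enumerated as follows. This is a generic sketch of the design, not the authors' MATLAB/Minitab workflow:

```python
from itertools import combinations_with_replacement

def simplex_lattice(q, m):
    """Scheffe {q, m} simplex-lattice design points.

    Each mixture proportion takes values 0, 1/m, ..., 1 and the
    proportions of the q components sum to 1.
    """
    points = set()
    # Distribute m equal "unit shares" of size 1/m over the q components.
    for combo in combinations_with_replacement(range(q), m):
        point = [0.0] * q
        for idx in combo:
            point[idx] += 1.0 / m
        points.add(tuple(point))
    return sorted(points)

design = simplex_lattice(4, 2)
print(len(design))  # 10 design points for a {4, 2} lattice
```

The 10 points comprise the 4 pure components and the 6 binary 50/50 blends, which is exactly the number of coefficients in the {4, 2} second-degree mixture polynomial.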


1990 ◽  
Vol 68 (9) ◽  
pp. 799-807
Author(s):  
Joseph Silk

Ever since the epoch of the spontaneous breaking of grand unification symmetry between the nuclear and electromagnetic interactions, the universe has expanded under the imprint of a spectrum of density fluctuations that is generally considered to have originated in this phase transition. I will discuss various possibilities for the form of the primordial fluctuation spectrum, spanning the range of adiabatic fluctuations, isocurvature fluctuations, and cosmic strings. Growth of the seed fluctuations by gravitational instability generates the formation of large-scale structures, from the scale of galaxies to that of clusters and superclusters of galaxies. There are three areas of confrontation with observational cosmology that will be reviewed. The large-scale distribution of the galaxies, including the apparent voids, sheets and filaments, and the coherent peculiar velocity field on scales of several tens of megaparsecs, probe the primordial fluctuation spectrum on scales that are only mildly nonlinear. Even larger scales are probed by study of the anisotropy of the cosmic microwave background radiation, which provides a direct glimpse of the primordial fluctuations that existed about 10^6 years or so after the initial big bang singularity. Galaxy formation is the process by which the building blocks of the universe have formed, involving a complex interaction between hydrodynamical and dynamical processes in a collapsing gas cloud. Both by detection of forming galaxies in the most remote regions of the universe and by study of the fundamental morphological characteristics of galaxies, which provide a fossilized memory of their past, can one relate the origin of galaxies to the same primordial fluctuation spectrum that gave rise to the large-scale structure of the universe.


2018 ◽  
Vol 31 (8) ◽  
pp. 3249-3264 ◽  
Author(s):  
Michael P. Byrne ◽  
Tapio Schneider

Abstract The regional climate response to radiative forcing is largely controlled by changes in the atmospheric circulation. It has been suggested that global climate sensitivity also depends on the circulation response, an effect called the "atmospheric dynamics feedback." Using a technique to isolate the influence of changes in atmospheric circulation on top-of-the-atmosphere radiation, the authors calculate the atmospheric dynamics feedback in coupled climate models. Large-scale circulation changes contribute substantially to all-sky and cloud feedbacks in the tropics but are relatively less important at higher latitudes. Globally averaged, the atmospheric dynamics feedback is positive and amplifies the near-surface temperature response to climate change by an average of 8% in simulations with coupled models. A constraint related to the atmospheric mass budget results in the dynamics feedback being small on large scales relative to feedbacks associated with thermodynamic processes. Idealized-forcing simulations suggest that circulation changes at high latitudes are potentially more effective at influencing global temperature than circulation changes at low latitudes, and the implications for past and future climate change are discussed.


2021 ◽  
Author(s):  
Kristian Mikalsen

Abstract This paper demonstrates a pioneering technology adaptation: a membrane-based subsea storage solution for oil/condensate, modified to store clean energy in the form of ammonia (as a hydrogen energy carrier). The immediate application provides an economical alternative to the electrification of offshore platforms via expensive cables from shore. Storing ammonia at the seabed using innovative subsea storage technologies will dramatically reduce CO2 emissions for offshore assets. The fluid will be stored in a safe manner on the seafloor, protecting both personnel and marine life. The next step will be to include subsea ammonia storage as part of the global logistical value chain, which can power the merchant shipping fleet. Clean ammonia can be produced using renewable resources such as wind or solar. The paper focuses on bridging the ongoing oil/condensate storage qualification to ammonia storage. The large-scale verification test scope is explained, and we show how the test is extended to also prove the concept of safe energy/ammonia storage. The ammonia storage concept is explained, and we show how it can become part of a low-carbon future. The focus is the immediate market for providing clean power to existing or new offshore assets. The full system solution will encompass storage tanks placed near the platforms at safe water depths, riser systems providing fuel to the ammonia power generators, and the tank filling systems. Bridging and adapting technologies from the petroleum industry into renewables shows the importance of utilizing the technology developments and competence of the oil and gas business. The technical evaluations have shown that the oil/condensate storage can be adapted to store energy/ammonia with minor modifications. Converting hydrogen into ammonia incurs slight energy losses, but these are justified by the large economic benefits of storing ammonia versus pressure storage of hydrogen.
The paper presents the qualification work already completed and how to implement ammonia fuel storage for platforms. In addition, we show the test setup for a large-scale qualification provided by an original equipment manufacturer (OEM) company together with major operators. Innovative modular design methods have shown that the concept can be included on existing offshore assets, which have limited topside space available. Adding green or blue ammonia as an alternative to power cables from shore has several benefits, and many of the connecting building blocks are falling into place. The main conclusion is how to adapt novel technologies from the oil industry to store ammonia safely on the seafloor.
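The energy trade-off mentioned can be sketched with standard lower-heating-value figures from the general literature; these are illustrative assumptions, not numbers from the paper:

```python
# Back-of-the-envelope comparison with standard literature figures
# (illustrative assumptions, not values from the paper).
LHV_H2 = 120.0    # lower heating value of hydrogen, MJ/kg
LHV_NH3 = 18.6    # lower heating value of ammonia, MJ/kg

# Mass fraction of hydrogen in ammonia (NH3): 3 H per N.
H2_MASS_FRACTION = 3 * 1.008 / (14.007 + 3 * 1.008)   # about 0.178

energy_nh3 = LHV_NH3                          # energy per kg of NH3 as fuel
energy_h2_content = H2_MASS_FRACTION * LHV_H2 # energy of the H2 it carries

loss = 1 - energy_nh3 / energy_h2_content
print(f"Heating-value penalty of the ammonia carrier: {loss:.0%}")
```

The roughly 10 to 15% heating-value penalty is the "slight energy loss" the abstract refers to; the offsetting benefit is that ammonia stores as a liquid at modest pressure rather than requiring high-pressure hydrogen tanks.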


Author(s):  
Vinícius Veloso de Melo ◽  
Danilo Vasconcellos Vargas ◽  
Marcio Kassouf Crocomo

This paper presents a new technique for optimizing binary problems with building blocks. The authors have developed an approach that differs from existing Estimation of Distribution Algorithms (EDAs). The technique, called Phylogenetic Differential Evolution (PhyDE), combines the Phylogenetic Algorithm with the Differential Evolution algorithm. The first is employed to identify the building blocks and to generate metavariables; the second is used to find the best instance of each metavariable. In contrast to existing EDAs, which identify the related variables at each iteration, the presented technique finds the related variables only once, at the beginning of the algorithm, rather than throughout the generations. This paper shows that the proposed technique is more efficient than the well-known EDA called the Extended Compact Genetic Algorithm (ECGA), especially for large-scale problems, which are commonly found in the real world.
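The differential evolution component can be illustrated with a minimal DE/rand/1/bin sketch on a continuous test function. The phylogenetic grouping of variables into metavariables, which is the paper's contribution, is omitted here; this is the generic algorithm only:

```python
import random

def differential_evolution(f, dim, bounds, pop_size=20, F=0.8, CR=0.9,
                           iters=200, seed=1):
    """Minimal DE/rand/1/bin: mutate with a scaled difference of two
    random individuals, crossover with the target, keep the better one."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)   # at least one mutated coordinate
            trial = [
                min(hi, max(lo, pop[a][k] + F * (pop[b][k] - pop[c][k])))
                if (rng.random() < CR or k == j_rand) else pop[i][k]
                for k in range(dim)
            ]
            ft = f(trial)
            if ft <= fit[i]:              # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Sphere function: optimum value 0 at the origin.
x, fx = differential_evolution(lambda v: sum(t * t for t in v),
                               dim=5, bounds=(-5, 5))
```

In PhyDE, the vector being evolved would index instances of each metavariable rather than raw continuous coordinates, but the mutation/crossover/selection loop is the same.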

