Adaptive Model Building Framework for Production Planning in the Primary Wood Industry

Forests ◽  
2020 ◽  
Vol 11 (12) ◽  
pp. 1256
Author(s):  
Matthias Kaltenbrunner ◽  
Maria Anna Huka ◽  
Manfred Gronalt

Production planning models for the primary wood industry have been proposed for several decades. However, most research to date has concentrated on individual cases. This paper presents an integrated adaptive modelling framework that combines the proposed approaches and identifies evolving planning situations. With this conceptual modelling approach, a wide range of planning issues can be addressed on a solid model basis. A planning grid along the time and resource dimensions is developed, and four illustrative, interdependent application cases are described. The respective mathematical programming models are also presented, and the prerequisites for industrial implementation are shown.
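The abstract does not reproduce the models themselves. As a minimal, hypothetical sketch of the kind of mathematical programming model such a framework instantiates, the following formulates a toy sawmill plan over a time and resource grid; all product names and coefficients are invented for illustration, not taken from the paper.

```python
# Toy production planning LP over a time x resource grid (hypothetical data).
# Illustrates the class of model the framework instantiates; not the authors' model.
import pulp

periods = range(4)                      # planning periods (time dimension)
products = ["boards", "beams"]          # outputs (resource dimension)
profit = {"boards": 40, "beams": 55}    # contribution margin per m^3 of output
log_use = {"boards": 1.6, "beams": 2.1} # m^3 of logs consumed per m^3 of output
log_supply = [500, 450, 500, 480]       # available logs per period (m^3)
capacity = 300                          # sawing capacity per period (m^3 output)
demand = {"boards": 180, "beams": 120}  # max sellable volume per period

m = pulp.LpProblem("sawmill_plan", pulp.LpMaximize)
x = pulp.LpVariable.dicts("produce", (products, periods), lowBound=0)

m += pulp.lpSum(profit[p] * x[p][t] for p in products for t in periods)
for t in periods:
    m += pulp.lpSum(log_use[p] * x[p][t] for p in products) <= log_supply[t]
    m += pulp.lpSum(x[p][t] for p in products) <= capacity
    for p in products:
        m += x[p][t] <= demand[p]

m.solve(pulp.PULP_CBC_CMD(msg=False))
for p in products:
    print(p, [round(x[p][t].value(), 1) for t in periods])
```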

2020 ◽  
Vol 25 ◽  
pp. 398-415
Author(s):  
Gabriele Novembri ◽  
Francesco Livio Rossini

The disruptive development of ICT can be a decisive element in bringing the productivity of the construction sector closer to that of the highest-performing sectors, such as automotive. The proposed research builds on the improvement of existing tools with artificial intelligence techniques, with the goal of obtaining a model that self-adapts to the objectives to be achieved. We therefore present a general framework based on the Swarm Simulation Modelling approach. Building objects, goals, constraints, and design solutions are represented as a multi-agent system able to communicate, interact, and integrate with existing BIM systems, ensuring reactive and proactive behaviour. The model thus comprises many interconnected intelligent agents, each linked to a building object. These agents can sense external 'perturbations' and react by reorganising their structure to satisfy the imposed constraints. A near-optimal solution can be found autonomously via a distributed constraint optimisation (DCOP) approach.
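The abstract leaves the agents' mechanics implicit. As one hedged illustration of the local-search idea behind many DCOP solvers, the sketch below implements a Distributed Stochastic Algorithm flavour on a toy constraint graph: each agent repeatedly adjusts its own variable to reduce conflicts with its neighbours. This is a conceptual stand-in, not the authors' implementation.

```python
# Minimal DCOP-style local search (DSA flavour): agents adjust their own
# variable to reduce constraint violations with neighbours. Hypothetical toy:
# assign levels 0-2 to linked building objects so that neighbours differ.
import random

random.seed(1)
neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
value = {a: random.randint(0, 2) for a in neighbours}   # each agent's variable

def local_cost(agent, v):
    # one unit of cost per neighbour holding the same value (a "perturbation")
    return sum(v == value[n] for n in neighbours[agent])

for _ in range(20):                       # synchronous rounds
    for agent in neighbours:
        best = min(range(3), key=lambda v: local_cost(agent, v))
        if local_cost(agent, best) < local_cost(agent, value[agent]) and random.random() < 0.7:
            value[agent] = best           # stochastic move helps avoid cycling

print(value)  # final assignment after local search
```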


Polymers ◽  
2020 ◽  
Vol 12 (10) ◽  
pp. 2237 ◽  
Author(s):  
P. R. Sarika ◽  
Paul Nancarrow ◽  
Abdulrahman Khansaheb ◽  
Taleb Ibrahim

Phenol–formaldehyde (PF) resin continues to dominate the resin industry more than 100 years after its first synthesis. Versatile properties such as thermal stability, chemical resistance, fire resistance, and dimensional stability make it suitable for a wide range of applications. PF resins have been used in the wood industry as adhesives, in paints and coatings, and in the aerospace, construction, and building industries as composites and foams. Currently, petroleum is the key source of raw materials for manufacturing PF resin. However, increasing environmental pollution and fossil fuel depletion have driven industries to seek sustainable alternatives to petroleum-based raw materials. Over the past decade, researchers have replaced phenol and formaldehyde with sustainable materials such as lignin, tannin, cardanol, hydroxymethylfurfural, and glyoxal to produce bio-based PF resin. Several synthesis modifications are currently under investigation to improve the properties of bio-based phenolic resins. This review discusses recent developments in the synthesis of PF resins, particularly those created from sustainable raw-material substitutes, and modifications applied to the synthetic route to improve mechanical properties.


Symmetry ◽  
2021 ◽  
Vol 13 (2) ◽  
pp. 363
Author(s):  
Marina Dolfin ◽  
Leone Leonida ◽  
Eleonora Muzzupappa

This paper adopts the Kinetic Theory for Active Particles (KTAP) approach to model the dynamics of liquidity profiles on a complex adaptive network system that mimics a stylized financial market. In our modelling framework, individual investors' incentives to form or delete a link are driven by stochastic game-type interactions modelling the phenomenology of policy rules implemented under Basel III, and are exogenously and dynamically influenced by a measure of the overnight interest rate. The strategic network-formation dynamics that emerges from the introduced transition probabilities provides a wide range of measures by which networks might be considered "best" from the point of view of the overall welfare of the system. We use the time evolution of the aggregate degree of connectivity to measure evolving network efficiency in two different scenarios, suggesting a first analysis of the stability of the arising network structures.
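For orientation, the generic KTAP balance equation (in the form popularised by Bellomo and co-workers) shows where the interaction rates and game-type transition probabilities mentioned above enter; the paper's specific kernels are not reproduced here.

```latex
% Generic KTAP balance for the distribution f_i(t,u) of active particles
% in functional subsystem i with activity variable u:
\partial_t f_i(t,u) =
  \sum_{h,k} \iint \eta_{hk}(u_*,u^*)\,
    \mathcal{B}^{\,i}_{hk}(u_* \to u \mid u_*,u^*)\,
    f_h(t,u_*)\, f_k(t,u^*)\,\mathrm{d}u_*\,\mathrm{d}u^*
  \;-\; f_i(t,u) \sum_{k} \int \eta_{ik}(u,u^*)\, f_k(t,u^*)\,\mathrm{d}u^*
% \eta_{hk}: encounter rate; \mathcal{B}^i_{hk}: probability that a candidate
% particle of subsystem h with state u_* lands in subsystem i with state u
% after interacting with a field particle of subsystem k with state u^*.
```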


2021 ◽  
Author(s):  
Kor de Jong ◽  
Marc van Kreveld ◽  
Debabrata Panja ◽  
Oliver Schmitz ◽  
Derek Karssenberg

Data availability at the global scale is increasing exponentially. Although considerable challenges remain regarding the identification of model structure and parameters of continental-scale hydrological models, we will soon reach the situation that global-scale models can be defined at very high resolutions close to 100 m or less. One of the key challenges is how to make simulations of these ultra-high-resolution models tractable [1].

Our research contributes the development of a model building framework that is specifically designed to distribute calculations over multiple cluster nodes. This framework enables domain experts like hydrologists to develop their own large-scale models, using a scripting language like Python, without the need to acquire the skills to develop low-level computer code for parallel and distributed computing.

We present the design and implementation of this software framework and illustrate its use with a prototype 100 m, 1 h continental-scale hydrological model. Our modelling framework ensures that any model built with it is parallelized. This is made possible by providing the model builder with a set of model building blocks, coded in such a manner that parallelization of calculations occurs within and across these building blocks, for any combination of building blocks. There is thus full flexibility on the side of the modeller, without losing performance.

This breakthrough is made possible by applying a novel approach to the implementation of the model building framework, called asynchronous many-tasks, provided by the HPX C++ software library [3]. The code in the model building framework expresses spatial operations as large collections of interdependent tasks that can be executed efficiently on individual laptops as well as computer clusters [2]. Our framework currently includes the most essential operations for building large-scale hydrological models, including those for simulating transport of material through a flow-direction network. By combining these operations, we rebuilt an existing 100 m, 1 h resolution model, thus far used for simulations of small catchments; this required limited coding, as we only had to replace the computational back end of the existing model. Runs at continental scale on a computer cluster show acceptable strong and weak scaling, providing a strong indication that global simulations at this resolution will soon be technically possible.

Future work will focus on extending the set of modelling operations and adding scalable I/O, after which existing models that are currently limited in their ability to use the available computational resources can be ported to this new environment.

More information about our modelling framework is at https://lue.computationalgeography.org.

References

[1] M. Bierkens. Global hydrology 2015: State, trends, and directions. Water Resources Research, 51(7):4923–4947, 2015.
[2] K. de Jong, et al. An environmental modelling framework based on asynchronous many-tasks: scalability and usability. Submitted.
[3] H. Kaiser, et al. HPX – The C++ standard library for parallelism and concurrency. Journal of Open Source Software, 5(53):2352, 2020.
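HPX itself is C++, but the asynchronous many-tasks idea described in this abstract (express the computation as a graph of interdependent tasks and let a runtime schedule them) can be sketched in a few lines of Python with the standard library; this is only a conceptual analogue, not the framework's code. Note that a real AMT runtime suspends tasks instead of blocking worker threads, as done here for brevity.

```python
# Conceptual analogue of asynchronous many-tasks in Python: spatial operations
# become tasks whose dependencies form a graph; the executor overlaps their work.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def slope(dem):                    # stand-ins for raster model building blocks
    return np.gradient(dem)[0]

def runoff(precip, slope_):
    return precip * (1.0 + np.abs(slope_))

with ThreadPoolExecutor() as pool:
    dem_f = pool.submit(np.random.rand, 1000, 1000)      # "load" terrain
    precip_f = pool.submit(np.random.rand, 1000, 1000)   # "load" rainfall
    slope_f = pool.submit(lambda: slope(dem_f.result())) # depends on dem
    q_f = pool.submit(lambda: runoff(precip_f.result(), slope_f.result()))
    print(q_f.result().mean())   # only this final .result() blocks the caller
```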


2007 ◽  
Vol 37 (10) ◽  
pp. 2010-2021 ◽  
Author(s):  
Samuel D. Pittman ◽  
B. Bruce Bare ◽  
David G. Briggs

Forest planning models have increased in size and complexity as planners address a growing array of economic, ecological, and societal issues. Hierarchical production planning models offer a means of better managing these large and complex models by decomposing them into a set of smaller linked models. In this paper, a Lagrangian relaxation formulation and a modified Dantzig–Wolfe decomposition/column-generation routine are used to solve a hierarchical forest planning model that maximizes the net present value of harvest incomes while recognizing specific geographical units that are subject to harvest-flow and green-up constraints. This allows the planning model to consider forest-wide constraints such as harvest flow, as well as to address separate subproblems for each contiguous management zone, for which detailed spatial plans are computed. The approach differs from past approaches in forest hierarchical planning in that we start with a single model and derive a hierarchical model that addresses integer subproblems using Dantzig–Wolfe decomposition. The decomposition approach is demonstrated on a set of randomly generated planning problems constructed from a large forest and land inventory data set.
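As a reminder of the mechanics behind such decompositions, the sketch below applies Lagrangian relaxation with a subgradient update to a toy two-zone model: the coupling (forest-wide) constraint is priced into the objective, after which each zone's subproblem solves independently. All numbers and the problem structure are hypothetical, and this illustrates only the Lagrangian half of the paper's approach, not the Dantzig–Wolfe routine.

```python
# Toy Lagrangian relaxation: maximise total zone profit subject to a shared
# harvest budget. Relax the coupling constraint with multiplier lam, solve
# each zone independently, update lam by subgradient. Hypothetical data.
profit = [[0, 5, 9, 11], [0, 4, 7, 8]]   # profit of harvesting 0..3 units per zone
budget = 4                                # forest-wide harvest allowed in total

lam, step = 0.0, 1.0
for it in range(50):
    # each zone maximises its own priced-out objective independently
    choice = [max(range(4), key=lambda h: profit[z][h] - lam * h) for z in (0, 1)]
    slack = budget - sum(choice)          # subgradient of the dual function
    lam = max(0.0, lam - step / (it + 1) * slack)

print("harvest per zone:", choice, "multiplier:", round(lam, 2))
```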


2021 ◽  
Vol 21 (1) ◽  
pp. 279-299
Author(s):  
Christoph Welker ◽  
Thomas Röösli ◽  
David N. Bresch

Abstract. With access to claims data, insurers have a long tradition of being knowledge leaders on damages caused by windstorms. However, new opportunities have arisen to better assess the risks of winter windstorms in Europe through the availability of historic footprints provided by the Windstorm Information Service (Copernicus WISC). In this study, we compare how modelling of building damages complements claims-based risk assessment. We describe and use two windstorm risk models: an insurer's proprietary model and the open-source CLIMADA platform. Both use the historic WISC dataset and a purposefully built, probabilistic hazard event set of winter windstorms across Europe to model building damages in the canton of Zurich, Switzerland. These approaches yield a considerably lower estimate of the annual average damage (CHF 1.4 million) than claims (CHF 2.3 million), a difference that originates mainly from a different assessment of the return period of the most damaging historic event, Lothar–Martin. Additionally, the probabilistic modelling approach allows the assessment of rare events, such as a 250-year-return-period windstorm causing CHF 75 million in damages, including an evaluation of the uncertainties. Our study emphasizes the importance of complementing a claims-based perspective with a probabilistic risk modelling approach to better understand windstorm risks. The presented open-source model provides a straightforward entry point for small insurance companies.
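The two headline numbers in the abstract, an annual average damage and a return-period damage, fall straight out of a probabilistic event set. The sketch below shows the standard computation on hypothetical event damages and frequencies; it is independent of CLIMADA's actual API.

```python
# Standard risk metrics from a probabilistic event set (hypothetical data):
# each event carries a modelled damage and an annual occurrence frequency.
import numpy as np

rng = np.random.default_rng(0)
damage = rng.lognormal(mean=12, sigma=1.5, size=10_000)   # CHF per event
freq = np.full(10_000, 1 / 1_000)                         # events per year

aad = np.sum(damage * freq)            # annual average (expected) damage

# Damage exceedance curve: annual frequency of exceeding each damage level.
order = np.argsort(damage)[::-1]
exceed_freq = np.cumsum(freq[order])   # events/year with damage >= this level
rp250 = damage[order][np.searchsorted(exceed_freq, 1 / 250)]

print(f"annual average damage: {aad:,.0f} CHF")
print(f"250-year damage:       {rp250:,.0f} CHF")
```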


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Alexander P Browning ◽  
Jesse A Sharp ◽  
Ryan J Murphy ◽  
Gency Gunasingh ◽  
Brodie Lawson ◽  
...  

Tumour spheroids are common in vitro experimental models of avascular tumour growth. Compared with traditional two-dimensional culture, tumour spheroids more closely mimic the avascular tumour microenvironment, where spatial differences in nutrient availability strongly influence growth. We show that spheroids initiated using significantly different numbers of cells grow to similar limiting sizes, suggesting that avascular tumours have a limiting structure, in agreement with untested predictions of classical mathematical models of tumour spheroids. We develop a novel mathematical and statistical framework to study the structure of tumour spheroids seeded from cells transduced with fluorescent cell cycle indicators, enabling us to discriminate between arrested and cycling cells and identify an arrested region. Our analysis shows that transient spheroid structure is independent of initial spheroid size, and the limiting structure can be independent of seeding density. Standard experimental protocols compare spheroid size as a function of time; however, our analysis suggests that comparing spheroid structure as a function of overall size produces results that are relatively insensitive to variability in spheroid size. Our experimental observations are made using two melanoma cell lines, but our modelling framework applies across a wide range of spheroid culture conditions and cell lines.
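Limiting size is commonly quantified by fitting a saturating growth law to radius measurements. As one hedged example of how a limiting radius K could be estimated from data of the kind described, the sketch below fits a Gompertz curve to synthetic measurements with scipy; the paper's actual framework is richer, resolving internal spheroid structure rather than just overall size.

```python
# Fit a Gompertz growth law R(t) = K * exp(log(R0/K) * exp(-a*t)) to
# synthetic spheroid radius data; K is the limiting radius. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, K, R0, a):
    return K * np.exp(np.log(R0 / K) * np.exp(-a * t))

t = np.linspace(0, 20, 30)                       # days
truth = gompertz(t, K=420.0, R0=120.0, a=0.25)   # hypothetical parameters (um)
radius = truth + np.random.default_rng(1).normal(0, 8, t.size)

popt, _ = curve_fit(gompertz, t, radius, p0=(400, 100, 0.2))
print("estimated limiting radius K ~ %.0f um" % popt[0])
```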


Author(s):  
Po-Ming Lee ◽  
Tzu-Chien Hsiao

Abstract Recent studies have utilized color, texture, and composition information of images to achieve affective image classification. However, features in the spatial-frequency domain, which have proven useful for traditional pattern recognition, have not yet been tested in this field. Furthermore, the experiments conducted by previous studies are not internationally comparable because of the experimental paradigms adopted. In addition, recent methodological advances, namely the Hilbert–Huang Transform (HHT) (i.e., Empirical Mode Decomposition (EMD) and the Hilbert Transform (HT)), have improved the resolution of frequency analysis. Hence, the goal of this research is to achieve the affective image-classification task by adopting a standard experimental paradigm introduced by psychologists, in order to produce internationally comparable and reproducible results, and also to explore the affective hidden patterns of images in the spatial-frequency domain. To accomplish these goals, multiple human-subject experiments were conducted in the laboratory. An Extended Classifier System (XCS) was used for model building because the XCS has been applied to a wide range of classification tasks and has proved competitive in pattern recognition. To exploit the information in the spatial-frequency domain, the traditional EMD was extended to a two-dimensional version. In summary, the model built using the XCS achieves an Area Under Curve (AUC) of 0.91 and an accuracy rate over 86%. The result of the XCS was compared with traditional machine-learning algorithms (e.g., the Radial-Basis Function Network (RBF Network)) that are normally used for classification tasks. Owing to the proper selection of features for model building, user-independent findings were obtained. For example, it was found that horizontal visual stimulation contributes more to emotion elicitation than vertical visual stimulation. The effects of hue, saturation, and brightness are also presented.
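The reported comparison against traditional machine-learning baselines on AUC can be reproduced schematically with scikit-learn. The sketch below scores an RBF-kernel SVM on synthetic features; note this is a stand-in for the paper's RBF Network baseline (which scikit-learn does not provide directly), and all data here are synthetic.

```python
# Schematic AUC evaluation of a baseline classifier on synthetic
# "spatial-frequency" features; stands in for the paper's RBF Network baseline.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=600, n_features=24, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", probability=True, random_state=0).fit(Xtr, ytr)
auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
print(f"baseline AUC = {auc:.2f}")   # compare against the reported XCS AUC of 0.91
```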


2015 ◽  
Vol 19 (5) ◽  
pp. 2295-2314 ◽  
Author(s):  
P. Hublart ◽  
D. Ruelland ◽  
A. Dezetter ◽  
H. Jourde

Abstract. The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modelling of a mesoscale Andean catchment (1515 km²) over a 30-year period (1982–2011). The modelling process was decomposed into six model-building decisions related to the following aspects of the system behaviour: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modelling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional (4-D) space of performance measures using a fuzzy c-means clustering technique and a differential split-sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. Eight model hypotheses were thus retained as a representation of the minimum structural uncertainty attainable with this modelling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
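The 72 competing structures arise combinatorially from the six model-building decisions: the hypothesis space is simply the Cartesian product of the per-decision options. The enumeration below uses illustrative option names and counts (their product is 72, as in the paper, but the true per-decision counts are not given in the abstract).

```python
# The hypothesis space is the Cartesian product of per-decision options.
# Option names and counts are hypothetical; only their product (72) is from the paper.
from itertools import product

decisions = {
    "snow": ["degree-day", "degree-day+radiation", "two-layer"],
    "runoff": ["saturation-excess", "infiltration-excess", "mixed"],
    "redistribution": ["single-reservoir", "dual-reservoir"],
    "delay": ["unit-hydrograph", "linear-routing"],
    "storage": ["linear", "nonlinear"],
    "baseflow": ["power-law"],
}

structures = list(product(*decisions.values()))
print(len(structures))   # 72 competing model structures to calibrate
assert len(structures) == 3 * 3 * 2 * 2 * 2 * 1
```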


Energies ◽  
2019 ◽  
Vol 12 (17) ◽  
pp. 3388 ◽  
Author(s):  
Niina Helistö ◽  
Juha Kiviluoma ◽  
Jussi Ikäheimo ◽  
Topi Rasku ◽  
Erkka Rinne ◽  
...  

Backbone is a highly adaptable energy systems modelling framework that can be used to create models for studying the design and operation of energy systems, from both investment-planning and scheduling perspectives. It includes a wide range of features and constraints, such as stochastic parameters, multiple reserve products, energy storage units, controlled and uncontrolled energy transfers, and, most significantly, multiple energy sectors. The formulation is based on mixed-integer programming and takes into account unit commitment decisions for power plants and other energy conversion facilities. Both high-level large-scale systems and fully detailed smaller-scale systems can be appropriately modelled. The framework has been implemented as the open-source Backbone modelling tool using the General Algebraic Modeling System (GAMS). An application of the framework is demonstrated using a power system example, and Backbone is shown to produce results comparable to those of a commercial tool. The adaptability of Backbone further enables energy system models to be created and solved relatively easily for many different purposes, thus improving on the available methodologies.
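The unit-commitment core of such mixed-integer formulations can be stated compactly. The constraints below are the textbook version (a binary on/off status linking generation to capacity limits, plus start-up logic and demand balance), not Backbone's exact formulation.

```latex
% Textbook unit-commitment constraints for unit g in period t:
% u_{g,t} \in \{0,1\} is the on/off status, p_{g,t} the output,
% s_{g,t} \in \{0,1\} the start-up indicator, D_t the demand.
\begin{aligned}
  \underline{P}_g \, u_{g,t} \;\le\; p_{g,t} \;\le\; \overline{P}_g \, u_{g,t}
      && \text{(capacity limits when committed)} \\
  s_{g,t} \;\ge\; u_{g,t} - u_{g,t-1}
      && \text{(start-up logic)} \\
  \sum_{g} p_{g,t} \;=\; D_t
      && \text{(demand balance in period } t\text{)}
\end{aligned}
```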

