Delphi Method Supported by Forecasting Software

Information ◽  
2020 ◽  
Vol 11 (2) ◽  
pp. 65
Author(s):  
Marcin Lawnik ◽  
Arkadiusz Banasik

The Delphi method is one of the basic tools for forecasting values in various types of problems. It uses the knowledge of experts, which is suitably aggregated (e.g., in the form of descriptive statistics measures) and returned to the same group of experts, thus starting the next round of forecasting. The multi-stage prediction under the Delphi method allows for better stabilization of the results, which is extremely important in the forecasting process. Experts in the forecasting process often have access to time-series forecasting software but do not necessarily use it. Therefore, it seems advisable to add to the aggregate a value obtained using forecasting software. The advantage of this approach is that it saves the time and cost of obtaining a forecast, which should be understood as a smaller workload for data analysts and a lower cost of their work. Accordingly, the main contribution of the article is the use of a virtual expert in the form of a computer-enhanced mathematical tool, i.e., a programming library for forecasting time series. The chosen software tool is the Prophet library, a Facebook tool that can be used in the Python or R programming languages.
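The aggregation-plus-virtual-expert step described above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the function names are ours, and a naive trend extrapolation stands in for the forecast that a library such as Prophet would supply.

```python
from statistics import quantiles

def delphi_round(expert_forecasts, virtual_forecast):
    """Aggregate one Delphi round: the experts' forecasts plus the
    'virtual expert' value produced by forecasting software."""
    values = expert_forecasts + [virtual_forecast]
    q1, med, q3 = quantiles(values, n=4)
    # The summary (median and interquartile range) is fed back to the
    # experts, who revise their forecasts in the next round.
    return {"median": med, "iqr": (q1, q3)}

def naive_trend_forecast(series):
    """Stand-in for a library forecast (e.g. Prophet): extrapolate
    the last observed step one period ahead."""
    return series[-1] + (series[-1] - series[-2])

history = [100, 104, 109, 115]
virtual = naive_trend_forecast(history)            # 121
summary = delphi_round([118, 120, 125, 130], virtual)
print(summary["median"])                           # 121.0
```

In an actual Delphi exercise the returned median and interquartile range would be circulated to the expert group before the next round begins.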

Author(s):  
David Hankin ◽  
Michael S. Mohr ◽  
Kenneth B. Newman

We present a rigorous but understandable introduction to the field of sampling theory for ecologists and natural resource scientists. Sampling theory concerns itself with the development of procedures for random selection of a subset of units, a sample, from a larger finite population, and with how to best use sample data to make scientifically and statistically sound inferences about the population as a whole. The inferences fall into two broad categories: (a) estimation of simple descriptive population parameters, such as means, totals, or proportions, for variables of interest, and (b) estimation of uncertainty associated with estimated parameter values. Although the targets of estimation are few and simple, estimates of means, totals, or proportions see important and often controversial uses in management of natural resources and in fundamental ecological research; yet few ecologists or natural resource scientists have formal training in sampling theory. We emphasize the classical design-based approach to sampling, in which variable values associated with units are regarded as fixed and uncertainty of estimation arises via the various randomization strategies that may be used to select samples. In addition to covering standard topics such as simple random, systematic, cluster, unequal probability (stressing the generality of Horvitz–Thompson estimation), multi-stage, and multi-phase sampling, we also consider adaptive sampling, spatially balanced sampling, and sampling through time, three areas of special importance for ecologists and natural resource scientists. The text is directed to undergraduate seniors, graduate students, and practicing professionals. Problems emphasize application of the theory and R programming in ecological and natural resource settings.
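The Horvitz–Thompson estimation stressed above can be stated in a few lines. A minimal sketch, assuming the inclusion probability of each sampled unit is known; the function name and numbers are illustrative, not from the text.

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Horvitz-Thompson estimator of a population total: each sampled
    value is weighted by the inverse of its inclusion probability, so
    units that were unlikely to be drawn stand in for more of the
    population."""
    return sum(y / p for y, p in zip(sample_values, inclusion_probs))

# Equal-probability sample of n = 2 from N = 4 units (pi = n/N = 0.5):
est = horvitz_thompson_total([10.0, 14.0], [0.5, 0.5])
print(est)  # 48.0
```

The same weighting generalizes to unequal-probability and multi-stage designs, which is why the text stresses its generality.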


2021 ◽  
Vol 5 (1) ◽  
pp. 46
Author(s):  
Mostafa Abotaleb ◽  
Tatiana Makarovskikh

COVID-19 is one of the biggest challenges that countries face at the present time, as infections and deaths change daily and the pandemic spreads dynamically. Our paper considers two tasks. The first is to develop a system for modeling COVID-19 based on time-series models, due to their accuracy in forecasting COVID-19 cases. We developed the "Epidemic.TA" system using R programming for modeling and forecasting COVID-19 cases. This system contains linear (ARIMA and Holt's model) and non-linear (BATS, TBATS, and SIR) time-series models and neural network auto-regressive models (NNAR), which allow us to obtain the most accurate forecasts of infections, deaths, and vaccination cases. The second task is the application of our system to forecast the risk of a third wave of infections in the Russian Federation.
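As an illustration of the linear models mentioned, Holt's model (double exponential smoothing) can be sketched in plain Python; the Epidemic.TA system itself is written in R, and the smoothing parameters and case counts below are arbitrary.

```python
def holt_forecast(series, alpha=0.8, beta=0.2, horizon=7):
    """Holt's linear model: smooth a level and a trend component,
    then extrapolate the trend 'horizon' steps ahead."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# Hypothetical daily case counts:
cases = [120, 135, 151, 170, 188, 209, 231]
print(holt_forecast(cases, horizon=3))
```

Out-of-sample forecasts from Holt's model lie on a straight line (constant final trend), which is why the system also includes non-linear models such as TBATS and SIR for epidemic curves.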


2014 ◽  
Vol 19 (3) ◽  
pp. 309-327 ◽  
Author(s):  
Tijen Demirel-Pegg

This study investigates the dynamics of transition from a peaceful protest wave to a violent insurgency. It examines the causal path leading to a major shift in the intensity of a protest wave and argues that the transition is the product of the interactions between the dissidents, the state, and external actors. By studying the protest wave in Kashmir (1979-88), it identifies state repression and external support as the key factors driving the transition process. Time series analysis is used to analyze the original empirical evidence collected through content analysis. By providing a comprehensive understanding of the origins of the insurgency in Kashmir, this study shows that protest waves and civil wars are intimately linked.


2018 ◽  
Vol 15 (01) ◽  
pp. 85-94
Author(s):  
Santanu Acharjee ◽  
Binod Chandra Tripathy

The forecasting graphs of the World Bank, the Reserve Bank of India, etc. are mostly line graphs or time-series graphs. Any forecast carries a "standard error", computed with complicated statistical formulae. Keen observation shows that mathematical patterns are available in nature, but in most cases it is difficult for us to recognize these patterns. It is likewise important to know the least upper bounds of these line graphs or time-series graphs, so that peaks of the prices with respect to time will not exceed these least upper bounds. It is hard to find any statistical or mathematical tool to determine these least upper bounds; thus, we give a methodology to obtain them. We show the existence of an equilibrium between the expected price and the original price of a commodity with the help of local functions and expansion operators of a bitopological space. These methods are based on the choices of a consumer. Examples are provided to show that the price of a commodity cannot exceed the interval of the expected price. Moreover, we try to provide possible answers to the problem of "Control of Economic Variable" of Morgenstern [O. Morgenstern, Thirteen critical points in contemporary economic theory: An interpretation, Journal of Economic Literature 10(4) (1972) 1163–1189] by determining least upper bounds.
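As a purely numerical illustration (not the paper's bitopological construction), the least upper bound of a finite observed price series and the check that prices stay within an expected-price interval can be written as follows; all names and numbers are invented.

```python
def least_upper_bound(prices):
    """Smallest value that no observed price exceeds: for a finite
    series this is simply its maximum."""
    return max(prices)

def within_expected_interval(prices, low, high):
    """Check the claim pattern from the abstract: every observed
    price stays inside the interval of expected price [low, high]."""
    return all(low <= p <= high for p in prices)

prices = [41.5, 43.0, 42.2, 44.8, 44.1]
print(least_upper_bound(prices))                    # 44.8
print(within_expected_interval(prices, 40.0, 45.0)) # True
```

The paper's contribution is a topological method for determining such bounds before the peaks are observed; the sketch above only verifies them after the fact.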


Author(s):  
Ireneusz Jablonski ◽  
Kamil Subzda ◽  
Janusz Mroczka

In this paper, the authors examine the software implementation and initial preprocessing of data and tools for assessing the complexity and variability of long physiological time series. The algorithms presented extend a larger Matlab library devoted to complex-system and data analysis. Commercial software is unavailable for many of these functions and is generally unsuitable for use with multi-gigabyte datasets. Reliable extraction of inter-event times from the input signal is an important step in the presented analysis. Knowing the distribution of the inter-event distances, it is possible to calculate the exponents of power-law scaling. Methodologically, simulations and work with experimental data supported each stage of the presented study. Initial calibration of the procedures with accessible data confirmed assessments made in earlier studies, which raises the objectivity of measurements planned for the future.
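The two steps named here, extracting inter-event times and estimating a power-law scaling exponent, can be sketched in Python (the library described is in Matlab; this sketch uses the standard Hill maximum-likelihood estimator, and the event times are made up):

```python
import math

def inter_event_times(event_times):
    """Distances between consecutive detected events."""
    return [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]

def powerlaw_exponent(intervals, xmin=1.0):
    """Hill maximum-likelihood estimate of the exponent alpha for a
    power-law tail p(x) ~ x^(-alpha), over intervals x >= xmin."""
    tail = [x for x in intervals if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

events = [0, 3, 7, 15, 16, 30]
gaps = inter_event_times(events)
print(gaps)  # [3, 4, 8, 1, 14]
print(round(powerlaw_exponent(gaps, xmin=1.0), 2))
```

On multi-gigabyte recordings the event detection itself dominates the cost; only the resulting interval distribution needs to be kept in memory for the exponent estimate.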


2010 ◽  
Author(s):  
Francesco Coppi ◽  
Carmelo Gentile ◽  
Pier Paolo Ricci ◽  
E. P. Tomasini

2011 ◽  
Vol 63 (3) ◽  
pp. 369-376 ◽  
Author(s):  
M. Métadier ◽  
J. -L. Bertrand-Krajewski

With the increasing implementation of continuous monitoring of both discharge and water quality in sewer systems, large databases are now available. In order to manage large amounts of data and calculate various variables and indicators of interest, it is necessary to apply automated methods for data processing. This paper deals with the processing of short-time-step turbidity time series to estimate TSS (Total Suspended Solids) and COD (Chemical Oxygen Demand) event loads in sewer systems during storm events, along with their associated uncertainties. The following steps are described: (i) sensor calibration, (ii) estimation of data uncertainties, (iii) correction of raw data, (iv) data pre-validation tests, (v) final validation, and (vi) calculation of TSS and COD event loads and estimation of their uncertainties. These steps have been implemented in an integrated software tool. Examples of results are given for a set of 33 storm events monitored in a separate stormwater sewer system.
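Step (vi), the event-load calculation, reduces to summing concentration times discharge over the event duration. A minimal sketch with invented numbers (the paper's tool also propagates the uncertainties from steps (i)-(ii), which this omits):

```python
def event_load(concentrations, discharges, dt):
    """Event load = sum over time steps of C_i * Q_i * dt
    (e.g. TSS in g/m3, discharge in m3/s, dt in s -> load in g)."""
    return sum(c * q * dt for c, q in zip(concentrations, discharges))

# Two-minute time step; TSS concentrations derived from turbidity:
tss = [150.0, 220.0, 180.0]   # g/m3
q   = [0.40, 0.55, 0.35]      # m3/s
print(event_load(tss, q, dt=120.0))  # total event load in grams
```

With short time steps (here 120 s) the discrete sum closely approximates the integral of the pollutograph over the storm event.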

