Translating surveillance data into incidence estimates

2018 ◽  
Author(s):  
Yoann Bourhis ◽  
Timothy R. Gottwald ◽  
Frank van den Bosch

Abstract Monitoring a population for a disease requires the hosts to be sampled and tested for the pathogen. This results in sampling series from which to estimate the disease incidence, i.e. the proportion of hosts infected. Existing estimation methods assume that disease incidence is not changing between monitoring rounds, resulting in underestimation of the disease incidence. In this paper we develop an incidence estimation model accounting for epidemic growth, with monitoring rounds sampling varying incidence. We also show how to accommodate the asymptomatic period characteristic of most diseases. For practical use, we produce an approximation of the model, which is subsequently shown to be accurate for relevant epidemic and sampling parameters. Both the approximation and the full model are applied to stochastic spatial simulations of epidemics. The results prove their consistency for a very wide range of situations.

2019 ◽  
Vol 374 (1776) ◽  
pp. 20180262 ◽  
Author(s):  
Y. Bourhis ◽  
T. Gottwald ◽  
F. van den Bosch

Monitoring a population for a disease requires the hosts to be sampled and tested for the pathogen. This results in sampling series from which we may estimate the disease incidence, i.e. the proportion of hosts infected. Existing estimation methods assume that disease incidence does not change between monitoring rounds, resulting in an underestimation of the disease incidence. In this paper, we develop an incidence estimation model accounting for epidemic growth with monitoring rounds that sample varying incidence. We also show how to accommodate the asymptomatic period that is characteristic of most diseases. For practical use, we produce an approximation of the model, which is subsequently shown to be accurate for relevant epidemic and sampling parameters. Both the approximation and the full model are applied to stochastic spatial simulations of epidemics. The results prove their consistency for a very wide range of situations. The estimation model is made available as an online application. This article is part of the theme issue ‘Modelling infectious disease outbreaks in humans, animals and plants: epidemic forecasting and control’. This theme issue is linked with the earlier issue ‘Modelling infectious disease outbreaks in humans, animals and plants: approaches and important themes’.
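The underestimation described in the abstract can be demonstrated with a toy simulation (the incidence values and sample sizes below are invented for illustration; this is not the authors' estimation model):

```python
import random

random.seed(42)

# Monitoring rounds sample an epidemic whose incidence grows between rounds.
# Pooling all rounds under the usual constant-incidence assumption
# underestimates the incidence reached by the final round.
rounds = [0.02, 0.05, 0.12, 0.25]   # true incidence at each monitoring round (assumed)
n_per_round = 5000                  # hosts sampled and tested per round

positives = sum(
    sum(random.random() < p for _ in range(n_per_round)) for p in rounds
)
pooled_estimate = positives / (len(rounds) * n_per_round)

print(f"pooled estimate: {pooled_estimate:.3f}")   # close to the mean incidence (~0.11)
print(f"final incidence: {rounds[-1]:.3f}")        # 0.25: the pooled estimate falls short
```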


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 854 ◽
Author(s):  
Nevena Rankovic ◽  
Dragica Rankovic ◽  
Mirjana Ivanovic ◽  
Ljubomir Lazic

Software estimation involves meeting a huge number of different requirements, such as resource allocation, cost estimation, effort estimation, time estimation, and the changing demands of software product customers. Numerous estimation models try to solve these problems. In our experiment, a clustering method was applied to the input values to mitigate the heterogeneous nature of the selected projects. Additionally, homogeneity of the data was achieved with a fuzzification method, and we propose two different activation functions inside the hidden layer during the construction of the artificial neural networks (ANNs). In this research, we present an experiment that uses two different ANN architectures, based on Taguchi's orthogonal vector plans, to satisfy the set conditions, with additional methods and criteria for validating the proposed model. The aim of this paper is a comparative analysis of the obtained mean magnitude of relative error (MMRE) values. At the same time, our goal is to find a relatively simple architecture that minimizes the error value while covering a wide range of different software projects. For this purpose, six different datasets are divided into four chosen clusters. The obtained results show that estimating diverse projects by dividing them into clusters can contribute to efficient, reliable, and accurate software product assessment. The contribution of this paper is a solution that requires only a small number of iterations, which reduces the execution time while achieving the minimum error.
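For reference, the reported MMRE metric can be computed as follows (the effort values below are hypothetical, not drawn from the paper's six datasets):

```python
def mmre(actual_efforts, estimated_efforts):
    """Mean magnitude of relative error (MMRE): the average of
    |actual - estimated| / actual over all projects. Lower is better."""
    relative_errors = [
        abs(a - e) / a for a, e in zip(actual_efforts, estimated_efforts)
    ]
    return sum(relative_errors) / len(relative_errors)

# Hypothetical effort values (person-hours), for illustration only.
actual = [120.0, 450.0, 80.0, 300.0]
estimated = [100.0, 480.0, 90.0, 270.0]
print(round(mmre(actual, estimated), 4))  # → 0.1146
```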


2019 ◽  
Vol 18 (1) ◽  
pp. 1-33 ◽
Author(s):  
Fumitoshi Mizutani

Abstract The main purpose of this study is to evaluate factors affecting passenger rail demand, with special attention to the effects of structural reform/regulation and competition. In order to do this, we use data obtained from 30 OECD countries for the 24 years from 1990 to 2013. As structural reform/regulation variables, we take the OECD's five kinds of regulatory indices: (i) overall, (ii) entry, (iii) public ownership, (iv) vertical integration, and (v) market structure; and as competition variables, we take (vi) rail passenger-freight ratio, (vii) rail share, and (viii) high-speed train ratio. As estimation methods, both the fixed effect model and the Hausman-Taylor estimation model are used. The major findings are as follows. First, competition as competitiveness (i.e. the share of rail, the passenger-over-freight ratio) increases passenger demand, as does the existence of high-speed trains. Second, overall regulation, entry regulation, and market structure have no significant effect on demand. Third, public ownership affects passenger demand positively. Last, vertical integration reduces passenger demand.
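The within (fixed-effects) estimator used in panel studies like this one can be sketched on synthetic data (the panel below is randomly generated, not the OECD dataset):

```python
import numpy as np

# Demean each country's series over time, then run OLS on the demeaned data;
# time-invariant country effects drop out of the regression.
rng = np.random.default_rng(0)
n_countries, n_years = 30, 24

country_effect = rng.normal(0, 1, n_countries)       # unobserved heterogeneity
x = rng.normal(0, 1, (n_countries, n_years))         # e.g. a regulatory index
beta_true = 0.5
y = beta_true * x + country_effect[:, None] + rng.normal(0, 0.1, (n_countries, n_years))

# Within transformation: subtract each country's time mean.
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)

beta_hat = (x_dm * y_dm).sum() / (x_dm ** 2).sum()
print(round(beta_hat, 3))  # close to 0.5: the fixed effects are absorbed
```

The Hausman-Taylor estimator extends this idea by instrumenting for time-invariant regressors, which the within transformation would otherwise wipe out.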


1982 ◽  
Vol 14 (10) ◽  
pp. 1341-1354 ◽  
Author(s):  
K E Haynes ◽  
F Y Phillips

Mathematical programming and statistical inference are combined in a constrained minimum discrimination information (MDI) method to provide a basis for a wide range of spatial and individual choice behavior problems. This approach offers an alternative to linear and loglinear regression estimation methods as well as probabilistic models of the logit and probit variety. Some logical and computational difficulties inherent in these approaches are resolved. Further, the approach leads endogenously to alternative hypotheses if the null hypothesis is rejected, and hence has implications for the interaction between research that is oriented toward theory construction and applied research that is empirically oriented.
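A minimal sketch of what a constrained MDI problem looks like, assuming a single linear moment constraint (the prior, feature, and target values below are invented, not from the paper):

```python
import numpy as np

# Constrained minimum discrimination information (MDI): find p minimizing
# KL(p || q) subject to E_p[f] = target. The minimizer has the exponential
# form p_i ∝ q_i * exp(lam * f_i); we solve for lam by bisection, since the
# moment E_p[f] is monotone increasing in lam.
q = np.array([0.4, 0.3, 0.2, 0.1])    # prior choice probabilities (assumed)
f = np.array([1.0, 2.0, 3.0, 4.0])    # feature, e.g. travel cost of each option
target = 2.5                          # required expected feature value

def tilted(lam):
    w = q * np.exp(lam * f)
    return w / w.sum()

lo, hi = -50.0, 50.0
for _ in range(100):                  # bisection on the monotone moment map
    mid = 0.5 * (lo + hi)
    if tilted(mid) @ f < target:
        lo = mid
    else:
        hi = mid

p = tilted(0.5 * (lo + hi))
print(np.round(p, 4), round(float(p @ f), 4))  # the moment matches the target
```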


Author(s):  
Nicolas Greige ◽  
Bryce Liu ◽  
David Nash ◽  
Katie E. Weichman ◽  
Joseph A. Ricci

Abstract Background Accurate flap weight estimation is crucial for preoperative planning in microsurgical breast reconstruction; however, current flap weight estimation methods are time consuming. It was our objective to develop a parsimonious and accurate formula for the estimation of abdominal-based free flap weight. Methods Patients who underwent hemi-abdominal-based free tissue transfer for breast reconstruction at a single institution were retrospectively reviewed. Subcutaneous tissue thicknesses were measured on axial computed tomography angiograms at several predetermined points. Multivariable linear regression was used to generate the parsimonious flap weight estimation model. Split-sample validation was used for internal validation. Results A total of 132 patients (196 flaps) were analyzed, with a mean body mass index of 31.2 ± 4.0 kg/m2 (range: 22.6–40.7). The mean intraoperative flap weight was 990 ± 344 g (range: 368–2,808). The full predictive model (R2 = 0.68) estimated flap weight using the equation 91.3x + 36.4y + 6.2z − 1030.0, where x is subcutaneous tissue thickness (cm) 5 cm lateral to midline at the level of the anterior superior iliac spine (ASIS), y is distance (cm) between the skin overlying each ASIS, and z is patient weight (kg). Two-thirds split-sample validation was performed using 131 flaps to build a model and the remaining 65 flaps for validation. Upon validation, we observed a median percent error of 10.2% (interquartile range [IQR]: 4.5–18.5) and a median absolute error of 108.6 g (IQR: 45.9–170.7). Conclusion We developed and internally validated a simple and accurate formula for the preoperative estimation of hemi-abdominal-based free flap weight for breast reconstruction.
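The published equation can be wrapped directly in a small helper (the input measurements below are hypothetical patient values, for illustration only):

```python
def estimate_flap_weight_g(tissue_thickness_cm, asis_distance_cm, patient_weight_kg):
    """Flap weight formula from the abstract:
    weight (g) = 91.3*x + 36.4*y + 6.2*z - 1030.0, where
    x = subcutaneous tissue thickness 5 cm lateral to midline at the ASIS level (cm),
    y = distance between the skin overlying each ASIS (cm),
    z = patient weight (kg)."""
    return (91.3 * tissue_thickness_cm
            + 36.4 * asis_distance_cm
            + 6.2 * patient_weight_kg
            - 1030.0)

# Hypothetical measurements: 3.0 cm thickness, 28.0 cm ASIS distance, 80 kg patient.
print(round(estimate_flap_weight_g(3.0, 28.0, 80.0), 1))  # → 759.1
```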


Author(s):  
Aravindhan K

Cost estimation of software projects is a risky task in the project management field. It is the process of predicting the cost and effort required to develop a software application. Several cost estimation models have been proposed over the last thirty to forty years. Many software companies track and analyse the current project by measuring the planned cost and estimating the accuracy. If the estimation is not done properly, it can lead to the failure of the project. One of the challenging tasks in project management is evaluating the different cost estimation models and selecting the proper model for the current project. This paper summarizes the different cost estimation models and their techniques. It also provides guidance on proper model selection for different types of projects.
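As one concrete instance of the models such a survey covers, Basic COCOMO estimates effort as a power law of project size (the coefficients are Boehm's published organic-mode values; the 32 KLOC input is illustrative):

```python
def cocomo_basic_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort in person-months for an 'organic' (small,
    in-house) project: effort = a * KLOC**b. Semi-detached and embedded
    projects use larger published coefficients."""
    return a * kloc ** b

# Effort estimate for a hypothetical 32 KLOC organic project.
print(round(cocomo_basic_effort(32.0), 1))  # ≈ 91.3 person-months
```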


2018 ◽  
Author(s):  
Fabien Maussion ◽  
Anton Butenko ◽  
Julia Eis ◽  
Kévin Fourteau ◽  
Alexander H. Jarosch ◽  
...  

Abstract. Despite their importance for sea-level rise, seasonal water availability, and as a source of geohazards, mountain glaciers are one of the few remaining sub-systems of the global climate system for which no globally applicable, open source, community-driven model exists. Here we present the Open Global Glacier Model (OGGM, http://www.oggm.org), developed to provide a modular and open source numerical model framework for simulating past and future change of any glacier in the world. The modelling chain comprises data downloading tools (glacier outlines, topography, climate, validation data), a preprocessing module, a mass-balance model, a distributed ice thickness estimation model, and an ice flow model. The monthly mass-balance is obtained from gridded climate data and a temperature index melt model. To our knowledge, OGGM is the first global model explicitly simulating glacier dynamics: the model relies on the shallow ice approximation to compute the depth-integrated flux of ice along multiple connected flowlines. In this paper, we describe and illustrate each processing step by applying the model to a selection of glaciers before running global simulations under idealized climate forcings. Even without an in-depth calibration, the model shows a very realistic behaviour. We are able to reproduce earlier estimates of global glacier volume by varying the ice dynamical parameters within a range of plausible values. At the same time, the increased complexity of OGGM compared to other prevalent global glacier models comes at a reasonable computational cost: several dozen glaciers can be simulated on a personal computer, while global simulations realized in a supercomputing environment take up to a few hours per century. Thanks to the modular framework, modules of various complexity can be added to the codebase, allowing new kinds of model intercomparisons to be run in a controlled environment. 
Future developments will add new physical processes to the model as well as tools to calibrate the model in a more comprehensive way. OGGM spans a wide range of applications, from ice-climate interaction studies at millennial time scales to estimates of the contribution of glaciers to past and future sea-level change. It has the potential to become a self-sustained, community-driven model for global and regional glacier evolution.


Author(s):  
О.Г. ПОНОМАРЕВ ◽  
М. АСАФ

Keywords: 5G-NR, CP-OFDM, synchronization, sample clock offset, PUSCH. The paper investigates the issue of sampling clock offset (SCO) in fifth-generation New Radio systems. Because SCO estimation methods are imperfect, correction methods that rely on an SCO estimate are imperfect as well, so the proposed method directly corrects the distortion that the offset introduces into the transmitted signal, without using any kind of estimation method. The method is designed to correct signals in the physical uplink shared channel (PUSCH) and uses reference signals as recommended by the 3rd Generation Partnership Project (3GPP) standards. The results of numerical simulation show that the proposed method increases the efficiency of data transmission over a multipath radio channel by more than 15% across a wide range of signal-to-noise ratio values.
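The idea of correcting directly from reference signals, rather than from an explicit SCO estimate, can be sketched as follows (a simplified single-symbol model with invented parameters; this is not the authors' PUSCH algorithm):

```python
import numpy as np

# A sampling clock offset rotates each OFDM subcarrier k by a phase that
# grows with k. With known reference symbols on the same subcarriers, the
# receiver can measure that rotation directly and derotate the data symbols,
# without ever estimating the offset itself.
rng = np.random.default_rng(1)
n_sc = 12                                              # subcarriers in one resource block
ref = np.exp(1j * rng.uniform(0, 2 * np.pi, n_sc))     # known reference symbols

sco_phase_slope = 0.02                                 # unknown to the receiver
k = np.arange(n_sc)
sco_rotation = np.exp(1j * sco_phase_slope * k)        # per-subcarrier distortion
received_ref = ref * sco_rotation                      # received pilots

# Direct correction: the rotation measured on the pilots is applied
# inversely to the data symbols on the same subcarriers.
measured = received_ref / ref
data = np.exp(1j * rng.uniform(0, 2 * np.pi, n_sc))
received_data = data * sco_rotation
corrected = received_data / measured

print(np.max(np.abs(corrected - data)))                # ~0: distortion removed
```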


2019 ◽  
Vol 12 (3) ◽  
pp. 909-931 ◽  
Author(s):  
Fabien Maussion ◽  
Anton Butenko ◽  
Nicolas Champollion ◽  
Matthias Dusch ◽  
Julia Eis ◽  
...  

Abstract. Despite their importance for sea-level rise, seasonal water availability, and as a source of geohazards, mountain glaciers are one of the few remaining subsystems of the global climate system for which no globally applicable, open source, community-driven model exists. Here we present the Open Global Glacier Model (OGGM), developed to provide a modular and open-source numerical model framework for simulating past and future change of any glacier in the world. The modeling chain comprises data downloading tools (glacier outlines, topography, climate, validation data), a preprocessing module, a mass-balance model, a distributed ice thickness estimation model, and an ice-flow model. The monthly mass balance is obtained from gridded climate data and a temperature index melt model. To our knowledge, OGGM is the first global model to explicitly simulate glacier dynamics: the model relies on the shallow-ice approximation to compute the depth-integrated flux of ice along multiple connected flow lines. In this paper, we describe and illustrate each processing step by applying the model to a selection of glaciers before running global simulations under idealized climate forcings. Even without an in-depth calibration, the model shows very realistic behavior. We are able to reproduce earlier estimates of global glacier volume by varying the ice dynamical parameters within a range of plausible values. At the same time, the increased complexity of OGGM compared to other prevalent global glacier models comes at a reasonable computational cost: several dozen glaciers can be simulated on a personal computer, whereas global simulations realized in a supercomputing environment take up to a few hours per century. Thanks to the modular framework, modules of various complexity can be added to the code base, which allows for new kinds of model intercomparison studies in a controlled environment. 
Future developments will add new physical processes to the model as well as automated calibration tools. Extensions or alternative parameterizations can be easily added by the community thanks to comprehensive documentation. OGGM spans a wide range of applications, from ice–climate interaction studies at millennial timescales to estimates of the contribution of glaciers to past and future sea-level change. It has the potential to become a self-sustained community-driven model for global and regional glacier evolution.
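The temperature index melt model at the core of the mass-balance step can be sketched as follows (the melt factor, threshold, and input series below are illustrative values, not OGGM's calibrated parameters):

```python
# Monthly mass balance = solid precipitation minus temperature-driven melt,
# where melt is proportional to temperature above a threshold.
melt_factor = 6.5     # mm w.e. per degC per month (assumed value)
t_melt = -1.0         # degC threshold above which melt occurs (assumed value)

monthly_temp = [-8.0, -5.0, 0.5, 4.0, 9.0, 12.0]             # degC at glacier elevation
monthly_solid_precip = [120.0, 100.0, 60.0, 20.0, 5.0, 0.0]  # mm w.e.

mass_balance = [
    p - melt_factor * max(t - t_melt, 0.0)
    for t, p in zip(monthly_temp, monthly_solid_precip)
]
# Positive in cold, snowy months; increasingly negative as melt dominates.
print([round(mb, 1) for mb in mass_balance])
```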


Author(s):  
Jun Liu ◽  
Rui Zhang ◽  
Shihao Hou

Perceiving the distance between vehicles is a crucial issue for advanced driving assistance systems. However, most vision-based distance estimation methods do not consider the influence of changes in camera attitude angles during driving, or they use only the vanishing point detected from lane lines to correct the pitch angle. This paper proposes an improved pinhole distance estimation model based on the road vanishing point that does not require lane line information. First, the road vanishing point is detected from the dominant texture orientation, and the yaw and pitch angles of the camera are estimated. Then, a distance estimation model incorporating attitude angle compensation is established. Finally, the experimental results show that the proposed method can effectively correct the influence of the camera attitude angle on the distance estimation results.
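The pinhole geometry with pitch compensation from the vanishing point can be sketched as follows (camera parameters and pixel coordinates are invented for illustration; this is not the paper's full model, which also handles yaw):

```python
import math

# For a pinhole camera at known height, a ground point imaged at image row
# r lies at distance h / tan(depression angle). The depression angle is the
# ray's angle from the optical axis plus the camera pitch, which can be
# recovered from the row of the road vanishing point (the horizon).
focal_px = 1000.0      # focal length in pixels (assumed)
cy = 360.0             # principal point row, image center (assumed)
cam_height_m = 1.5     # camera height above the road (assumed)

# A vanishing point above the image center implies a downward pitch.
vanish_row = 340.0
pitch = math.atan((cy - vanish_row) / focal_px)

def distance_to_ground_point(row_px):
    """Distance along the road to a ground point imaged at row_px."""
    angle_below_axis = math.atan((row_px - cy) / focal_px)
    return cam_height_m / math.tan(angle_below_axis + pitch)

print(round(distance_to_ground_point(500.0), 2))
```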

