Maximizing Intervention Effectiveness

2020 ◽  
Vol 66 (12) ◽  
pp. 5576-5598
Author(s):  
Vishal Gupta ◽  
Brian Rongqing Han ◽  
Song-Hee Kim ◽  
Hyung Paek

Frequently, policy makers seek to roll out an intervention previously proven effective in a research study, perhaps subject to resource constraints. However, because different subpopulations may respond differently to the same treatment, there is no a priori guarantee that the intervention will be as effective in the targeted population as it was in the study. How then should policy makers target individuals to maximize intervention effectiveness? We propose a novel robust optimization approach that leverages evidence typically available in a published study. Our model can be easily optimized in minutes for realistic instances with off-the-shelf software and is flexible enough to accommodate a variety of resource and fairness constraints. We compare our approach with current practice by proving performance guarantees for both approaches, which emphasize their structural differences. We also prove an intuitive interpretation of our model in terms of regularization, penalizing differences in the demographic distribution between targeted individuals and the study population. Although the precise penalty depends on the choice of uncertainty set, we show that for special cases we can recover classical penalties from the covariate matching literature on causal inference. Finally, using real data from a large teaching hospital, we compare our approach to common practice in the particular context of reducing emergency department utilization by Medicaid patients through case management. We find that our approach can offer significant benefits over common practice, particularly when the heterogeneity in patient response to the treatment is large. This paper was accepted by Chung-Piaw Teo, optimization.
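The targeting model itself is not reproduced in the abstract, but its regularization interpretation (penalizing differences between the demographic distribution of the targeted individuals and that of the study population) can be sketched in a few lines. The following brute-force toy is only an illustration of that trade-off, not the paper's robust optimization model; the effect estimates, group labels, budget, and the L1 distance penalty are all assumptions made for the example.

```python
from itertools import combinations

def target_individuals(effects, groups, study_dist, budget, lam=1.0):
    """Pick up to `budget` individuals maximizing estimated total effect
    minus a penalty on demographic mismatch with the study population.

    effects    -- estimated treatment effect per individual (illustrative)
    groups     -- demographic group label per individual
    study_dist -- group -> proportion in the original study
    lam        -- penalty weight (larger -> match study demographics closer)
    """
    n = len(effects)
    best_set, best_val = (), float("-inf")
    for k in range(1, budget + 1):
        for subset in combinations(range(n), k):
            # empirical demographic distribution of the targeted subset
            dist = {g: 0.0 for g in study_dist}
            for i in subset:
                dist[groups[i]] += 1.0 / k
            # L1 distance to the study population's demographics
            penalty = sum(abs(dist[g] - study_dist[g]) for g in study_dist)
            val = sum(effects[i] for i in subset) - lam * penalty
            if val > best_val:
                best_set, best_val = subset, val
    return best_set, best_val

# Toy instance: two demographic groups, budget of 3 slots.
effects = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]
groups = ["A", "A", "A", "B", "B", "B"]
study_dist = {"A": 0.5, "B": 0.5}
chosen, value = target_individuals(effects, groups, study_dist, budget=3)
```

With `lam=0` the rule reduces to "treat the largest estimated effects"; a positive `lam` pulls the targeted set's demographics toward the study's, trading raw effect for transferability, which is the structural contrast with common practice that the abstract emphasizes.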

2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Camilo Broc ◽  
Therese Truong ◽  
Benoit Liquet

Abstract Background The increasing number of genome-wide association studies (GWAS) has revealed several loci that are associated with multiple distinct phenotypes, suggesting the existence of pleiotropic effects. Highlighting these cross-phenotype genetic associations could help to identify and understand the common biological mechanisms underlying some diseases. Common approaches test the association between genetic variants and multiple traits at the SNP level. In this paper, we propose a novel gene- and pathway-level approach for the case where several independent GWAS on independent traits are available. The method is based on a generalization of sparse group Partial Least Squares (sgPLS) that takes groups of variables into account, with a Lasso penalization linking all the independent data sets. This method, called joint-sgPLS, convincingly detects signal at both the variable level and the group level. Results Our method has the advantage of yielding a globally interpretable model while respecting the architecture of the data. It can outperform traditional methods and provides wider insight in terms of a priori information. We compared the performance of the proposed method with benchmark methods on simulated data, and we give an example of application to real data with the aim of highlighting common susceptibility variants for breast and thyroid cancers. Conclusion The joint-sgPLS shows interesting properties for signal detection. As an extension of PLS, the method is suited to data with a large number of variables. The Lasso penalization accommodates architectures of variable groups and observation sets. Furthermore, although the method has been applied to a genetic study, its formulation is suited to any data in other application fields that have a large number of variables and a known a priori group structure.
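Sparse group penalties of the kind joint-sgPLS generalizes combine a group-level threshold (which can zero out a whole gene or pathway) with a variable-level Lasso threshold on the loading vector. The sketch below shows only that thresholding step under assumed penalty weights; it is not the joint-sgPLS algorithm itself, and the group partition and loadings are invented.

```python
import numpy as np

def sparse_group_threshold(w, groups, lam_group, lam_var):
    """Apply group-level, then variable-level, soft thresholding to a
    PLS-style loading vector w (a sketch of a sparse-group penalty step).

    groups -- list of index arrays, one per predefined variable group
    """
    out = np.zeros_like(w, dtype=float)
    for idx in groups:
        sub = w[idx].astype(float)
        norm = np.linalg.norm(sub)
        # group soft threshold: shrink, zeroing the whole group if small
        scale = max(0.0, 1.0 - lam_group / norm) if norm > 0 else 0.0
        sub = scale * sub
        # variable-level (Lasso) soft threshold within surviving groups
        sub = np.sign(sub) * np.maximum(np.abs(sub) - lam_var, 0.0)
        out[idx] = sub
    return out

# Two groups of two variables: a strong group and a weak one.
w = np.array([3.0, 4.0, 0.1, -0.2])
groups = [np.array([0, 1]), np.array([2, 3])]
out = sparse_group_threshold(w, groups, lam_group=1.0, lam_var=0.5)
```

The weak group's norm falls below the group penalty, so it is removed entirely, while individual variables in the surviving group are still shrunk toward zero: sparsity at the group level and at the variable level simultaneously, which is the property the abstract claims for signal detection.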


2016 ◽  
Vol 2016 ◽  
pp. 1-15
Author(s):  
N. Vanello ◽  
E. Ricciardi ◽  
L. Landini

Independent component analysis (ICA) of functional magnetic resonance imaging (fMRI) data can be employed as an exploratory method. The ICA model's lack of strong a priori assumptions about the signal or the noise, however, makes the results difficult to interpret. Moreover, the statistical independence of the components is only approximate: residual dependencies among the components can reveal informative structure in the data. A major problem is model order selection, that is, choosing the number of components to extract; in particular, overestimation may lead to component splitting. In this work, a method based on hierarchical clustering of ICA components from fMRI datasets is investigated. The clustering algorithm uses a metric based on the mutual information between the ICs. To estimate this similarity measure, a histogram-based technique and one based on kernel density estimation are tested on simulated datasets. Simulation results indicate that the method can cluster components that are related to the same task and that result from a splitting process occurring at different model orders. The similarity measures performed differently, and these differences are discussed. Preliminary results on real data are reported and show that the method can group task-related and transiently task-related components.
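The histogram-based similarity measure mentioned above can be illustrated directly: a 2D histogram gives a plug-in estimate of the mutual information between two component time courses, which is then usable as a clustering metric. This is a generic sketch, not the authors' implementation; the bin count and the test signals are arbitrary choices.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based plug-in estimate (in nats) of the mutual
    information between two component time courses."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0                               # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Dependent pair (split-like components) vs. an independent pair.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + 0.1 * rng.normal(size=5000)   # strongly dependent on x
z = rng.normal(size=5000)             # independent of x
mi_xy = mutual_information(x, y)
mi_xz = mutual_information(x, z)
```

Components produced by splitting a single source remain mutually dependent, so pairs like (x, y) score much higher than unrelated pairs like (x, z), which is what lets hierarchical clustering regroup them.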


2015 ◽  
Vol 2015 ◽  
pp. 1-13
Author(s):  
Jianwei Ding ◽  
Yingbo Liu ◽  
Li Zhang ◽  
Jianmin Wang

Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main surveillance task is to analyze these routinely collected telemetry data in order to understand the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all of it without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which simulates how event sequence data are generated and depicts the working condition of equipment at runtime. With the help of the WCM, we are able to analyze how event sequence data behave in different working modes and, at the same time, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied the WCM to illustrative applications such as automated detection of anomalous event sequences during equipment runtime. Our experimental results on real data sets demonstrate the effectiveness of the model.
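The abstract does not give the WCM's form, but the diagnosis step (picking the working mode that best explains an observed event sequence) can be illustrated with a much simpler stand-in: independent categorical event probabilities per mode and a maximum-likelihood decision. The mode names and probabilities below are invented for the illustration and are not the paper's model.

```python
import math

def log_likelihood(seq, event_probs):
    """Log-likelihood of an event sequence under independent
    categorical event probabilities (a simplification of the WCM)."""
    return sum(math.log(event_probs[e]) for e in seq)

def diagnose_mode(seq, modes):
    """Working-condition diagnosis as maximum likelihood: return the
    mode whose event distribution best explains the sequence."""
    return max(modes, key=lambda m: log_likelihood(seq, modes[m]))

# Hypothetical modes with their event distributions.
modes = {
    "normal":   {"ok": 0.90, "warn": 0.09, "fail": 0.01},
    "degraded": {"ok": 0.50, "warn": 0.40, "fail": 0.10},
}
mode = diagnose_mode(["ok", "warn", "warn", "fail"], modes)
```

Anomaly detection then falls out of the same machinery: a sequence whose log-likelihood is low under every mode is flagged as anomalous, matching the runtime application described in the abstract.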


1997 ◽  
Vol 49 (2) ◽  
pp. 282-308 ◽  
Author(s):  
A. James Mcadams

Since the fall of the Berlin Wall, the concept of “normalcy” has occupied a prominent place in the pronouncements of Germany's most powerful politicians and policy makers. It has also suffused much of the emerging literature on the domestic and international implications of German unification. Some observers argue that unification embodies the call to normalcy, offering Germany's leaders the opportunity to put their nation's past behind them. Others treat the events of 1989–90 as part of an ongoing challenge to German identity. Finally, a third group of scholars regards the invocation of German unity as an excuse for papering over the crimes of the Nazi past. Although there is no a priori basis for considering any one of these approaches the most appropriate for assessing contemporary German affairs, this does not mean one's choice of terms is totally arbitrary. If German normalcy is to mean anything analytically, it must minimally represent an attainable and worthy goal to which the leaders of the Federal Republic can aspire in their efforts to make Germany more like other European states.


Geophysics ◽  
2004 ◽  
Vol 69 (4) ◽  
pp. 978-993 ◽  
Author(s):  
Jo Eidsvik ◽  
Per Avseth ◽  
Henning Omre ◽  
Tapan Mukerji ◽  
Gary Mavko

Reservoir characterization must be based on information from various sources. Well observations, seismic reflection times, and seismic amplitude versus offset (AVO) attributes are integrated in this study to predict the distribution of the reservoir variables, i.e., facies and fluid filling. The prediction problem is cast in a Bayesian setting. The a priori model includes spatial coupling through Markov random field assumptions and intervariable dependencies through nonlinear relations based on rock physics theory, including Gassmann's relation. The likelihood model relating observations to reservoir variables (including lithology facies and pore fluids) is based on approximations to Zoeppritz equations. The model assumptions are summarized in a Bayesian network illustrating the dependencies between the reservoir variables. The posterior model for the reservoir variables conditioned on the available observations is defined by the a priori and likelihood models. This posterior model is not analytically tractable but can be explored by Markov chain Monte Carlo (MCMC) sampling. Realizations of reservoir variables from the posterior model are used to predict the facies and fluid‐filling distribution in the reservoir. A maximum a posteriori (MAP) criterion is used in this study to predict facies and pore‐fluid distributions. The realizations are also used to present probability maps for the favorable (sand, oil) occurrence in the reservoir. Finally, the impact of seismic AVO attributes—AVO gradient, in particular—is studied. The approach is demonstrated on real data from a turbidite sedimentary system in the North Sea. AVO attributes on the interface between reservoir and cap rock are extracted from 3D seismic AVO data. The AVO gradient is shown to be valuable in reducing the ambiguity between facies and fluids in the prediction.
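The posterior exploration described above can be miniaturized: a binary facies chain with an Ising-style (1D Markov random field) prior favouring equal neighbours, a Gaussian likelihood tying each cell to an AVO-like attribute, and single-site Metropolis sampling of the posterior. This toy sketch omits the rock physics relations, the Zoeppritz approximation, and the spatial 2D/3D field; every parameter value is an assumption for illustration.

```python
import math
import random

def mcmc_facies(obs, noise_sd=0.5, beta=1.0, means=(0.0, 1.0),
                n_iter=20000, seed=0):
    """Single-site Metropolis sampler for a binary facies chain.
    Prior: 1D MRF rewarding equal neighbours (strength beta).
    Likelihood: Gaussian, facies f has attribute mean means[f].
    Returns the posterior marginal P(facies = 1) for each cell."""
    rng = random.Random(seed)
    n = len(obs)
    state = [0] * n
    counts = [0] * n
    burn = n_iter // 2

    def local_logpost(i, f):
        # Gaussian log-likelihood of the observed attribute given f
        ll = -((obs[i] - means[f]) ** 2) / (2.0 * noise_sd ** 2)
        # MRF prior contribution from the two neighbours
        for j in (i - 1, i + 1):
            if 0 <= j < n and state[j] == f:
                ll += beta
        return ll

    for it in range(n_iter):
        i = rng.randrange(n)
        flip = 1 - state[i]
        # Metropolis accept/reject on the local posterior ratio
        if math.log(1.0 - rng.random()) < local_logpost(i, flip) - local_logpost(i, state[i]):
            state[i] = flip
        if it >= burn:                      # accumulate after burn-in
            for k in range(n):
                counts[k] += state[k]
    return [c / (n_iter - burn) for c in counts]

# Attribute track: first half near facies-0 mean, second half near facies-1.
probs = mcmc_facies([0.05, -0.1, 0.1, 0.9, 1.1, 0.95])
```

The returned marginals play the role of the probability maps mentioned in the abstract, and taking the most probable facies per cell gives a (marginal) MAP-style prediction.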


Author(s):  
Kay Lehman Schlozman ◽  
Sidney Verba ◽  
Henry E. Brady

This chapter maps the terrain of political activity by organizations using systematic empirical data to reveal something about the political voice emerging from organized involvement in various domains of national politics. For various domains of organizational activity, the chapter characterizes categories of organizations with respect to the likelihood that organizations are active and, if active, how much they do. In the process this chapter clarifies the strategic considerations and resource constraints that shape the involvement of different kinds of organizations in different arenas. Here, it becomes apparent that the policy makers in different institutional settings hear quite different mixes of messages.


Symmetry ◽  
2020 ◽  
Vol 12 (3) ◽  
pp. 464
Author(s):  
Victoriano García ◽  
María Martel-Escobar ◽  
F.J. Vázquez-Polo

This paper presents a three-parameter family of distributions which includes the common exponential and the Marshall–Olkin exponential as special cases. The distribution exhibits a monotone failure rate function, which makes it appealing for practitioners interested in reliability and places it in the catalogue of appropriate non-symmetric distributions for modelling such problems, alongside the three-parameter gamma and Weibull families. Given the lack of symmetry of this kind of distribution, various statistical and reliability properties of the model are examined. Numerical examples based on real data reflect the suitability of this distribution for modelling purposes.
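The paper's full three-parameter family is not written out in the abstract, but its two-parameter Marshall–Olkin extended exponential special case, with survival function S(x) = α e^(−λx) / (1 − (1−α) e^(−λx)), has a closed-form hazard that exhibits the monotone failure rate directly. The parameter values below are illustrative.

```python
import math

def mo_exp_hazard(x, lam, alpha):
    """Failure (hazard) rate of the Marshall-Olkin extended exponential
    with rate lam and tilt alpha. alpha = 1 recovers the constant-rate
    exponential; alpha < 1 gives a decreasing rate, alpha > 1 increasing."""
    return lam / (1.0 - (1.0 - alpha) * math.exp(-lam * x))

h_exp = mo_exp_hazard(0.3, 1.0, 1.0)    # exponential special case
```

The hazard runs monotonically from λ/α at x = 0 to λ as x grows, so the sign of α − 1 alone decides whether the failure rate increases or decreases: exactly the kind of monotone behaviour the abstract highlights as useful in reliability.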


2020 ◽  
Vol 12 (18) ◽  
pp. 2923
Author(s):  
Tengfei Zhou ◽  
Xiaojun Cheng ◽  
Peng Lin ◽  
Zhenlun Wu ◽  
Ensheng Liu

Owing to environmental and human factors, as well as the instrument itself, point clouds contain many uncertainties, which directly affect data quality and the accuracy of subsequent processing such as point cloud segmentation and 3D modeling. In this paper, to address this problem, stochastic information of the point cloud coordinates is taken into account, and on the basis of the scanner observation principle within the Gauss–Helmert model, a novel general point-based self-calibration method is developed for terrestrial laser scanners, incorporating both five additional parameters and six exterior orientation parameters. For cases where the instrument accuracy differs from its nominal value, a variance component estimation algorithm is implemented to reweight the outliers once the residual errors of the observations are obtained. Because the proposed method is essentially a nonlinear model, the Gauss–Newton iteration method is applied to derive the solutions for the additional parameters and the exterior orientation parameters. We conducted experiments using simulated and real data and compared the proposed method with two existing methods. The experimental results showed that the proposed method could improve the point accuracy from 10⁻⁴ to 10⁻⁸ (a priori known) and 10⁻⁷ (a priori unknown), and reduced the correlations among the parameters by approximately 60% in volume. However, it is undeniable that some correlations increased instead, which is a limitation of the general method.
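A generic Gauss–Newton iteration of the kind invoked above can be sketched compactly. The toy "calibration" below, recovering a range scale and a zero offset, uses hypothetical parameters standing in for the paper's five additional plus six exterior orientation parameters; it is not the Gauss–Helmert adjustment itself.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, n_iter=20):
    """Generic Gauss-Newton iteration for nonlinear least squares:
    x_{k+1} = x_k - (J^T J)^{-1} J^T r(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r, J = residual(x), jacobian(x)
        x = x - np.linalg.solve(J.T @ J, J.T @ r)
    return x

# Toy self-calibration: observed ranges follow rho_obs = s*rho_true + c,
# where the scale s and zero offset c are the (hypothetical) unknowns.
rho_true = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
rho_obs = 1.001 * rho_true + 0.02

res = lambda x: x[0] * rho_true + x[1] - rho_obs
jac = lambda x: np.column_stack([rho_true, np.ones_like(rho_true)])
s, c = gauss_newton(res, jac, [1.0, 0.0])
```

In the full method each iteration would also carry the stochastic model, with variance component estimation reweighting the observations between iterations; the iteration skeleton, however, is the same.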


2019 ◽  
Vol 32 (1) ◽  
pp. 59-70 ◽  
Author(s):  
Yu-Li Huang ◽  
Sarah M. Bach ◽  
Sherry A. Looker

Purpose The purpose of this paper is to develop a chemotherapy scheduling template that accounts for nurse resource availability and patient treatment needs, in order to alleviate the mid-day patient load and provide quality service for patients. Design/methodology/approach Owing to the complexity of chemotherapy administration, nurses are required at the beginning of treatment, at its end, and during it. When nurses are not available to continue a treatment, service is compromised and the resource constraint is violated, leading to inevitable delays that put service quality at risk. Consequently, an optimization method is used to create a scheduling template that minimizes violations between resource assignments and treatment requirements while leveling the patient load throughout the day. A case study of a typical clinic day is presented to understand current scheduling issues, describe nursing resource constraints, and develop a constraint-based optimization model and leveling algorithm for the final template. Findings The approach is expected to reduce variation in the system by 24 percent and to free five chemotherapy chairs during peak hours. Adjusting staffing levels could further reduce resource constraint violations and yield additional savings in chair occupancy. Actual implementation results indicate a 33 percent reduction in resource constraint violations and positive feedback from nursing staff on workload. Research limitations/implications Other delays, including laboratory tests, physician visits and treatment assignment, are potential research areas. Originality/value The study demonstrates significant improvement in mid-day patient load and in meeting treatment needs using optimization with a unique objective.
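The leveling idea, spreading treatment starts so that chair occupancy stays flat across the day, can be illustrated with a simple greedy heuristic. This is not the authors' constraint-based optimization model, and it ignores nurse availability entirely; the slot granularity and treatment durations are invented.

```python
def build_template(durations, n_slots):
    """Greedy load-leveling sketch: place each treatment (longest first)
    at the start slot that keeps the peak chair occupancy lowest.
    Returns the chosen start slots (in longest-first order) and the
    resulting chair load per slot."""
    load = [0] * n_slots
    starts = []
    for d in sorted(durations, reverse=True):
        best_s, best_peak = 0, float("inf")
        for s in range(n_slots - d + 1):
            # peak occupancy if this treatment started at slot s
            peak = max(load[s + t] + 1 for t in range(d))
            if peak < best_peak:
                best_s, best_peak = s, peak
        for t in range(d):
            load[best_s + t] += 1
        starts.append(best_s)
    return starts, load

# Five treatments of 4, 2, 2, 1, 1 slots spread over a 6-slot day.
starts, load = build_template([4, 2, 2, 1, 1], n_slots=6)
```

Here 10 slot-units of treatment fit into 6 slots with a peak of only 2 occupied chairs, the flattest profile possible; the paper's model pursues the same flattening while additionally scoring violations of nurse-availability constraints.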

