Fixed-Effects Vector Decomposition: Properties, Reliability, and Instruments

2011 ◽  
Vol 19 (2) ◽  
pp. 147-164 ◽  
Author(s):  
Thomas Plümper ◽  
Vera E. Troeger

This article reinforces our 2007 Political Analysis publication in demonstrating that the fixed-effects vector decomposition (FEVD) procedure outperforms any other estimator in estimating models that suffer from the simultaneous presence of time-varying variables correlated with unobserved unit effects and time-invariant variables. We compare the finite-sample properties of FEVD not only to the Hausman-Taylor estimator but also to the pretest estimator and the shrinkage estimator suggested in this symposium by Breusch, Ward, Nguyen, and Kompas (BWNK) and by Greene. Moreover, we correct Greene's and BWNK's discussion of FEVD's asymptotic and finite-sample properties.

2011 ◽  
Vol 19 (2) ◽  
pp. 123-134 ◽  
Author(s):  
Trevor Breusch ◽  
Michael B. Ward ◽  
Hoa Thi Minh Nguyen ◽  
Tom Kompas

This paper analyzes the properties of the fixed-effects vector decomposition estimator, an emerging and popular technique for estimating time-invariant variables in panel data models with group effects. This estimator was initially motivated on heuristic grounds, and advocated on the strength of favorable Monte Carlo results, but with no formal analysis. We show that the three-stage procedure of this decomposition is equivalent to a standard instrumental variables approach, for a specific set of instruments. The instrumental variables representation facilitates the present formal analysis that finds: (1) The estimator reproduces exactly classical fixed-effects estimates for time-varying variables. (2) The standard errors recommended for this estimator are too small for both time-varying and time-invariant variables. (3) The estimator is inconsistent when the time-invariant variables are endogenous. (4) The reported sampling properties in the original Monte Carlo evidence do not account for the presence of a group effect. (5) The decomposition estimator has higher risk than existing shrinkage approaches, unless the endogeneity problem is known to be small or no relevant instruments exist.


2007 ◽  
Vol 15 (2) ◽  
pp. 124-139 ◽  
Author(s):  
Thomas Plümper ◽  
Vera E. Troeger

This paper suggests a three-stage procedure for the estimation of time-invariant and rarely changing variables in panel data models with unit effects. The first stage of the proposed estimator runs a fixed-effects model to obtain the unit effects, the second stage breaks down the unit effects into a part explained by the time-invariant and/or rarely changing variables and an error term, and the third stage reestimates the first stage by pooled OLS (with or without autocorrelation correction and with or without panel-corrected SEs) including the time-invariant variables plus the error term of stage 2, which then accounts for the unexplained part of the unit effects. We use Monte Carlo simulations to compare the finite sample properties of our estimator to the finite sample properties of competing estimators. In doing so, we demonstrate that our proposed technique provides the most reliable estimates under a wide variety of specifications common to real world data.
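The three stages can be sketched directly in code. Below is a minimal NumPy illustration on simulated data (the variable names, dimensions, and data-generating process are all illustrative assumptions, not taken from the paper); as a by-product, it exhibits the algebraic fact noted elsewhere in this symposium that the stage-3 pooled OLS coefficient on the time-varying regressor reproduces the stage-1 within estimate exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 30, 5

# Illustrative panel: one time-varying regressor x, one time-invariant z
z = rng.normal(size=N)                      # time-invariant variable
alpha = 0.8 * z + rng.normal(size=N)        # unit effects, partly explained by z
x = rng.normal(size=(N, T)) + alpha[:, None]
y = 2.0 * x + 1.5 * z[:, None] + alpha[:, None] + rng.normal(size=(N, T))

# Stage 1: within (fixed-effects) estimate for x, and the implied unit effects
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
beta_fe = (xd.ravel() @ yd.ravel()) / (xd.ravel() @ xd.ravel())
u = y.mean(axis=1) - beta_fe * x.mean(axis=1)

# Stage 2: decompose the unit effects into a part explained by z and a residual h
Z1 = np.column_stack([np.ones(N), z])
gamma = np.linalg.lstsq(Z1, u, rcond=None)[0]
h = u - Z1 @ gamma

# Stage 3: pooled OLS of y on x, z, and the stage-2 residual h
X3 = np.column_stack([
    np.ones(N * T),
    x.ravel(),
    np.repeat(z, T),   # expand unit-level variables to the panel
    np.repeat(h, T),
])
b3 = np.linalg.lstsq(X3, y.ravel(), rcond=None)[0]

# The stage-3 coefficient on x equals the within estimate, and the
# coefficient on h is exactly 1 (BWNK's point (1) in this symposium).
print(np.allclose(b3[1], beta_fe), np.allclose(b3[3], 1.0))  # prints: True True
```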


2011 ◽  
Vol 19 (2) ◽  
pp. 135-146 ◽  
Author(s):  
William Greene

Plümper and Troeger (2007) propose a three-step procedure for the estimation of a fixed effects (FE) model that, it is claimed, “provides the most reliable estimates under a wide variety of specifications common to real world data.” Their fixed effects vector decomposition (FEVD) estimator is startlingly simple, involving three steps, each requiring nothing more than ordinary least squares (OLS). Large gains in efficiency are claimed for cases of time-invariant and slowly time-varying regressors. A subsequent literature has compared the estimator to other estimators of FE models, including the estimator of Hausman and Taylor (1981), also (apparently) with impressive gains in efficiency. The article also claims to provide an efficient estimator for parameters on time-invariant variables (TIVs) in the FE model. None of the claims are correct. The FEVD estimator simply reproduces (identically) the linear FE (dummy variable) estimator, then substitutes an inappropriate covariance matrix for the correct one. The consistency result follows from the fact that OLS in the FE model is consistent. The “efficiency” gains are illusory. The claim that the estimator provides an estimator for the coefficients on TIVs in an FE model is also incorrect. That part of the parameter vector remains unidentified. The “estimator” relies upon a strong assumption that turns the FE model into a type of random effects model.


2014 ◽  
Vol 20 (4) ◽  
pp. 585-597 ◽  
Author(s):  
Ximena Dueñas ◽  
Paola Palacios ◽  
Blanca Zuluaga

This document explores the expulsion and reception determinants of displaced people among Colombian municipalities. For this purpose, we use fixed-effects panel data estimations for the period 2004–2009, with the municipality-year as the unit of analysis. To the best of our knowledge, this is the first paper in Colombia that focuses on reception and the first one using panel data at the municipal level to explain expulsion and reception. We find that, contrary to what one may expect, some independent variables affect both expulsion and reception of displaced people in the same direction; for instance, municipalities with high homicide rates and high conflict intensity exhibit both higher reception and higher expulsion rates. In addition to the conventional panel data estimation, we also run a fixed-effects vector decomposition to identify the explicit effects of certain time-invariant variables.


2018 ◽  
Vol 27 (1) ◽  
pp. 21-45 ◽  
Author(s):  
Thomas Plümper ◽  
Vera E. Troeger

The fixed-effects estimator is biased in the presence of dynamic misspecification and omitted within variation correlated with one of the regressors. We argue and demonstrate that fixed-effects estimates can amplify the bias from dynamic misspecification and that, with omitted time-invariant variables and dynamic misspecifications, the fixed-effects estimator can be more biased than the ‘naïve’ OLS model. We also demonstrate that the Hausman test does not reliably identify the least biased estimator when time-invariant and time-varying omitted variables or dynamic misspecifications exist. Accordingly, empirical researchers are ill-advised to rely on the Hausman test for model selection or to use the fixed-effects model as the default unless they can convincingly justify the assumption of correctly specified dynamics. Our findings caution applied researchers not to overlook the potential drawbacks of relying on the fixed-effects estimator as a default. The results presented here also call upon methodologists to study the properties of estimators in the presence of multiple model misspecifications. Our results suggest that scholars ought to devote much more attention to modeling dynamics appropriately instead of relying on a default solution before they control for potentially omitted variables with constant effects using a fixed-effects specification.


2017 ◽  
Vol 47 (1) ◽  
pp. 182-211 ◽  
Author(s):  
Arvid Sjölander

A popular way to reduce confounding in observational studies is to use each study participant as his or her own control. This is possible when both the exposure and the outcome are time varying and have been measured at several time points for each individual. The case-time-control method is a special case, which, under certain assumptions, allows the analyst to control for confounding by time-varying covariates, while controlling for all time-stationary characteristics of the study participants. There are two formulations of the case-time-control method. One formulation requires that the exposure be binary, and the other requires that there be no more than two time points per individual. In this article the author proposes a generalization of the case-time-control method for nonbinary exposures and an arbitrary number of time points. The author derives the asymptotic properties of the resulting estimator and assesses its finite sample properties in a simulation study.


2001 ◽  
Vol 9 (4) ◽  
pp. 379-384 ◽  
Author(s):  
Ethan Katz

Fixed-effects logit models can be useful in panel data analysis, when N units have been observed for T time periods. There are two main estimators for such models: unconditional maximum likelihood and conditional maximum likelihood. Judged on asymptotic properties, the conditional estimator is superior. However, the unconditional estimator holds several practical advantages, and therefore I sought to determine whether its use could be justified on the basis of finite-sample properties. In a series of Monte Carlo experiments for T < 20, I found a negligible amount of bias in both estimators when T ≥ 16, suggesting that a researcher can safely use either estimator under such conditions. When T < 16, the conditional estimator continued to have a very small amount of bias, but the unconditional estimator developed more bias as T decreased.


2016 ◽  
Vol 33 (4) ◽  
pp. 791-838 ◽  
Author(s):  
Ulrich Hounyo ◽  
Sílvia Gonçalves ◽  
Nour Meddahi

The main contribution of this paper is to propose a bootstrap method for inference on integrated volatility based on the pre-averaging approach, where the pre-averaging is done over all possible overlapping blocks of consecutive observations. The overlapping nature of the pre-averaged returns implies that the leading martingale part in the pre-averaged returns is k_n-dependent, with k_n growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the “blocks of blocks” bootstrap method is not valid when volatility is time-varying. The failure of the blocks of blocks bootstrap is due to the heterogeneity of the squared pre-averaged returns when volatility is stochastic. To preserve both the dependence and the heterogeneity of squared pre-averaged returns, we propose a novel procedure that combines the wild bootstrap with the blocks of blocks bootstrap. We provide a proof of the first-order asymptotic validity of this method for percentile and percentile-t intervals. Our Monte Carlo simulations show that the wild blocks of blocks bootstrap improves the finite sample properties of the existing first-order asymptotic theory. We provide an empirical illustration of its use in practice.


2011 ◽  
Vol 19 (2) ◽  
pp. 119-122 ◽  
Author(s):  
Nathaniel Beck

What follows is a longish controversy (two critiques, a reply and two rejoinders) over the quality of the estimates and associated SEs provided by Plümper and Troeger's (2007) “fixed-effect vector decomposition” (FEVD) procedure; Plümper and Troeger (PT) will refer to that article and not any persons. My role is to lay out some issues that separate the authors rather than to adjudicate between them. As with many controversies, a bit of heat is generated along with some light. Readers care a bit less than the authors about what was said when, but they do care a lot about what appropriate method to use when a panel data model has both unit-specific intercepts and variables that are invariant over a unit. Thus, I also take it upon myself to discuss some things that I gleaned from this controversy; this discussion has a bit less heat than what follows, but of course readers should judge the evidence for themselves.


Econometrics ◽  
2019 ◽  
Vol 7 (3) ◽  
pp. 29 ◽  
Author(s):  
Emanuela Ciapanna ◽  
Marco Taboga

This paper deals with instability in regression coefficients. We propose a Bayesian regression model with time-varying coefficients (TVC) that allows us to jointly estimate the degree of instability and the time path of the coefficients. Because the model is computationally tractable and fully automatic, we are able to run Monte Carlo experiments and analyze its finite-sample properties. We find that the estimation precision and the forecasting accuracy of the TVC model compare favorably to those of other methods commonly employed to deal with parameter instability. A distinguishing feature of the TVC model is its robustness to misspecification: its performance is also satisfactory when regression coefficients are stable or when they experience discrete structural breaks. As a demonstrative application, we used our TVC model to estimate the exposures of S&P 500 stocks to market-wide risk factors: we found that a vast majority of stocks had time-varying exposures, and the TVC model helped to better forecast these exposures.
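As a rough illustration of the general idea (a generic time-varying-coefficient regression, not the authors' Bayesian model), the coefficient can be treated as a random-walk state and recovered with a Kalman filter. In the sketch below the noise variances are assumed known, whereas in practice they would themselves be estimated from the data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Simulate a regression whose coefficient drifts as a random walk
beta_true = 1.0 + np.cumsum(0.05 * rng.normal(size=n))
x = rng.normal(size=n)
y = beta_true * x + 0.3 * rng.normal(size=n)

# Kalman filter for the state-space form:
#   y_t = x_t * beta_t + eps_t,   beta_t = beta_{t-1} + eta_t
q, r = 0.05**2, 0.3**2        # state and observation noise variances (assumed known)
b, p = 0.0, 10.0              # diffuse-ish initial state mean and variance
b_filt = np.empty(n)
for t in range(n):
    p = p + q                              # predict: variance of beta_t given past
    k = p * x[t] / (x[t] ** 2 * p + r)     # Kalman gain
    b = b + k * (y[t] - x[t] * b)          # update with the prediction error
    p = (1.0 - k * x[t]) * p
    b_filt[t] = b

# After a burn-in, the filtered path tracks the drifting coefficient
err = np.mean(np.abs(b_filt[50:] - beta_true[50:]))
```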

