weak identification
Recently Published Documents


TOTAL DOCUMENTS: 64 (five years: 3)
H-INDEX: 15 (five years: 0)

Author(s):  
Bertille Antoine ◽  
Eric Renault

Author(s):  
Sebastian Roché ◽  
Omer Bilen ◽  
Sandrine Astor

The study of the profiles of young adults involved in attacks and bombings in France in 2015 and 2016 highlighted a violent rejection of the Western lifestyle and of national identification. The question arises whether conflicting religious beliefs (the religion hypothesis) and a delinquent subculture (the rebel-without-a-cause hypothesis) characterize only a handful of violent attackers or, rather, reflect social divides in the general youth population. Drawing on the literature, we propose two known features of a pre-radicalization stage: rejection of the national community and justification of political violence. We focus on what explains them in France. For that purpose, we use a large representative sample (n = 9,700) of adolescents and structural equation modeling. Overall, our findings suggest that pre-radicalization reflects larger societal cleavages. Weak identification with the national community in France appears mainly driven by religious identity, not religious fundamentalism. Justification of violence against outgroups and against agents enforcing order is not predicted by religion, either as a belief system or as an identity. The sources of legitimation of violence lie mainly in espousing a delinquent subculture and in repeated exposure to state violence in the form of pretextual police stops.


Econometrica ◽  
2021 ◽  
Vol 89 (2) ◽  
pp. 733-763
Author(s):  
Tetsuya Kaji

We provide a general formulation of weak identification in semiparametric models and an accompanying efficiency concept. Weak identification occurs when a parameter is weakly regular, that is, when it is locally homogeneous of degree zero. When this happens, consistent or equivariant estimation is shown to be impossible. We then show that there exists an underlying regular parameter that fully characterizes the weakly regular parameter. While this parameter is not unique, concepts of sufficiency and minimality help pin down a desirable one. If estimation of a minimal sufficient underlying parameter is inefficient, it introduces noise into the corresponding estimation of the weakly regular parameter, whence we can improve the estimators by local asymptotic Rao–Blackwellization. We call an estimator weakly efficient if it does not admit such improvement. New weakly efficient estimators are presented in linear IV and nonlinear regression models. A simulation of a linear IV model demonstrates how the 2SLS and optimal IV estimators are improved.
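As background for the linear IV simulation mentioned above, here is a minimal sketch (not Kaji's estimator) of how a weak instrument arises in a just-identified linear IV model, together with the standard first-stage F diagnostic; all parameter values (`pi`, `beta`, the error correlation) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
pi = 0.1          # small first-stage coefficient -> weak instrument
beta = 1.0        # structural parameter of interest

# Simulate a just-identified linear IV model with endogeneity.
z = rng.normal(size=n)                     # instrument
u = rng.normal(size=n)                     # structural error
v = 0.8 * u + 0.6 * rng.normal(size=n)     # first-stage error, correlated with u
x = pi * z + v                             # endogenous regressor
y = beta * x + u

# In the just-identified case, 2SLS reduces to the IV ratio estimator.
beta_2sls = (z @ y) / (z @ x)

# First-stage F-statistic (no-intercept regression of x on z):
# the usual diagnostic for weak identification.
x_hat = z * (z @ x) / (z @ z)
resid = x - x_hat
f_stat = (x_hat @ x_hat) / (resid @ resid / (n - 1))
```

With `pi` this small the first-stage F tends to be low and `beta_2sls` is badly behaved, which is the setting where improvements of the kind the paper studies matter.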


2020 ◽  
Vol 218 (1) ◽  
pp. 140-177
Author(s):  
Julián Martínez-Iriarte ◽  
Yixiao Sun ◽  
Xuexin Wang

2020 ◽  
pp. 1-45
Author(s):  
Daniel J. Lewis

Identification via heteroskedasticity exploits variance changes between regimes to identify parameters in simultaneous equations. Weak identification occurs when shock variances change very little or multiple variances change close to proportionally, making standard inference unreliable. I propose an F-test for weak identification in a common simple version of the model. More generally, I establish conditions for the validity of non-conservative robust inference on subsets of the parameters, which can be used to test for weak identification. I study monetary policy shocks identified using heteroskedasticity in high-frequency data. I detect weak identification, invalidating standard inference, in daily data, while intraday data provide strong identification.
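The identifying idea can be sketched with a Rigobon-style moment condition: a structural coefficient is recovered from the change in covariances across variance regimes, and identification weakens as the regime variances approach each other. This is an illustrative simulation, not the paper's F-test; `alpha` and the regime variances are assumed values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
alpha = 0.5                  # structural response of y to the shock in x

# Two variance regimes for the shock of interest; the variance *change*
# is the identifying variation. If s1 is close to s2, identification is weak.
s1, s2 = 1.0, 2.0
x1 = s1 * rng.normal(size=n)
x2 = s2 * rng.normal(size=n)
eta1 = rng.normal(size=n)    # second shock: variance held fixed across regimes
eta2 = rng.normal(size=n)
y1 = alpha * x1 + eta1
y2 = alpha * x2 + eta2

# Rigobon-style moment: alpha = change in Cov(x, y) / change in Var(x).
d_cov = np.cov(x2, y2)[0, 1] - np.cov(x1, y1)[0, 1]
d_var = x2.var(ddof=1) - x1.var(ddof=1)
alpha_hat = d_cov / d_var
```

As `s2` approaches `s1`, the denominator `d_var` shrinks toward zero and `alpha_hat` becomes erratic, which is exactly the weak-identification regime the abstract describes.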


Author(s):  
Lily Y Liu

Abstract Existing reduced-form default intensity models that jointly estimate probability of default (PD) and loss given default (LGD) from credit default swaps (CDSs) produce dissimilar results, and there is little guidance on which time series specification to choose. This article develops a model of CDS term structure without parametric time series restrictions for PD and uses weak-identification robust methods to investigate whether separate identification of PD and LGD is still possible. Consistent with intuition about the identification strategy, the model is not globally identified. However, in my empirical application, LGD is precisely estimated for half of the firm-months under study, with resulting values much lower than conventional values. This implies that the risk-neutral PD and the risk premia on PD are underestimated when LGD is set to conventional values.
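The identification problem the article addresses can be illustrated with the well-known "credit triangle" approximation, under which the par CDS spread is roughly the product of the default intensity and LGD, so a single spread cannot separate the two; this is a textbook sketch, not the article's term-structure model:

```python
# The "credit triangle": for a flat default intensity lam and loss given
# default lgd, the par CDS spread is approximately spread = lam * lgd.
def cds_spread(lam: float, lgd: float) -> float:
    return lam * lgd

# Two very different (PD intensity, LGD) pairs imply the same spread, so
# from a single CDS quote the two are not separately identified; the
# article instead exploits the whole CDS term structure.
s_a = cds_spread(lam=0.02, lgd=0.60)   # 2% intensity, 60% loss
s_b = cds_spread(lam=0.06, lgd=0.20)   # 6% intensity, 20% loss
```

Both pairs produce a 120 basis point spread, which is why conventional practice simply fixes LGD at a standard value such as 0.60.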


2020 ◽  
pp. 1-55
Author(s):  
Jonathan B. Hill

We present a new robust bootstrap method for a test when there is a nuisance parameter under the alternative and some parameters are possibly weakly or nonidentified. For concreteness, we focus on a Bierens (1990, Econometrica 58, 1443–1458)-type conditional moment test of omitted nonlinearity. Existing methods include the supremum p-value, which yields a conservative test that is generally not consistent, and test statistic transforms such as the supremum and average, for which bootstrap methods are not valid under weak identification. We propose a new wild bootstrap method for p-value computation by targeting specific identification cases. We then combine bootstrapped p-values across polar identification cases to form an asymptotically valid p-value approximation that is robust to any identification case. Our wild bootstrap procedure does not require knowledge of the covariance structure of the bootstrapped processes, whereas Andrews and Cheng's (2012a, Econometrica 80, 2153–2211; 2013, Journal of Econometrics 173, 36–56; 2014, Econometric Theory 30, 287–333) simulation approach generally does. Our method also allows for robust bootstrap critical value computation. Like conventional bootstrap methods, ours does not lead to a consistent p-value approximation for test statistic functions such as the supremum and average. Therefore, we smooth over the robust bootstrapped p-value as the basis for several tests that achieve the correct asymptotic level, and are consistent, for any degree of identification. They also achieve uniform size control. A simulation study reveals possibly large empirical size distortions in nonrobust tests when weak or nonidentification arises. One of our smoothed p-value tests, however, dominates all other tests by delivering accurate empirical size and comparatively high power.
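For readers unfamiliar with the mechanics, the following is a generic wild bootstrap p-value computation using Rademacher weights; it illustrates the resampling device only, not the paper's identification-case targeting or its p-value combination:

```python
import numpy as np

rng = np.random.default_rng(2)

def wild_bootstrap_pvalue(resid, stat_fn, n_boot=999):
    """Wild bootstrap p-value: multiply residuals by Rademacher draws,
    recompute the statistic, and compare to the observed value."""
    t_obs = stat_fn(resid)
    count = 0
    for _ in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=resid.shape)  # Rademacher weights
        if stat_fn(resid * w) >= t_obs:
            count += 1
    # Add-one adjustment keeps the p-value strictly inside (0, 1].
    return (1 + count) / (1 + n_boot)

# Illustrative statistic: absolute studentized sample mean.
def t_abs(e):
    return abs(e.mean()) / (e.std(ddof=1) / np.sqrt(len(e)))

e_null = rng.normal(size=200)           # data consistent with the null
p = wild_bootstrap_pvalue(e_null, t_abs)
```

The Rademacher weights preserve each residual's magnitude while randomizing its sign, which is what makes the scheme robust to heteroskedasticity.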


2020 ◽  
Vol 25 (3) ◽  
pp. 477-494
Author(s):  
Mona Agerholm Andersen

Purpose: The aim of this article is to explore how the employees of a Danish family-owned company identify with the heritage identity of their company. More specifically, the purpose is to study how the employees interpret certain historical events and values in their efforts to make sense of which heritage identity traits have remained meaningful for them over the passage of time, and what these historical events and traits mean for their identification with the company.
Design/methodology/approach: The investigation is based on 19 in-depth interviews with employees. A critical discourse analysis approach is adopted to uncover the discursive dynamics appearing across the employees' interpretations of historical events and values.
Findings: The study indicates that heritage identity represents a complex and dynamic resource for employees' organizational identification. This article therefore argues that it can be a challenge for management to maintain a stable and enduring heritage identity, because the employees' interpretations, and consequently their organizational identification, are subject to continual revision and are influenced by a dynamic, constantly changing social context.
Research limitations/implications: The findings of this study are limited to the specific context of one company. Further research could investigate the same topics by interviewing employees across the national borders of a global family company in times of change.
Practical implications: Management needs to identify whether different generations of employees develop a strong or weak identification with certain heritage identity traits, and whether there are competing or compatible targets of heritage identification among these generations.
Originality/value: This study illuminates the potential challenges related to the maintenance and preservation of heritage identity in a company with roots in a strong founding family, which operates in a constantly changing environment.


2020 ◽  
pp. 1-45
Author(s):  
Ajay Shenoy

Behind many production function estimators lies a crucial assumption that the firm's choice of intermediate inputs depends only on observed choices of other inputs and on unobserved productivity. This assumption fails when market frictions distort the firm's input choices. I derive a test for the assumption, which is rejected in several industries. I show, using weak identification asymptotics, that when the assumption fails, a simplified dynamic panel estimator can be used instead of choice-based methods, because its identification requires choices to be distorted. I propose criteria for choosing between the estimators; in simulations, the resulting choice yields lower error than either estimator alone.
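The proxy-inversion logic behind that assumption can be sketched in a toy simulation: when intermediate input demand depends only on capital and productivity, inverting it recovers productivity, but a friction-induced wedge breaks the inversion. The functional form and the wedge are purely illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
k = rng.normal(size=n)          # log capital
omega = rng.normal(size=n)      # log productivity (unobserved)

# Proxy assumption: intermediate input demand depends only on (k, omega).
# Illustrative form: m = 0.3*k + omega, which is invertible in omega.
m_clean = 0.3 * k + omega
omega_hat_clean = m_clean - 0.3 * k          # inversion recovers omega

# A market friction adds a firm-specific wedge to the input choice,
# so the same inversion no longer recovers productivity.
wedge = rng.normal(scale=0.5, size=n)
m_frict = 0.3 * k + omega + wedge
omega_hat_frict = m_frict - 0.3 * k          # now omega + wedge, not omega
```

The recovered series is exact in the frictionless case and contaminated by the wedge otherwise, which is the failure mode the proposed test is designed to detect.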

