
This book covers important topics in econometrics. It discusses methods for efficient estimation in models defined by unconditional and conditional moment restrictions, inference in misspecified models, generalized empirical likelihood estimators, and alternative asymptotic approximations. The first chapter provides a general overview of established nonparametric and parametric approaches to estimation and conventional frameworks for statistical inference. The next several chapters focus on the estimation of models based on moment restrictions implied by economic theory. The final chapters cover nonconventional asymptotic tools that lead to improved finite-sample inference.
Inference in the Presence of Weak Instruments is concerned with inference in the linear simultaneous equations model. The ideas developed for this model have remained central to econometric practice, with instrumental variables estimation serving as a unifying paradigm in econometrics for decades. The literature can be viewed as comprising two strands: large-sample asymptotic analysis and finite-sample analysis. The former matured more quickly and has had far greater impact on empirical practice than the latter; the finite-sample literature took some twenty years longer to develop, by which time empirical practice was largely entrenched. The consensus view was that the asymptotic results are considerably simpler to interpret than the exact results and notionally more general, being predicated on weaker distributional assumptions. Towards the end of the 1980s, both strands of the literature focused attention on models that were either unidentified or close to unidentified. First, there was a growing understanding of the empirical consequences of using weak instruments. Second, the finite-sample results developed throughout the 1980s invariably involved multiple infinite series of invariant polynomials of matrix argument, which were typically not very revealing. Consequently, simplifying special cases were explored to illustrate the results contained within the more general expressions. It was observed that the leading terms of these series expansions corresponded to totally unidentified models, and the analysis of such models therefore became a common expository device in this literature; totally unidentified models can be thought of as limiting cases of weak instruments. Finally, it was becoming clear that the existing large-sample asymptotic results provided very poor approximations to the true sampling behavior of various statistical procedures. More recently, the literature has been devoted to analyzing potential remedies to the problem of weak instruments. Inference in the Presence of Weak Instruments presents a selective survey of this growing literature on estimation, hypothesis testing, and confidence interval construction. The survey indicates some of the links between the different traditions by using the small-concentration results from an earlier publication by the authors; these results can be used to characterize various special cases in which instruments are weak.
This 2005 volume contains the papers presented in honor of the lifelong achievements of Thomas J. Rothenberg on the occasion of his retirement. The authors of the chapters include many of the leading econometricians of our day, and the chapters address topics of current research significance in econometric theory. The chapters cover four themes: identification and efficient estimation in econometrics; asymptotic approximations to the distributions of econometric estimators and tests; inference involving potentially nonstationary time series, such as processes that might have a unit autoregressive root; and nonparametric and semiparametric inference. Several of the chapters provide overviews and treatments of basic conceptual issues, while others advance our understanding of the properties of existing econometric procedures or propose new ones. Specific topics include identification in nonlinear models, inference with weak instruments, tests for nonstationarity in time series and panel data, generalized empirical likelihood estimation, and the bootstrap.
This paper analyzes near exogeneity and weak identification in generalized empirical likelihood (GEL) estimators. Near exogeneity and weak identification concern the exogeneity and the relevance of the instruments, respectively. Both issues matter in applied settings such as empirical growth theory and labor economics; the empirical growth and institutional economics literatures, in particular, typically rely on a small number of moments/instruments. First, we analyze the limit behavior of estimators and tests under a fixed number of weak moments and near exogeneity. We show that the limits of Anderson-Rubin (1949) and Kleibergen (2002) type tests change when there is small correlation between the instruments and the structural equation error. The new limits are obtained under the null hypothesis at the true value of the parameter. The test statistics are no longer asymptotically pivotal under the combination of near exogeneity and weak instruments, unlike in the case of weak identification alone. We also show that when these tests are used with χ² critical values, which are not valid under near exogeneity and weak instruments, they exhibit very large size distortions. This is an important warning to applied researchers who may use these tests without taking the near exogeneity problem into account. We try subsampling and delete-d jackknife methods to recover the asymptotic limits. Both methods are inconsistent; however, we show that the asymptotic limit of the delete-d jackknife is arbitrarily close to the true limit and only slightly liberal. In simulations, exponential tilting based tests combined with the delete-d jackknife have good size compared to the alternatives. We then derive the limits of estimators and tests under many weak moments with near exogeneity. The results differ from the fixed-moments case: the estimators are consistent, and the test limits are simple noncentral χ² distributions.
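The size distortions described in this abstract can be made concrete with a small Monte Carlo experiment. The sketch below is not taken from the paper: the data-generating process, sample size, and local-drift constants are illustrative assumptions. It evaluates the Anderson-Rubin statistic at the true parameter value in a design where both the first-stage coefficients and the instrument-error correlation shrink at rate n^{-1/2}, and compares it with χ² critical values; the empirical rejection rate at a nominal 5% level typically comes out far above 5%.

```python
# Illustrative Monte Carlo: size of the Anderson-Rubin (AR) test at the true
# parameter when instruments are weak (first-stage coefficients ~ n^{-1/2})
# and nearly exogenous (instrument-error correlation ~ n^{-1/2}).
# DGP, sample size, and drift constants are assumptions, not the paper's design.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
n, k, beta0 = 200, 3, 1.0        # sample size, number of instruments, true beta
c_pi, c_rho = 2.0, 2.0           # local-to-zero drift constants (assumed)
reps, rejections = 2000, 0
crit = chi2.ppf(0.95, df=k)      # nominal 5% chi-squared critical value

for _ in range(reps):
    Z = rng.standard_normal((n, k))
    # weak identification: first-stage coefficients shrink at rate n^{-1/2}
    pi = np.full(k, c_pi / np.sqrt(n))
    # near exogeneity: structural error correlated with Z at rate n^{-1/2}
    eps = rng.standard_normal(n)
    u = Z @ np.full(k, c_rho / np.sqrt(n)) + eps
    v = 0.5 * eps + rng.standard_normal(n)   # first-stage error, correlated with u
    x = Z @ pi + v
    y = x * beta0 + u

    # AR statistic at the true value: projection of the null-restricted
    # residual onto the instrument space, scaled by the residual variance
    e = y - x * beta0
    Pe = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)
    ar = (n - k) * (e @ Pe) / (k * (e @ e - e @ Pe))
    if k * ar > crit:
        rejections += 1

print(f"empirical size at nominal 5%: {rejections / reps:.3f}")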
This book examines the consequences of misspecification for the interpretation of likelihood-based methods of statistical estimation and inference. The analysis concludes with an examination of methods by which the possibility of misspecification can be empirically investigated.
The first chapter examines a linear regression model with a binary endogenous explanatory variable (EEV) and weak instruments. By estimating a binary response model via maximum likelihood in the first step, the nonlinear fitted probability can be constructed as an alternative instrument for the binary EEV. I show that this two-step instrumental variables (IV) estimation procedure produces a consistent and asymptotically normal IV estimator, even though the alternative linear two-stage least squares estimator is inconsistent and has nonstandard asymptotics. The results are illustrated in an application evaluating the effects of electrification on employment growth.

The remaining two chapters study statistical inference when the population is treated as finite. When the sample is a relatively large proportion of the population, finite population inference is a more appealing alternative to the usual infinite population approach. Nevertheless, the finite population inference methods currently available cover only the difference-in-means estimator or independent observations. Consequently, these methods cannot be applied to the many branches of empirical research that use linear or nonlinear models in which dependence due to clustering needs to be accounted for when computing standard errors. The second and third chapters fill these gaps in the existing literature by extending the seminal work of Abadie, Athey, Imbens, and Wooldridge (2020).

In the second chapter, I derive the finite population asymptotic variance for M-estimators with both smooth and nonsmooth objective functions, where observations are independent. I also find that the usual robust "sandwich" standard error is conservative, as has been shown in the linear case. The proposed asymptotic variance of M-estimators accounts for two sources of variation: the usual sampling-based uncertainty arising from (possibly) not observing the entire population, and design-based uncertainty, which is usually ignored in conventional inference and arises from lack of knowledge of the counterfactuals. Under this alternative framework, we can obtain smaller standard errors for M-estimators when the population is treated as finite.

In the third chapter, I establish asymptotic properties of M-estimators under finite populations with clustered data, allowing for unbalanced and unbounded cluster sizes in the limit. I distinguish between two situations that justify computing clustered standard errors: (i) cluster sampling, induced by random sampling of groups of units, and (ii) cluster assignment, caused by correlated assignment of "treatment" within the same group. For a general class of linear and nonlinear estimators, I show that one should adjust the standard errors for clustering only when there is cluster sampling, cluster assignment, or both. I also find that the finite population cluster-robust asymptotic variance (CRAV) is no larger than the usual infinite population CRAV, in the matrix sense. The methods are applied to an empirical study evaluating the effect of tenure clock stopping policies on tenure rates.
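The mechanics of the two-step procedure described in the first chapter above can be sketched on simulated data. The sketch is not the author's code: the data-generating process, variable names, and instrument strength are illustrative assumptions (the excluded instrument here is deliberately strong, so the example shows only the procedure, not the weak-instrument asymptotics studied in the chapter).

```python
# Two-step IV with a binary endogenous regressor d (illustrative sketch):
# Step 1: probit of d on the exogenous variables and the excluded instrument z;
# Step 2: use the fitted probability as the instrument for d in a standard IV
# regression. All data are simulated under an assumed DGP.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
w = rng.standard_normal(n)                 # exogenous control
z = rng.standard_normal(n)                 # excluded instrument
u = rng.standard_normal(n)                 # structural error
d_star = 0.3 * w + 0.8 * z + 0.5 * u + rng.standard_normal(n)
d = (d_star > 0).astype(float)             # binary endogenous regressor
y = 1.0 + 0.5 * d + 0.2 * w + u            # outcome; true coefficient on d is 0.5

# Step 1: probit of d on (1, w, z); fitted probabilities serve as the instrument
X1 = sm.add_constant(np.column_stack([w, z]))
probit_fit = sm.Probit(d, X1).fit(disp=0)
p_hat = probit_fit.predict(X1)

# Step 2: IV regression of y on (1, d, w) with instruments (1, p_hat, w);
# in this just-identified case the IV estimator is (Z'X)^{-1} Z'y
X = sm.add_constant(np.column_stack([d, w]))          # regressors
Zmat = sm.add_constant(np.column_stack([p_hat, w]))   # instruments
beta_iv = np.linalg.solve(Zmat.T @ X, Zmat.T @ y)
print("IV estimate of the coefficient on d:", beta_iv[1])
```

A point worth noting from the abstract: the fitted probability is used as an instrument, not simply plugged in as a regressor, which is what distinguishes this procedure from a naive "forbidden regression" approach.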
Offering students a unifying theoretical perspective, this innovative text emphasizes nonlinear techniques of estimation, including nonlinear least squares, nonlinear instrumental variables, maximum likelihood, and the generalized method of moments, while relying heavily on simple geometrical arguments to develop intuition. One theme of the book is the use of artificial regressions for estimation, inference, and specification testing of nonlinear models, including diagnostic tests for parameter constancy, serial correlation, heteroskedasticity, and other types of misspecification. Other topics include the linear simultaneous equations model, non-nested hypothesis tests, influential observations and leverage, transformations of the dependent variable, binary response models, models for time-series/cross-section data, multivariate models, seasonality, unit roots and cointegration, and Monte Carlo methods, always with an emphasis on problems that arise in applied work. Explaining throughout how estimates can be obtained and tests can be carried out, the text goes beyond a mere algebraic description to one that can be easily translated into the commands of a standard econometric software package. A comprehensive and coherent guide to the most vital topics in econometrics today, this text is indispensable for students of econometrics, economics, and statistics at all levels in courses on regression and related topics.
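The following is a minimal sketch, not taken from the book, of one artificial regression in the spirit described above: a Breusch-Godfrey-type test for first-order serial correlation, in which OLS residuals are regressed on the original regressors and their own first lag, and n times the R² of that artificial regression is compared with a χ²(1) critical value. The simulated data-generating process is an assumption.

```python
# Artificial regression for first-order serial correlation (illustrative sketch).
# Under the null of no serial correlation, n * R^2 from the artificial
# regression is asymptotically chi-squared with one degree of freedom.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 300
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + rng.standard_normal(n)   # DGP with no serial correlation

# Original regression and its residuals
X = sm.add_constant(x)
resid = sm.OLS(y, X).fit().resid

# Artificial regression: residuals on the regressors and the lagged residual
lagged = np.concatenate(([0.0], resid[:-1]))      # first lag, padded with zero
X_art = sm.add_constant(np.column_stack([x, lagged]))
art_fit = sm.OLS(resid, X_art).fit()

test_stat = n * art_fit.rsquared
print("test statistic:", test_stat, "p-value:", 1 - chi2.cdf(test_stat, df=1))
```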