Asymptotically Efficient Estimates of the Parameters of a Moving Average Time Series

The thesis is concerned with the estimation of the parameters of a moving average time series {x_t, t = 0, ±1, ±2, ...} of order M. By definition, such a series has the representation x_t = η_t + b_1 η_{t-1} + b_2 η_{t-2} + ... + b_M η_{t-M} for some series of uncorrelated, identically distributed random variables {η_t, t = 0, ±1, ±2, ...}. It is assumed that the process has mean zero and is Gaussian; hence η_t has a normal distribution with mean zero and some unknown variance σ_η². The goal is to find asymptotically normal and efficient estimates of the parameters of the model. (Author).
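The model above is easy to state concretely. The sketch below simulates a Gaussian MA(2) process from the defining representation; the function name `simulate_ma`, the coefficient values, and the sample size are illustrative assumptions, not from the thesis.

```python
import numpy as np

def simulate_ma(b, sigma, n, rng):
    """Simulate n observations of a Gaussian MA(M) process
    x_t = eta_t + b_1*eta_{t-1} + ... + b_M*eta_{t-M}."""
    M = len(b)
    eta = rng.normal(0.0, sigma, size=n + M)   # i.i.d. N(0, sigma^2) innovations
    coeffs = np.concatenate(([1.0], b))        # (1, b_1, ..., b_M)
    # x_t = sum_j coeffs[j] * eta[t+M-j]: full convolution, warm-up terms dropped
    return np.convolve(coeffs, eta)[M:M + n]

rng = np.random.default_rng(0)
b = np.array([0.5, 0.3])                       # illustrative MA(2) coefficients
x = simulate_ma(b, 1.0, 200_000, rng)

# Theoretical variance: sigma^2 * (1 + b_1^2 + b_2^2) = 1.34
print(round(x.var(), 2))
```

The sample variance should be close to σ²(1 + b_1² + b_2²), which is one quick consistency check on a simulator like this.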
We establish asymptotic normality and consistency for rank-based estimators of autoregressive-moving average model parameters. The estimators are obtained by minimizing a rank-based residual dispersion function similar to the one given by L.A. Jaeckel [Ann. Math. Stat. Vol. 43 (1972) 1449-1458]. These estimators can have the same asymptotic efficiency as maximum likelihood estimators and are robust. The quality of the asymptotic approximations for finite samples is studied via simulation.
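To make the rank-based idea concrete, here is a minimal sketch for an MA(1) model: compute innovation residuals, score them by their ranks with Wilcoxon scores as in Jaeckel's dispersion, and minimize over the parameter. The grid search, sample size, and MA(1) specialization are assumptions for illustration; the paper treats general ARMA models with a proper optimizer.

```python
import numpy as np

def ma1_residuals(x, b):
    """Innovation residuals of an invertible MA(1): e_t = x_t - b*e_{t-1}."""
    e = np.empty_like(x)
    prev = 0.0
    for t in range(len(x)):
        prev = x[t] - b * prev
        e[t] = prev
    return e

def jaeckel_dispersion(e):
    """Jaeckel's rank dispersion sum_i a(R_i)*e_i with Wilcoxon scores
    a(i) = sqrt(12)*(i/(n+1) - 1/2)."""
    n = len(e)
    ranks = np.argsort(np.argsort(e)) + 1      # R_i in 1..n
    scores = np.sqrt(12.0) * (ranks / (n + 1.0) - 0.5)
    return scores @ e

# Simulate an MA(1) with b = 0.5 and estimate b by minimizing the dispersion
rng = np.random.default_rng(1)
eta = rng.normal(size=4001)
x = eta[1:] + 0.5 * eta[:-1]

grid = np.linspace(-0.9, 0.9, 181)
b_hat = grid[np.argmin([jaeckel_dispersion(ma1_residuals(x, b)) for b in grid])]
```

Because the dispersion depends on residuals only through their ranks and values, gross outliers in the data move the criterion far less than they move a squared-error criterion, which is the source of the robustness claimed in the abstract.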
A method is presented for the estimation of the parameters in the vector autoregressive moving average time series model. The estimation procedure is derived from the maximum likelihood approach and is based on Newton-Raphson techniques applied to the likelihood equations. The resulting two-step Newton-Raphson procedure is computationally simple, involving only generalized least squares estimation in the second step. This Newton-Raphson estimator is shown to be asymptotically efficient and to possess a limiting multivariate normal distribution. (Author).
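The generic pattern behind such two-step procedures — a consistent initial estimate followed by a single Newton-type (here Gauss-Newton) correction — can be sketched on a scalar MA(1); this is an illustration of the idea under that simplifying assumption, not the paper's vector GLS algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
eta = rng.normal(size=4001)
x = eta[1:] + 0.5 * eta[:-1]                   # MA(1), true b = 0.5
T = len(x)

# Step 1: consistent initial estimate from the lag-1 autocorrelation.
# For MA(1), rho_1 = b/(1+b^2), so b0 = (1 - sqrt(1 - 4*rho_1^2)) / (2*rho_1).
c0 = x @ x / T
c1 = x[:-1] @ x[1:] / T
rho1 = np.clip(c1 / c0, -0.49, 0.49)           # keep the inversion real-valued
b0 = (1 - np.sqrt(1 - 4 * rho1**2)) / (2 * rho1)

# Step 2: one Gauss-Newton step on the sum of squared innovation residuals,
# using the recursions e_t = x_t - b*e_{t-1} and de_t/db = -e_{t-1} - b*de_{t-1}/db.
e = np.empty(T)
d = np.empty(T)
ep = dp = 0.0
for t in range(T):
    e[t] = x[t] - b0 * ep
    d[t] = -ep - b0 * dp
    ep, dp = e[t], d[t]
b1 = b0 - (d @ e) / (d @ d)                    # single correction step
```

The point of the construction is that, starting from any √T-consistent b0, one such step already attains the asymptotic efficiency of full iteration.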
The author considers estimation procedures for the moving average model of order q. Walker's method uses k sample autocovariances (k ≥ q). Assume that k depends on T in such a way that k tends to infinity as T tends to infinity. The estimates are consistent, asymptotically normal, and asymptotically efficient if k = k(T) dominates log T and is dominated by T^(1/2). The approach in proving these theorems involves obtaining an explicit form for the components of the inverse of a symmetric matrix with equal elements along each of its five central diagonals and zeroes elsewhere. The asymptotic normality follows from a central limit theorem for normalized sums of k-dependent random variables, where k tends to infinity with T. An alternative form of the estimator facilitates the calculations and the analysis of the role of k without changing the asymptotic properties.
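Walker's full estimator requires inverting the pentadiagonal system described above; the two ingredients one can show compactly are the sample autocovariances and a choice of k(T) in the admissible range log T ≪ k ≪ T^(1/2). The sketch below assumes an MA(1) example and the particular rate k = T^0.4 purely for illustration.

```python
import numpy as np

def sample_autocov(x, k):
    """Biased sample autocovariances c_0, ..., c_k (divide by T, not T - h)."""
    T = len(x)
    xc = x - x.mean()
    return np.array([xc[:T - h] @ xc[h:] / T for h in range(k + 1)])

T = 10_000
rng = np.random.default_rng(4)
eta = rng.normal(size=T + 1)
x = eta[1:] + 0.5 * eta[:-1]       # MA(1) with b = 0.5:
                                   # gamma_0 = 1.25, gamma_1 = 0.5, gamma_h = 0 (h > 1)

k = int(T ** 0.4)                  # one admissible rate: log T << T^0.4 << T^(1/2)
c = sample_autocov(x, k)
```

For this T, k = 39, which indeed sits between log T ≈ 9.2 and T^(1/2) = 100; the higher-lag autocovariances beyond lag q carry the extra information that makes the estimator efficient as k grows.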
Developments in Statistics, Volume 4 reviews developments in the theory and applications of statistics, covering topics such as time series, identifiability and model selection, and missing data. The application of structured exploratory data analysis to human genetics, specifically the mode of inheritance, is also considered. Comprising four chapters, this volume begins with an introduction to spectrum parameter estimation in time series analysis, restricting the discussion to the simplest univariate (that is, scalar) real-valued time series X(t). An accurate formulation of the general problem is presented. The accuracy of different consistent estimates obtained for large but fixed values of T (maximum likelihood estimates, Whittle's estimates, and simplified asymptotically efficient estimates) is also compared. The next chapter deals with identifiability and modeling in econometrics, focusing on the theoretical framework relating realization theory, identification, and parametrization. Realization theory is illustrated at various levels of generality by means of examples related to econometrics, along with some advanced applications of system theory. The book also examines inference on parameters of multivariate normal populations when some data are missing before concluding with an evaluation of structured exploratory data analysis as applied to the study of the mode of inheritance. This monograph will be of interest to students and practitioners of statistics.
...) (under the assumption that the spectral density exists). For this reason, a vast amount of periodical and monographic literature is devoted to the nonparametric statistical problem of estimating the covariance function B(τ) and especially the spectral density f(λ) (see, for example, the books [4, 21, 22, 26, 56, 77, 137, 139, 140]). However, the empirical value f* of the spectral density f obtained by applying a certain statistical procedure to the observed values of the variables X_1, ..., X_n usually depends in a complicated manner on the cyclic frequency λ. This fact often presents difficulties in applying the obtained estimate f* of the function f to the solution of specific problems related to the process X_t. Therefore, in practice, the obtained values of the estimator f* (or of an estimator of the covariance function B*(τ)) are almost always "smoothed," i.e., approximated by values of a certain sufficiently simple function ...
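The smoothing step described above can be illustrated with the simplest scheme: average the raw periodogram over neighbouring Fourier frequencies (a Daniell-type kernel). The white-noise example, bandwidth m, and variable names are assumptions for the sketch; the text's f* may be any raw spectral estimator.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 4096
x = rng.normal(size=T)                 # white noise: true f(lambda) = 1/(2*pi)

# Raw periodogram at the Fourier frequencies lambda_j = 2*pi*j/T.
# It is inconsistent: its variance stays of order f(lambda)^2 as T grows.
I = np.abs(np.fft.rfft(x))**2 / (2 * np.pi * T)

# "Smoothing": average 2m+1 neighbouring ordinates (Daniell kernel),
# trading a little bias for a variance reduction of order 1/(2m+1).
m = 50
kernel = np.ones(2 * m + 1) / (2 * m + 1)
f_hat = np.convolve(I, kernel, mode="same")
```

Away from the edges, f_hat fluctuates tightly around 1/(2π) ≈ 0.159, while the raw ordinates I scatter over their whole exponential distribution — exactly the "complicated dependence on λ" that smoothing is meant to tame.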
The subject of time series is of considerable interest, especially among researchers in econometrics, engineering, and the natural sciences. As part of the prestigious Wiley Series in Probability and Statistics, this book provides a lucid introduction to the field and, in this new Second Edition, covers the important advances of recent years, including nonstationary models, nonlinear estimation, multivariate models, state space representations, and empirical model identification. New sections have also been added on the Wold decomposition, partial autocorrelation, long memory processes, and the Kalman filter. Major topics include:

* Moving average and autoregressive processes
* Introduction to Fourier analysis
* Spectral theory and filtering
* Large sample theory
* Estimation of the mean and autocorrelations
* Estimation of the spectrum
* Parameter estimation
* Regression, trend, and seasonality
* Unit root and explosive time series

To accommodate a wide variety of readers, review material, especially on elementary results in Fourier analysis, large sample statistics, and difference equations, has been included.
The vector moving average process is a stationary stochastic process in which the unobservable driving process consists of independent, identically distributed random variables. The matrix parameters are estimated from the observations. The likelihood function is derived under normality, and the Newton-Raphson and scoring methods are used to solve the maximum likelihood equations. The estimation problem is considered in both the time and frequency domains. Asymptotic efficiency of the estimates is established.
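In the time domain, the Gaussian likelihood mentioned above has an especially transparent form for a scalar MA(1), whose covariance matrix is tridiagonal. The sketch below evaluates that exact log-likelihood and maximizes it over a parameter grid; the scalar case, known innovation variance, and grid search are simplifying assumptions, not the vector procedure of the text.

```python
import numpy as np

def ma1_loglik(x, b, sigma2=1.0):
    """Exact Gaussian log-likelihood of an MA(1) via its tridiagonal covariance:
    gamma_0 = sigma^2*(1+b^2), gamma_1 = sigma^2*b, gamma_h = 0 for h > 1."""
    T = len(x)
    S = (np.diag(np.full(T, sigma2 * (1 + b**2)))
         + np.diag(np.full(T - 1, sigma2 * b), 1)
         + np.diag(np.full(T - 1, sigma2 * b), -1))
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (T * np.log(2 * np.pi) + logdet + x @ np.linalg.solve(S, x))

rng = np.random.default_rng(6)
eta = rng.normal(size=301)
x = eta[1:] + 0.5 * eta[:-1]               # MA(1), true b = 0.5

grid = np.linspace(-0.95, 0.95, 96)
b_hat = grid[np.argmax([ma1_loglik(x, b) for b in grid])]
```

Newton-Raphson or scoring replaces the grid search in practice: both iterate on the derivative of this same log-likelihood, scoring substituting the expected information matrix for the observed Hessian.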