Robust and Nonlinear Time Series Analysis (e-book)
Martin, D. (editor)


875,33 DKK (incl. VAT: 1.094,16 DKK)
Authors: Martin, D. (editor)
Publisher: Springer
Published: 6 December 2012
Genres: Probability and statistics
Language: English
Format: PDF
Protection: LCP
ISBN: 9781461578215
Classical time series methods are based on the assumption that a particular stochastic process model generates the observed data. The most commonly used assumption is that the data are a realization of a stationary Gaussian process. However, since the Gaussian assumption is a fairly stringent one, it is frequently replaced by the weaker assumption that the process is wide-sense stationary and that only the mean and covariance sequence are specified. This approach of specifying the probabilistic behavior only up to "second order" has of course been extremely popular from a theoretical point of view, because it has allowed one to treat a large variety of problems, such as prediction, filtering and smoothing, using the geometry of Hilbert spaces. While the literature abounds with a variety of optimal estimation results based on either the Gaussian assumption or the specification of second-order properties, time series workers have not always believed in the literal truth of either the Gaussian or the second-order specification. They have nonetheless stressed the importance of such optimality results, probably for two main reasons: first, the results come from a rich and very workable theory; second, researchers often relied on a vague belief in a kind of continuity principle, according to which the results of time series inference would change only a small amount if the actual model deviated only a small amount from the assumed model.
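The "continuity principle" described above is exactly what robust time series methods call into question. The following is a minimal sketch, not taken from the book: all parameters, the random seed, and the outlier positions and magnitude are illustrative choices. It shows how a handful of large additive outliers (1% of the sample) in an otherwise Gaussian AR(1) series can sharply distort the classical lag-1 sample autocorrelation.

```python
# Illustrative sketch: a small deviation from the Gaussian model (a few
# additive outliers) produces a large change in a classical second-order
# estimate, the lag-1 sample autocorrelation.
import numpy as np

def acf1(x):
    """Classical lag-1 sample autocorrelation."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return np.dot(d[:-1], d[1:]) / np.dot(d, d)

rng = np.random.default_rng(0)
n, phi = 500, 0.8

# Simulate a Gaussian AR(1): x_t = phi * x_{t-1} + e_t
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Contaminate 1% of the observations with large additive outliers
# (positions and magnitude chosen arbitrarily for illustration)
y = x.copy()
y[[50, 150, 250, 350, 450]] += 25.0

print(f"clean acf(1):        {acf1(x):.3f}")  # close to phi = 0.8
print(f"contaminated acf(1): {acf1(y):.3f}")  # pulled toward 0
```

The outliers inflate the variance in the denominator far more than the lag-1 cross-products in the numerator, so the estimate collapses toward zero even though 99% of the data follow the nominal model; this is the kind of discontinuity that motivates the robust procedures surveyed in the book.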