I'd be grateful if someone could explain to me the link between the stationarity conditions for AR(p) processes in theory and in practice. I was given the following short definition of weak stationarity: a process is weakly stationary if the first and second moments of its distribution are finite and time-invariant. As a practical criterion for stationarity, I've understood that we rely on the analysis of the roots of the characteristic equation associated with the lag polynomial, i.e. all the roots have to lie outside the unit circle (greater than 1 in absolute value). What I don't understand is why this condition on the roots should imply finite first and second moments. Is it linked to the convergence of some lag polynomial, or to complex numbers? It seems to suggest a convergence criterion for some series. Moreover, what is the relation between the stationarity condition and the invertibility condition? For an AR(p) they seem to me to be the same thing. What is the statistical explanation, and how does it work in general? (I.e., can I say that stationarity implies invertibility of the lag polynomial? Is it a necessary condition, or both necessary and sufficient?) Thank you very much.
2026-03-29 15:16:43
Intuitive and mathematical explanation of stationarity for AR(p)?
517 Views Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
Note: I will not go through all the details in this answer, but merely lay out the intuition and ideas. Some of the arguments are a bit handwavy and taken as truths when indeed they require (non-trivial) proofs.
Let us consider an $\operatorname{AR}(p)$ model given by $X_t = Z_t + \phi_1X_{t-1} + \phi_2X_{t-2} + \ldots + \phi_pX_{t-p}$ where $\{Z_t\}_t$ is a white noise process.
Define the backshift operator $B$ such that $BX_t = X_{t-1}$. Consider the polynomial $$ \Phi(z) = 1-\phi_1z - \phi_2z^2 - \ldots - \phi_pz^p, \quad z\in \mathbb C. $$
Then we can rewrite $X_t$ with the polynomial in the backshift operator, i.e.
$$ \Phi(B) X_t = Z_t. $$ The question is when this representation is stationary, which is not obvious. So we take a step back and consider $p = 1$ (i.e. an $\operatorname{AR}(1)$ model). Then the polynomial is $\Phi(z) = 1-\phi z$, which corresponds to $$ X_t = Z_t +\phi X_{t-1}. \qquad \qquad (1) $$ If we iterate (1) a couple of times by substituting $X_{t-1}$, $X_{t-2}$, etc. we get \begin{align*} X_t &= Z_t + \phi(Z_{t-1} + \phi X_{t-2}) \\&= Z_t + \phi Z_{t-1} + \phi^2(Z_{t-2} + \phi X_{t-3}) \\&= \ldots \\&= Z_t + \phi Z_{t-1} + \phi^2 Z_{t-2} + \ldots + \phi^{n}Z_{t-n} + \phi^{n+1}X_{t-n-1}. \end{align*} This suggests that $X_t$ is given by the representation $$ X_t = \sum_{k = 0}^\infty \phi^kZ_{t-k}. $$ If this is true, then we can rule out $\lvert \phi \rvert = 1$ immediately; otherwise the sum would not converge.
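To make the iteration concrete, here is a small numerical sketch (in Python; the value $\phi = 0.6$ and Gaussian white noise are assumed purely for illustration) comparing the AR(1) recursion with a truncation of the suggested $\operatorname{MA}(\infty)$ representation:

```python
import numpy as np

# Sketch with assumed parameters: compare the AR(1) recursion with a
# truncation of the representation X_t = sum_k phi^k Z_{t-k}.
rng = np.random.default_rng(0)
phi = 0.6
n = 10_000
z = rng.standard_normal(n)

# Build X_t by the recursion X_t = Z_t + phi * X_{t-1} (starting from 0).
x = np.zeros(n)
for t in range(1, n):
    x[t] = z[t] + phi * x[t - 1]

# Truncated MA(infinity) sum for the last observation; phi^50 is
# negligible when |phi| < 1, so 50 terms suffice here.
K = 50
x_ma = sum(phi**k * z[n - 1 - k] for k in range(K))

print(abs(x[-1] - x_ma))  # tiny: the two representations agree
```

The geometric decay of $\phi^k$ is exactly what makes the truncation harmless, and it fails for $\lvert \phi\rvert \geq 1$.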
Thus we consider $\lvert \phi\rvert < 1$. (*) One would now have to show that this representation gives a stationary time series by verifying the two moment requirements (which it indeed does).
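As a sanity check on the moment requirements, one can simulate an AR(1) with $\lvert\phi\rvert < 1$ and compare the sample variance with the stationary variance $\sigma^2/(1-\phi^2)$, which follows from summing $\phi^{2k}\sigma^2$ over the $\operatorname{MA}(\infty)$ representation (the values $\phi = 0.6$ and $\sigma = 1$ below are assumed, for illustration only):

```python
import numpy as np

# Sketch with assumed values: empirically check that
# Var(X_t) = sigma^2 / (1 - phi^2) for a stationary AR(1), |phi| < 1.
rng = np.random.default_rng(1)
phi, sigma = 0.6, 1.0
n = 100_000
z = sigma * rng.standard_normal(n)

x = np.zeros(n)
for t in range(1, n):
    x[t] = z[t] + phi * x[t - 1]

burn = 1_000  # discard the transient caused by starting at x[0] = 0
sample_var = np.var(x[burn:])
theory_var = sigma**2 / (1 - phi**2)

print(sample_var, theory_var)  # the two should be close
```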
Now, what does this have to do with the polynomial $\Phi$? Well, $\lvert \phi\rvert < 1$ means precisely that $\Phi(z) = 1-\phi z$ has no roots in the closed unit disc $\lvert z\rvert \leq 1$. Equivalently, the inverse $\Phi(z)^{-1}$ is a convergent power series on $\lvert z\rvert \leq 1$: $$ \sum_{k = 0}^\infty \phi^k z^k = \frac1{1-\phi z } = \Phi(z)^{-1} , \qquad \lvert \phi\rvert < 1,\ \lvert z\rvert \leq 1. $$
For the general setting where $\Phi(z) = 1- \sum_{k = 1}^p \phi_k z^k$ we want a similar property of the inverse. That means the polynomial has no roots on the unit circle, i.e. $\Phi(z)\neq 0$ for $\lvert z\rvert = 1$, $z\in \mathbb C$. Note how we switched to complex numbers: that is because higher-order polynomials may have complex roots. (**)
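In practice this root condition is easy to check numerically. A minimal sketch (the AR(2) coefficients $\phi_1 = 0.5$, $\phi_2 = 0.3$ are assumed, purely as an example):

```python
import numpy as np

# Sketch: check the root condition for an assumed AR(2) model
# X_t = Z_t + 0.5 X_{t-1} + 0.3 X_{t-2},
# i.e. find the roots of Phi(z) = 1 - 0.5 z - 0.3 z^2.
phi = [0.5, 0.3]

# np.roots expects coefficients from highest degree to lowest:
# here [-0.3, -0.5, 1.0] for -0.3 z^2 - 0.5 z + 1.
coeffs = [-c for c in phi[::-1]] + [1.0]
roots = np.roots(coeffs)

print(roots)
print(np.all(np.abs(roots) > 1))  # True: all roots outside the unit circle
```

Since both roots lie strictly outside the unit circle, this particular model has a causal stationary solution.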
(*) Note that for $\lvert \phi \rvert >1$ we would instead solve (1) for $X_{t-1}$ and iterate forward in time, suggesting the representation $$ X_t = -\sum_{k = 1}^\infty \phi^{-k}Z_{t+k}. $$ However, this means the time series depends on its own future, which in practice is nonsense. This is why one often imposes the additional requirement of causality, which rules out these cases.
(**) Causality here corresponds to $\Phi$ having no roots in the closed unit disc $\lvert z\rvert \leq 1$.
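For instance, with the assumed value $\phi = 1.5$ the single root of $\Phi(z) = 1 - \phi z$ lies inside the unit disc, so the model only admits the non-causal representation above:

```python
# Sketch: Phi(z) = 1 - phi*z has its single root at z = 1/phi, so
# |phi| > 1 puts the root inside the closed unit disc and causality
# fails (phi = 1.5 is an assumed example value).
phi = 1.5
root = 1 / phi

print(abs(root) <= 1)  # True: root inside the disc, hence not causal
```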