Understanding Moving Average Models

Can someone explain why we use past errors to predict future values in Moving Average models? Using past values, as in $\text{AR}$ models, makes intuitive sense, but using past errors does not. In particular, for models with only moving average components: would anybody ever use an $\text{ARMA}(0,1)$ model, and why?
Let's say you have a time series $(x_t)_{t\in\mathbb N}$ and you observe it fluctuates around a number $\mu$. Then you may postulate that $$x_t = \mu + \epsilon_t.$$
Using some observations you compute $\hat\mu$. With this number, you may compute your estimated deviations as $\hat\epsilon_t = x_t - \hat\mu$ and plot them. You observe that their signs tend to flip, i.e. if $\epsilon_t>0$, then you tend to observe $\epsilon_{t+1}<0$. This suggests a negative autocorrelation for $\epsilon_t$, so now you may think "perhaps $\epsilon$ follows an AR(1)" and write $$\epsilon_t = \theta\epsilon_{t-1} + \eta_t.$$
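Here is a minimal sketch of that diagnostic (Python with NumPy is just a convenient choice, and the values of $\mu$ and $\theta$ are made up): simulate deviations with the postulated AR(1) structure, estimate $\hat\mu$, and check the lag-1 sample autocorrelation of the estimated deviations.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, theta, n = 10.0, -0.7, 500   # hypothetical mean and coefficient

# Simulate deviations with the postulated AR(1) structure: eps_t = theta * eps_{t-1} + eta_t
eta = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = theta * eps[t - 1] + eta[t]

x = mu + eps                      # observed series fluctuating around mu

mu_hat = x.mean()                 # estimate of mu
eps_hat = x - mu_hat              # estimated deviations eps_hat_t = x_t - mu_hat

# Lag-1 sample autocorrelation of the estimated deviations
rho1 = np.corrcoef(eps_hat[:-1], eps_hat[1:])[0, 1]
print(f"mu_hat = {mu_hat:.3f}, lag-1 autocorrelation = {rho1:.3f}")
# With theta < 0 the deviations tend to flip sign, and rho1 comes out clearly negative.
```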
You plug this back in to obtain $$x_t = \mu + \eta_t + \theta\epsilon_{t-1},$$ which is not exactly an MA(1), but the remaining gap is notational: if you instead model the deviation as driven directly by the previous white-noise shock, $\epsilon_t = \eta_t + \theta\eta_{t-1}$, the same substitution gives exactly $$x_t = \mu + \eta_t + \theta\eta_{t-1},$$ which is an MA(1). The "past error" an MA model conditions on is simply last period's unpredicted shock, and it is the only piece of the past that still carries information about $x_t$.
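To see why conditioning on the past error pays off, here is a small sketch along the same lines (again NumPy, made-up parameter values; for illustration it uses the true shocks $\eta_t$, whereas in practice they would be recovered recursively from a fitted model). It compares one-step forecasts that ignore and use the past error:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, theta, n = 10.0, -0.7, 10_000   # hypothetical MA(1) parameters

eta = rng.normal(size=n + 1)
x = mu + eta[1:] + theta * eta[:-1]  # x_t = mu + eta_t + theta * eta_{t-1}

# One-step-ahead forecasts:
#   naive forecast:  mu                      (ignores the past error)
#   MA(1) forecast:  mu + theta * eta_{t-1}  (uses the past error)
naive = np.full(n, mu)
ma1 = mu + theta * eta[:-1]

print("MSE ignoring the past error:", np.mean((x - naive) ** 2))  # about 1 + theta**2
print("MSE using the past error:   ", np.mean((x - ma1) ** 2))    # about 1
```

Using the lagged error brings the one-step mean squared error down from roughly $1+\theta^2$ to roughly $1$ (the variance of $\eta_t$): the past error is exactly the part of tomorrow's value that is already knowable today. That is what an $\text{ARMA}(0,1)$, i.e. MA(1), model buys you when the autocorrelation of the series cuts off after one lag.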