Possible to solve $Ax=b$ for $A$, knowing all of $A$ but row $1$?


I'm trying to teach myself a bit of time series analysis. I came across this equation that looks like it may be solvable, but I can't be sure. I've only just gotten through the first few chapters of Hamilton's *Time Series Analysis*, so I apologize if this question is a bit premature.

So it's the standard $Ax = b$ equation. We know the vectors $x$ and $b$, and all but the first row of the matrix $A$. The rest of $A$ looks like the identity matrix shifted down a row. So if $A$ were $3\times 3$: row $1 = [x,~y,~z]$ (here $x,y,z$ are unknown scalars, not the vector $x$); row $2 = [1,~0,~0]$; row $3 = [0,~1,~0]$.
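For concreteness, here is a small sketch of the matrix shape in question (the numeric values for the first row are made up for illustration; the book calls them the $\phi$'s):

```python
import numpy as np

# Hypothetical values for the unknown first row (the phi's).
phi = np.array([0.5, -0.3, 0.2])

# Build the 3x3 matrix: row 1 is phi, rows 2-3 are the identity shifted down.
A = np.vstack([phi, np.eye(3)[:-1]])
x = np.array([1.0, 2.0, 3.0])

b = A @ x
# b[0] = phi . x, while the rest of b is just x shifted down: [x_1, x_2].
```

Note that only `b[0]` carries any information about the unknown row; the remaining entries of `b` are forced by the shift structure.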

Since $xx^T$ doesn't have an inverse, I can't move $x$ to the other side. So I'm not sure whether I can find a solution; it just looks so close. I think the book assumes these $x,y,z$ (the $\phi$'s) are known at this point, but it would be much more useful to find a solution or approximation for them.

If anyone is familiar with time series or ARMA/ARIMA models: do you know how these $\phi$ values are approximated? I know $\phi$ can be approximated as $1 - \frac{d}{2}$ (where $d$ is the Durbin–Watson statistic) for a first-order model. But how are the $\phi$ values approximated for higher-order models?
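The first-order approximation mentioned above can be sketched as follows (a rough illustration with toy residuals of my own choosing, not an estimator taken from Hamilton):

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic: sum of squared successive
    differences of the residuals, divided by their sum of squares."""
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Toy residual series (strongly alternating, i.e. negative autocorrelation).
e = np.array([1.0, -1.0, 1.0, -1.0])
d = durbin_watson(e)

# First-order approximation phi ~ 1 - d/2, as in the question.
phi_hat = 1 - d / 2
```

Since $d$ ranges over $[0,4]$, this maps to $\hat\phi \in [-1, 1]$, matching the stationarity range of a first-order autoregression.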


When you multiply $A$ with $x$ to get $b$, the first row of $A$ only influences the first element of $b$. There is therefore no unique solution for $A$ when dealing with an $n\times n$ matrix (with $n>1$): we have $n$ unknowns but only one equation constraining them.

For your $3\times 3$ example we get, letting $a_1,a_2,a_3$ denote the unknowns of row $1$,

$$Ax = \pmatrix{a_1 & a_2 & a_3\\ \ldots & \ldots & \ldots \\ \ldots & \ldots & \ldots }\pmatrix{x_1\\x_2\\x_3} = \pmatrix{a_1x_1+a_2x_2+a_3x_3\\\ldots\\\ldots} = b = \pmatrix{b_1\\b_2\\b_3}$$ so we need $a_1x_1+a_2x_2 + a_3x_3 = b_1$. This is just one equation with $3$ unknowns $\implies$ infinitely many solutions for $\{a_1,a_2,a_3\}$.

To construct a solution, let the first row of $A$ be the vector $\vec{a}=(a_1,a_2,\ldots,a_n)$ and pick any vector $\vec{c}$ satisfying $\vec{c}\cdot \vec{x} \neq 0$. Then

$$\vec{a} = \frac{b_1}{\vec{c}\cdot \vec{x}}\vec{c}$$

is a solution.
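This construction is easy to check numerically; a minimal sketch, with example values of my own choosing (taking $\vec{c} = \vec{x}$, which satisfies $\vec{c}\cdot\vec{x} \neq 0$ whenever $x \neq 0$):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
b1 = 5.0  # target first entry of b

# Pick any c with c . x != 0; the simplest choice is c = x itself.
c = x.copy()

# The proposed first row of A: a = (b1 / (c . x)) * c.
a = (b1 / (c @ x)) * c

# By construction, a . x = b1 exactly.
```

Any other $\vec{c}$ not orthogonal to $\vec{x}$ works just as well, which makes the non-uniqueness of the solution concrete.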