Are these two linear approximations the same?


Are these two concepts equivalent? If so, how?

We have $x_1=\begin{bmatrix}x_{11}\\ \vdots\\ \vdots\\ x_{n1}\end{bmatrix},\;\cdots\cdots, x_m=\begin{bmatrix}x_{1m}\\ \vdots\\ \vdots\\ x_{nm}\end{bmatrix}$.

Now we form a matrix $X_1$ with $m-1$ columns, namely $x_1$ through $x_{m-1}$, and another matrix $X_2$ whose columns are $x_2$ through $x_m$. If we assume the data is generated by a linear dynamical system, then $$\boxed{X_2=AX_1}$$ where $A$ is the linear operator, i.e. another matrix; how $A$ is computed by a numerical algorithm is not the point of this question, though.
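As a numerical sketch of this setup (the synthetic dynamics and the least-squares/pseudoinverse fit below are my own illustration, since the question leaves the algorithm open), one can stack snapshots into $X_1$ and $X_2$ and recover $A$ from the boxed relation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 10

# Hypothetical ground-truth dynamics: an orthogonal matrix keeps the
# snapshot norms from decaying, so the recovery is well conditioned.
A_true, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Generate snapshots x_1, ..., x_m via x_{k+1} = A_true x_k
snapshots = [rng.standard_normal(n)]
for _ in range(m - 1):
    snapshots.append(A_true @ snapshots[-1])

X1 = np.column_stack(snapshots[:-1])  # columns x_1 ... x_{m-1}
X2 = np.column_stack(snapshots[1:])   # columns x_2 ... x_m

# One possible numerical algorithm: least squares via the pseudoinverse,
# A = X2 X1^+ (exact here when X1 has full row rank)
A_est = X2 @ np.linalg.pinv(X1)
```

The pseudoinverse fit is only one option; any method that solves $X_2 = AX_1$ in the least-squares sense would do.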

The second formulation is this: if we assume the data comes from a linear dynamical system, then we can write $$\boxed{x_2=Ax_1,\; x_3=Ax_2,\; x_4=Ax_3,\;\cdots,\; x_m=Ax_{m-1}.}$$

Now my question is: are these mathematically equivalent? And how could I convince a mixed audience of this? Thank you very much.

As @AlpUzman asked what "they" means: I meant the two formulations above used to approximate $A$.


Yes, they are equivalent. The equivalence of the condition "$x_2=Ax_1$, $x_3=Ax_2$, $x_4=Ax_3$, $\cdots$, $x_m=Ax_{m-1}$" and the condition "$X_2=AX_1$" follows directly from the definition of matrix multiplication applied to $A$ and $X_1$ of compatible dimensions.

To an audience with a solid grounding in introductory linear algebra, you can state the equivalence as a simple fact.

To an audience unfamiliar with introductory linear algebra, you may want to demonstrate the equivalence step by step by writing out the definition of matrix multiplication.

For a mixed audience, it is up to you. If I suspected that a significant fraction of them might doubt the equivalence, I would use the following notation to convey it.

$$X_1=\begin{bmatrix}x_1&x_2&x_3&\cdots& x_{m-1}\end{bmatrix}.$$ $$A\,X_1=\begin{bmatrix}Ax_1&Ax_2&Ax_3&\cdots& Ax_{m-1}\end{bmatrix}=\begin{bmatrix}x_2&x_3&x_4&\cdots& x_{m}\end{bmatrix}=X_2.$$

You and your audience may appreciate why $x_i$'s are written as column vectors.


Here is the relevant general fact about matrix multiplication. Suppose $C=\begin{bmatrix}c_1&c_2&c_3&\cdots& c_{m}\end{bmatrix}$ is an $n\times m$ matrix, where the $c_i$ are column vectors of length $n$, and suppose $A$ is a $k\times n$ matrix. Then $AC$ is the $k\times m$ matrix $AC=\begin{bmatrix}Ac_1&Ac_2&Ac_3&\cdots& Ac_{m}\end{bmatrix}$.
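A quick numerical sanity check of this column-wise fact (the dimensions below are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
k, n, m = 2, 3, 5
A = rng.standard_normal((k, n))  # a k x n matrix
C = rng.standard_normal((n, m))  # columns c_1, ..., c_m

# Multiplying all at once agrees with multiplying column by column
AC = A @ C
columnwise = np.column_stack([A @ C[:, i] for i in range(m)])
print(np.allclose(AC, columnwise))  # True
```

The same check, with $C=X_1$ and the columns relabeled as snapshots, is exactly the equivalence asked about in the question.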