Suppose $\dot{x}=Ax$, where $A \in \mathbb{R}^{2\times2}$. Determine $A$ from the initial and final states


Two experiments were done:

  1. $x(0)= [1,1]^T$ and $x(1)=[4, -2]^T$

  2. $x(0)= [1, 2]^T$ and $x(1)=[5, -2]^T$

How can I find the matrix $A$? I know how to find $e^{At}$ for $t=1$ using these $2$ experiments. If it is not possible from this data alone, what method could I use with more data? Suppose $A$ has distinct eigenvalues.


There are 3 best solutions below

---

From the given data we can write

$$x(t=1)=\exp(A)\,x(t=0)=\left[I+A+\tfrac{1}{2!}A^2+\cdots\right]x(t=0).$$

By the Cayley-Hamilton theorem, we know that we can reduce higher powers of $A$ to a linear combination of $I$ and $A$. So we know

$$x(t=1)=[\alpha_0I+\alpha_1A]x(t=0).$$

We have two pairs of initial and final values, corresponding to $4$ scalar conditions. But we have $6$ parameters ($4$ entries of the matrix plus the two coefficients $\alpha_0,\alpha_1$ of the linear combination). So the parameters cannot be determined from this data alone.

If you had one additional pair of initial and final conditions, you would have $6$ conditions and should be able to determine the coefficients uniquely.

For real systems (with noise etc.), we would collect a lot of data $\mathcal{D}=\{(x_1,\dot{x}_1),\dots,(x_N,\dot{x}_N)\}$ consisting of paired $x$ and $\dot{x}$ samples. Then we would combine them into a matrix formulation

$$[\dot{x}_1,...,\dot{x}_N]=A[x_1,...,x_N]$$

and approximate the system matrix $A$ by $\hat{A}$ using a least squares estimate.

I haven't thought it through but it should be something like

$$\hat{A}=[\dot{x}_1,...,\dot{x}_N][x_1,...,x_N]^T\left[[x_1,...,x_N][x_1,...,x_N]^T \right]^{-1}.$$
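This least-squares estimate can be sketched in NumPy. The data here are synthetic (a hypothetical known `A_true` plus small noise, just to have something to fit); `np.linalg.lstsq` applied to the transposed system is numerically equivalent to the normal-equations formula above.

```python
import numpy as np

# Hypothetical example: generate noisy (x, x_dot) samples from a known A.
rng = np.random.default_rng(0)
A_true = np.array([[0.0, 1.0],
                   [-2.0, -0.5]])
X = rng.standard_normal((2, 200))                          # states x_1..x_N as columns
Xdot = A_true @ X + 0.01 * rng.standard_normal((2, 200))   # noisy derivative samples

# Least squares: solve X^T A^T ~ Xdot^T, the transposed form of the
# normal-equations estimate given above.
A_hat = np.linalg.lstsq(X.T, Xdot.T, rcond=None)[0].T

print(np.round(A_hat, 2))
```

With enough samples, `A_hat` recovers `A_true` up to the noise level.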

---

After you obtain $e^{A\,t}$ you can use the matrix logarithm. This is not always well defined and does not have a unique solution, but often one can make some assumptions and still find a sensible answer. The easiest way to calculate it by hand is via the eigendecomposition of $e^{A\,t}$ (assuming it is diagonalizable), so

$$ e^{A\,t} = V\,D\,V^{-1}, $$

with $D$ a diagonal matrix and $V$ the matrix whose columns are the eigenvectors. The matrix logarithm can then be obtained by taking the matrix logarithm of $D$, which amounts to taking the logarithm of its diagonal elements:

$$ \log\left(e^{A\,t}\right) = A\,t = V\,\log(D)\,V^{-1}. $$

It can be noted that, even when $D$ is diagonal, the logarithm of each diagonal element is not unique: one can always add an integer multiple of $2\,\pi\,i$ to it (by Euler's formula). However, this will generally make $A$ complex, which often does not make sense for real-life systems.
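As a minimal sketch, the eigendecomposition route can be written in a few lines of NumPy. The helper `matrix_log` below is hypothetical (not a library function), and it assumes the input is diagonalizable with positive real eigenvalues so that the principal logarithm is real; here it is applied to $e^A = \begin{bmatrix}3 & 1\\ -2 & 0\end{bmatrix}$, the matrix obtained from the experiments in this question.

```python
import numpy as np

def matrix_log(M):
    """Principal matrix logarithm via eigendecomposition.

    Assumes M is diagonalizable with positive real eigenvalues,
    so the principal branch of the logarithm gives a real result.
    """
    evals, V = np.linalg.eig(M)
    return (V @ np.diag(np.log(evals)) @ np.linalg.inv(V)).real

M = np.array([[3.0, 1.0],
              [-2.0, 0.0]])   # e^A from the given data; eigenvalues 1 and 2
A = matrix_log(M)
print(A / np.log(2))          # recovers the integer matrix [[2, 1], [-2, -1]]
```

Adding $2\pi i k$ to any eigenvalue's logarithm would give another valid (complex) solution, which is the non-uniqueness mentioned above.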

---

We know that $x(1)=e^A x(0)$, so stacking the two experiments as columns gives $$ \begin{bmatrix}4 & 5 \\ -2 & -2\end{bmatrix} = e^A \begin{bmatrix}1 & 1 \\ 1 & 2\end{bmatrix}.$$

Then we can easily obtain $$e^A = \begin{bmatrix}3 & 1 \\ -2 & 0\end{bmatrix},$$ which has eigenvalues $1$ and $2$. Note that $e^A$ has the same eigenvectors as $A$, with eigenvalues $e^{\lambda_i}$ where $\lambda_i$ is an eigenvalue of $A$ (spectral mapping theorem). So $$ A = \begin{bmatrix}-1 & -1 \\ 2 & 1\end{bmatrix} \begin{bmatrix}0 & 0 \\ 0 & \ln2 \end{bmatrix} \begin{bmatrix}-1 & -1 \\ 2 & 1\end{bmatrix}^{-1} = \begin{bmatrix}2 & 1 \\ -2 & -1\end{bmatrix} \ln2.$$
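This result can be checked numerically: exponentiating the computed $A$ (here with SciPy's `expm`) should map each experimental initial state to its final state.

```python
import numpy as np
from scipy.linalg import expm

# A = ln(2) * [[2, 1], [-2, -1]], the matrix derived above.
A = np.log(2) * np.array([[2.0, 1.0],
                          [-2.0, -1.0]])
eA = expm(A)  # should equal [[3, 1], [-2, 0]]

print(eA @ np.array([1.0, 1.0]))   # experiment 1: should give [4, -2]
print(eA @ np.array([1.0, 2.0]))   # experiment 2: should give [5, -2]
```

Both products reproduce the measured final states, confirming the answer.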