Two experiments were done:
$x(0)= [1,1]^T$ and $x(1)=[4, -2]^T$
$x(0)= [1, 2]^T$ and $x(1)=[5, -2]^T$
How can I find the matrix $A$? I know how to find $e^{At}$ for $t=1$ using these $2$ experiments. If that is not possible, which method can I use to find $A$ with more data? Suppose $A$ has distinct eigenvalues.
From the given data we can write
$$x(t=1)=\exp(A)\,x(t=0)=\left[I+A+\tfrac{1}{2!}A^2+\cdots\right]x(t=0).$$
By the Cayley-Hamilton theorem, we know that we can reduce higher powers of $A$ to a linear combination of $I$ and $A$. So we know
$$x(t=1)=[\alpha_0I+\alpha_1A]x(t=0).$$
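As a quick numerical check of this reduction (the matrix $A$ below is just a hypothetical example with distinct eigenvalues, not data from the question): with distinct eigenvalues $\lambda_i$, the coefficients satisfy $\alpha_0+\alpha_1\lambda_i=e^{\lambda_i}$, which pins down $\alpha_0,\alpha_1$.

```python
import numpy as np

# Hypothetical 2x2 system matrix with distinct eigenvalues (-1 and -2)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# Matrix exponential via eigendecomposition (valid here because the
# eigenvalues are distinct, so A is diagonalizable)
lam, P = np.linalg.eig(A)
expA = (P @ np.diag(np.exp(lam)) @ np.linalg.inv(P)).real

# Cayley-Hamilton reduction: solve alpha_0 + alpha_1 * lam_i = exp(lam_i)
V = np.vander(lam, 2, increasing=True)   # rows are [1, lam_i]
a0, a1 = np.linalg.solve(V, np.exp(lam)).real

# exp(A) coincides with the linear combination alpha_0 I + alpha_1 A
print(np.allclose(expA, a0 * np.eye(2) + a1 * A))  # True
```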
We have two pairs of initial and final values, corresponding to $4$ conditions. But we have $6$ parameters: the $4$ entries of the matrix plus the two coefficients $\alpha_0,\alpha_1$ of the linear combination. So the parameters cannot be determined from this data alone.
If you had an additional pair of initial and final conditions, you would have $6$ conditions for the $6$ parameters and should be able to determine them uniquely.
For real systems (with noise etc.), we would collect a lot of data $\mathcal{D}=\{(x_1,\dot{x}_1),\dots,(x_N,\dot{x}_N)\}$ consisting of paired samples of $x$ and $\dot{x}$. Then we would combine them into a matrix formulation
$$[\dot{x}_1,...,\dot{x}_N]=A[x_1,...,x_N]$$
and approximate the system matrix $A$ by $\hat{A}$ using a least squares estimate.
I haven't thought it through, but it should be something like
$$\hat{A}=[\dot{x}_1,...,\dot{x}_N][x_1,...,x_N]^T\left[[x_1,...,x_N][x_1,...,x_N]^T \right]^{-1}.$$
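A minimal sketch of this least-squares estimate, assuming a hypothetical ground-truth matrix and synthetic noisy samples (none of these numbers come from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth system matrix we want to recover
A_true = np.array([[0.0, 1.0],
                   [-2.0, -3.0]])

# Synthetic noisy measurements with xdot = A x + noise;
# columns of X and Xdot are the individual samples
X = rng.standard_normal((2, 50))
Xdot = A_true @ X + 0.01 * rng.standard_normal((2, 50))

# Least-squares estimate A_hat = Xdot X^T (X X^T)^{-1}, computed
# stably via lstsq on the transposed problem X^T A^T = Xdot^T
A_hat = np.linalg.lstsq(X.T, Xdot.T, rcond=None)[0].T

print(np.allclose(A_hat, A_true, atol=0.1))  # True: close to A_true
```

With low noise and well-spread samples, $\hat{A}$ recovers $A$ to within the noise level; `lstsq` avoids forming the explicit inverse of $XX^T$.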