How does one solve a system of the form
$$\ddot{\vec{x}}=A\vec{x}+\vec{b},$$
where $A\in \mathbb{R}^{2\times 2}$ is a matrix, without reducing this system into four first-order equations?
I know that it is possible, and there is a trick which is related to some eigenvector equations, but I'm not sure what the trick is.
Particular (steady-state) solution: a constant $\vec{x}$ has $\ddot{\vec{x}} = \vec{0}$, so $A\vec{x} + \vec{b} = \vec{0}$ gives $\vec{x} = -A^{-1} \vec{b}$ (if $A$ is invertible).
For the homogeneous equation $\ddot{\vec{x}} = A \vec{x}$, consider solutions of the form $\vec{x}(t) = e^{\lambda t} \vec{u}$. This is a solution if and only if $\lambda^2 \vec{u} = A \vec{u}$, i.e. $\vec{u}$ is an eigenvector of $A$ for eigenvalue $\lambda^2$. If the $n \times n$ matrix $A$ has $n$ linearly independent eigenvectors $\vec{u}_j$ corresponding to nonzero eigenvalues $\mu_j$, we get the general solution (with $2n$ parameters $c_j$ and $d_j$)
$$ \vec{x}(t) = -A^{-1} \vec{b} + \sum_{j=1}^n \left(c_j e^{\sqrt{\mu_j}\,t} + d_j e^{-\sqrt{\mu_j}\,t}\right) \vec{u}_j.$$
(If $\mu_j < 0$, the square root is imaginary and the pair $e^{\pm\sqrt{\mu_j}\,t}$ can be traded for the real oscillatory solutions $\cos(\sqrt{-\mu_j}\,t)$ and $\sin(\sqrt{-\mu_j}\,t)$.)
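To see the formula in action, here is a minimal NumPy sketch that builds the general solution from the eigendecomposition and checks that it really satisfies $\ddot{\vec{x}} = A\vec{x} + \vec{b}$. The matrix $A$, vector $\vec{b}$, and coefficients $c_j$, $d_j$ below are made up purely for illustration; in practice $c_j$ and $d_j$ come from initial conditions.

```python
import numpy as np

# Illustrative data (not from the question): a 2x2 system x'' = A x + b.
A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])
b = np.array([1.0, 0.0])

# Particular (steady-state) solution: x_p = -A^{-1} b.
x_p = -np.linalg.solve(A, b)

# Eigendecomposition: mu[j] is an eigenvalue, U[:, j] the matching eigenvector.
mu, U = np.linalg.eig(A)

# Arbitrary coefficients c_j, d_j (would normally be fixed by initial data).
c = np.array([0.3, -0.5])
d = np.array([0.7, 0.2])

# Complex square root handles negative eigenvalues (oscillatory modes).
sqrt_mu = np.sqrt(mu.astype(complex))

def x(t):
    """General solution x(t) = x_p + sum_j (c_j e^{s_j t} + d_j e^{-s_j t}) u_j."""
    coeff = c * np.exp(sqrt_mu * t) + d * np.exp(-sqrt_mu * t)
    return x_p + U @ coeff

# Verify x'' = A x + b at some t via a central finite difference.
h, t0 = 1e-4, 0.8
xdd = (x(t0 + h) - 2 * x(t0) + x(t0 - h)) / h**2
residual = np.linalg.norm(xdd - (A @ x(t0) + b))
print(residual)  # should be tiny (finite-difference error only)
```

Here both eigenvalues of $A$ are negative, so `sqrt_mu` is purely imaginary and the solution is oscillatory; the same code works unchanged for positive eigenvalues.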
If there are "missing" eigenvectors (i.e. $A$ is defective) or $0$ is an eigenvalue, things get slightly more complicated: you have to throw in some solutions involving polynomials as well as exponentials.
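As a concrete illustration of the degenerate case (this matrix is my own example, not from the question): take $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ and $\vec{b} = \vec{0}$, so $0$ is a double eigenvalue with only one eigenvector. The system decouples by back-substitution:
$$\ddot{x}_2 = 0 \;\Longrightarrow\; x_2 = a + bt, \qquad \ddot{x}_1 = x_2 \;\Longrightarrow\; x_1 = c + dt + \tfrac{a}{2}t^2 + \tfrac{b}{6}t^3,$$
a purely polynomial general solution with the expected four free parameters for a $2 \times 2$ second-order system.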