How do I make sense of the fact that $x'=Ax$ is solved by $x(t)=e^{Jt}x(0)$ when $A$ is nondiagonalizable?
Suppose we have an $n\times n$ system of ODEs, $\dfrac{dx}{dt}=Ax$, where $A=VD_AV^{-1}$ is diagonalizable.
Substitute $x=Vy$,
$$ \dfrac{dx}{dt}=Ax\implies V\dfrac{dy}{dt}=AVy\implies \dfrac{dy}{dt}=V^{-1}AVy=D_Ay\\ \begin{bmatrix}y_1'\\\vdots\\y_n'\end{bmatrix}=\begin{bmatrix}\lambda_1&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&\lambda_n\end{bmatrix}\begin{bmatrix}y_1\\\vdots\\y_n\end{bmatrix}\implies y_i(t)=c_ie^{\lambda_it}\\ x(t)=Vy(t)=\begin{bmatrix}v_1&\cdots&v_n\end{bmatrix}\begin{bmatrix}c_1e^{\lambda_1t}\\\vdots\\c_ne^{\lambda_nt}\end{bmatrix}\implies x(t)=c_1v_1e^{\lambda_1t}+\cdots+c_nv_ne^{\lambda_nt}\\ x(0)=V\begin{bmatrix}c_1\\\vdots\\c_n\end{bmatrix}=Vc\implies c=V^{-1}x(0) $$ $$ x(t)=V\begin{bmatrix}e^{\lambda_1t}&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&e^{\lambda_nt}\end{bmatrix}\begin{bmatrix}c_1\\\vdots\\c_n\end{bmatrix}=V\begin{bmatrix}e^{\lambda_1t}&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&e^{\lambda_nt}\end{bmatrix}c=V\begin{bmatrix}e^{\lambda_1t}&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&e^{\lambda_nt}\end{bmatrix}V^{-1}x(0)=e^{At}x(0) $$
where $e^{At}=V\begin{bmatrix}e^{\lambda_1t}&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&e^{\lambda_nt}\end{bmatrix}V^{-1}$
I think this makes sense of why we define $e^{At}=Ve^{D_At}V^{-1}$ the way we do and how it comes into the picture, and one can also check that this agrees with the Taylor series expansion of $e^{At}$.
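As a numerical sanity check of the diagonalizable case (a sketch using NumPy and SciPy; the matrix $A$ and the time $t$ below are made-up examples, not from the question), one can compare $Ve^{D_At}V^{-1}$ against the Taylor-series-based matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

# Made-up diagonalizable example: distinct eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
t = 0.5

# Eigendecomposition A = V D V^{-1}.
eigvals, V = np.linalg.eig(A)

# e^{At} = V e^{Dt} V^{-1}: exponentiate the diagonal entrywise.
expAt_eig = V @ np.diag(np.exp(eigvals * t)) @ np.linalg.inv(V)

# scipy.linalg.expm computes e^{At} directly (series/Padé based).
expAt_ref = expm(t * A)

print(np.allclose(expAt_eig, expAt_ref))  # True
```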
When $A$ is nondiagonalizable, it can still be decomposed as $A=XJX^{-1}$, where $J=X^{-1}AX$ is called the Jordan canonical form and $X$ contains generalized eigenvectors $x_k$ satisfying $Ax_k=\lambda x_k$ or $Ax_k=\lambda x_k+x_{k-1}$, and $J=X^{-1}AX=\begin{bmatrix}J_1&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&J_s\end{bmatrix}=J_1\oplus\cdots\oplus J_s$, where each block $J_i=\begin{bmatrix}\lambda_i&1&&\\&\lambda_i&\ddots&\\&&\ddots&1\\&&&\lambda_i\end{bmatrix}$ carries a single eigenvalue $\lambda_i$ on its diagonal and $1$'s on the superdiagonal.
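To see a concrete $X$ and $J$ (a sketch using SymPy's `jordan_form`; the matrix below is a made-up nondiagonalizable example):

```python
import sympy as sp

# Made-up nondiagonalizable matrix: (lambda - 2)^2 = 0 but only one eigenvector.
A = sp.Matrix([[3, 1],
               [-1, 1]])

# jordan_form returns (X, J) with A = X J X^{-1}.
X, J = A.jordan_form()
print(J)  # a single 2x2 Jordan block with eigenvalue 2
print(sp.simplify(X * J * X.inv() - A))  # zero matrix
```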
Substitute $x=Xz$,
$$ x'=Ax\implies Xz'=AXz\implies z'=X^{-1}AXz=Jz\\ \begin{bmatrix}z'_1\\\vdots\\z'_n\end{bmatrix}=\begin{bmatrix}J_1&\cdots&0\\\vdots&\ddots&\vdots\\0&\cdots&J_s\end{bmatrix}\begin{bmatrix}z_1\\\vdots\\z_n\end{bmatrix} $$ I don't see how to get anywhere with this!
In this case, how do I make sense of the fact that $x(t)=e^{Jt}x(0)$ solves $x'=Ax$, similarly to what we obtained above?
The solution of the equation
$$ x'(t) = A x(t) $$
is
$$ x(t) = e^{t A } x(0) $$ whether or not $A$ is diagonalizable. This can easily be seen by differentiating the solution. (Added: I give a proof below.) Note that $e^{tA}$ and $A$ commute with each other, so there is no problem there.
The Jordan canonical form is a way to compute the matrix exponential. But that is not needed here.
If there is an invertible matrix $V$ such that
$$ A = V D V^{-1} $$
then
$$ (t A)^n = t^n V D^n V^{-1} $$
and so
$$ \exp{(tA) } = V e^{tD} V^{-1}. $$
Note that $D$ can be any matrix for this identity to hold. The point is that when $D$ is in Jordan block form, the computation of $e^{tD}$ is greatly simplified.
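To see why the computation simplifies: a single Jordan block is $J=\lambda I+N$ with $N$ nilpotent ($N^n=0$), and since $\lambda I$ and $N$ commute, $e^{tJ}=e^{\lambda t}\bigl(I+tN+\tfrac{t^2}{2!}N^2+\cdots+\tfrac{t^{n-1}}{(n-1)!}N^{n-1}\bigr)$, a finite sum. A sketch checking this closed form (NumPy/SciPy; the block size, $\lambda$, and $t$ below are arbitrary example values):

```python
import math
import numpy as np
from scipy.linalg import expm

lam, t, n = 2.0, 0.3, 3  # made-up eigenvalue, time, and block size

# n x n Jordan block J = lam*I + N, with N the nilpotent superdiagonal shift.
N = np.diag(np.ones(n - 1), k=1)
J = lam * np.eye(n) + N

# Closed form: e^{tJ} = e^{lam t} * sum_{k<n} (tN)^k / k!  (finite since N^n = 0).
S = sum(np.linalg.matrix_power(t * N, k) / math.factorial(k) for k in range(n))
closed = np.exp(lam * t) * S

print(np.allclose(closed, expm(t * J)))  # True
```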
Added
I will show here that the exponential solves the differential equation. In fact I will show something slightly more general, namely:
Let $B(t)$ be a differentiable matrix valued function. Provided, $B'(t)$ commutes with $B(t)$ (for all $t$ in the needed interval) then we have
$$ \frac{d}{dt} e^{B(t)} = B'(t) e^{B(t)} $$
Proof. -
\begin{align} \frac{d}{dt} [B(t)]^n &= B'(t) [B(t)]^{n-1} + B(t) B'(t) [B(t)]^{n-2} + \cdots + [B(t)]^{n-1} B'(t) \\ &= n B'(t) [B(t)]^{n-1} \end{align}
where the last equality stems from commutativity. Hence
\begin{align} \frac{d}{dt} e^{B(t)} &= \frac{d}{dt}\sum_{n=0}^{\infty} \frac{[B(t)]^n}{n!} \\ &= \sum_{n=0}^{\infty} \frac{n B'(t) [B(t)]^{n-1}}{n!} \\ &= B'(t) \sum_{n=1}^{\infty} \frac{[B(t)]^{n-1}}{(n-1)!} \\ &= B'(t) e^{B(t)}. \end{align}
Now, to prove the claim, simply take $B(t) = tA$; then $B'(t)=A$ commutes with $B(t)$, and we get $\frac{d}{dt} e^{tA} = A e^{tA}$, so $x(t)=e^{tA}x(0)$ solves $x'=Ax$. $\square$
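A quick numerical illustration of the proof's conclusion for a nondiagonalizable $A$ (a sketch; the Jordan-block matrix, $t$, and step size below are made-up values): the central difference of $t\mapsto e^{tA}$ should match $Ae^{tA}$.

```python
import numpy as np
from scipy.linalg import expm

# Made-up nondiagonalizable example: a single 2x2 Jordan block, eigenvalue 2.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
t, h = 0.7, 1e-6

# Central-difference approximation of (d/dt) e^{tA}.
deriv = (expm((t + h) * A) - expm((t - h) * A)) / (2 * h)

# Should agree with A e^{tA} up to O(h^2) discretization error.
print(np.allclose(deriv, A @ expm(t * A), atol=1e-5))  # True
```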