This question is taken from a past exam for Gilbert Strang's linear algebra course (Spring 1998).
Find a complete set of eigenvalues and eigenvectors for $\mathbf{A}=\begin{bmatrix}2&1&1\\1&2&1\\1&1&2\end{bmatrix}$. Write $\mathbf{b}=\begin{bmatrix}2&0&1\end{bmatrix}^\intercal$ as a linear combination of the eigenvectors and solve for $\mathbf{A}^{100}\mathbf{b}$.
The first two parts are easy enough; I find $\lambda_i=1,1,4$ with corresponding eigenvectors $$\mathbf{x}_i=\begin{bmatrix}-1\\1\\0\end{bmatrix},\begin{bmatrix}-1\\0\\1\end{bmatrix},\begin{bmatrix}1\\1\\1\end{bmatrix}$$ Then $\mathbf{b}$ can be obtained by the combination $\mathbf{x}_3-\mathbf{x}_1$.
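(As a quick numerical sanity check of the eigenpairs and the combination $\mathbf{b}=\mathbf{x}_3-\mathbf{x}_1$, here is a short sketch assuming NumPy is available; this is just verification, not part of the exam solution.)

```python
import numpy as np

# The matrix from the question
A = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]])

# The claimed eigenvectors
x1 = np.array([-1, 1, 0])
x2 = np.array([-1, 0, 1])
x3 = np.array([1, 1, 1])

# A x_i should equal lambda_i x_i with lambda = 1, 1, 4
assert np.allclose(A @ x1, 1 * x1)
assert np.allclose(A @ x2, 1 * x2)
assert np.allclose(A @ x3, 4 * x3)

# b = (2, 0, 1)^T should be x3 - x1
b = np.array([2, 0, 1])
assert np.allclose(x3 - x1, b)
```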
The last part is what I'm not sure about. Letting $\mathbf{X}=\begin{bmatrix}\mathbf{x}_1&\mathbf{x}_2&\mathbf{x}_3\end{bmatrix}$, I can write $$\mathbf{A}=\mathbf{X}\begin{bmatrix}1&0&0\\0&1&0\\0&0&4\end{bmatrix}\mathbf{X}^{-1}$$ and so $$\mathbf{A}^{100}=\mathbf{X}\begin{bmatrix}1&0&0\\0&1&0\\0&0&4^{100}\end{bmatrix}\mathbf{X}^{-1}$$ Now, this being a time-sensitive exam question, I get a little suspicious that carrying out the multiplication of this resulting matrix by $\mathbf{b}$ will take too long, so I refer to the solution. (Note: There's a slight sign difference relative to the eigenvectors I use; the solution instead uses $\mathbf{y}_1=-\mathbf{x}_1$ so that $\mathbf{b}=\mathbf{x}_3\color{red}+\mathbf{y}_1$.) It simply says
$$\mathbf{A}^{100}\mathbf{b}=4^{100}\mathbf{x}_3+1^{100}\mathbf{y}_1=\begin{bmatrix}4^{100}+1\\4^{100}-1\\4^{100}\end{bmatrix}$$
which seems to be completely circumventing multiplication by $\mathbf{X}$ and its inverse.
It's not immediately clear to me why I can do something like this: if $\mathbf{x}$ is an eigenvector of $\mathbf{A}=\mathbf{X\Lambda X}^{-1}$ with eigenvalue $\lambda$, then for $k\in\mathbb{N}$, $$\mathbf{A}^k\mathbf{x}=\mathbf{X}\mathbf{\Lambda}^k\mathbf{X}^{-1}\mathbf{x}=\lambda^k\mathbf{x}$$ Does it have something directly to do with $\mathbf{x}$ being in the column space of $\mathbf{X}$?
If $\lambda$ is an eigenvalue of $A$ corresponding to eigenvector $x$, then $\lambda^n$ is an eigenvalue of $A^n$ corresponding to the same eigenvector $x$. This can be proved by induction. The case $n=1$ is just the original assumption. Now assume $A^{n-1}x=\lambda^{n-1}x$. Then $$A^nx=A(A^{n-1}x)=A(\lambda^{n-1}x)=\lambda^{n-1}Ax=\lambda^{n-1}\lambda x=\lambda^n x.$$ Note that diagonalizability plays no role here: $A^k$ acts on each eigenvector by scalar multiplication, so no multiplication by $X$ or $X^{-1}$ is ever needed.
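(A quick numerical illustration of this lemma, sketched with NumPy and the eigenpair $x_3$, $\lambda=4$ from the question:)

```python
import numpy as np

# Matrix and eigenpair from the question: A x = 4 x
A = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]])
x = np.array([1, 1, 1])
lam = 4

# A^n x should equal lambda^n x for each n
for n in range(1, 10):
    assert np.allclose(np.linalg.matrix_power(A, n) @ x, lam**n * x)
```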
Applying this to your problem: \begin{align*} A^{100}b&=A^{100}(x_3-x_1)\\ &=A^{100}x_3-A^{100}x_1\\ &=4^{100}x_3-1^{100}x_1\\ &=4^{100}\begin{bmatrix}1\\1\\1\end{bmatrix}-\begin{bmatrix}-1\\1\\0\end{bmatrix}\\ &=\begin{bmatrix}4^{100}+1\\4^{100}-1\\4^{100} \end{bmatrix}. \end{align*}
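(You can even confirm the answer exactly, without the eigenvector shortcut, by brute force. A NumPy float computation would lose precision near $4^{100}$, so the sketch below uses plain Python lists, whose integer arithmetic is arbitrary-precision, and applies $A$ to $b$ one hundred times.)

```python
# Exact brute-force computation of A^100 b using Python's
# arbitrary-precision integers (floats cannot represent 4**100 exactly).
A = [[2, 1, 1],
     [1, 2, 1],
     [1, 1, 2]]
b = [2, 0, 1]

v = b
for _ in range(100):
    # matrix-vector product v := A v
    v = [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

# Matches the closed form from the eigenvector argument
assert v == [4**100 + 1, 4**100 - 1, 4**100]
```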