Show that an $n\times n$ matrix $A$ is similar to the companion matrix for $p_A(t)$ if and only if there exists a vector $x$ such that $$x, Ax, \ldots, A^{n-1}x$$ is a basis for $\mathbb C^n$.
The companion matrix for $p(t) = t^n + a_{n-1}t^{n-1} + \cdots + a_1t + a_0$ is given by $$A_p = [e_2 \;|\; e_3 \;|\; \cdots \;|\; e_n \;|\; -{\bf a}]\;\;,\;\; { \bf a} = (a_0, a_1, \ldots , a_{n-1})^T$$
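For concreteness, here is a small numerical illustration of this definition, assuming NumPy; the cubic $p(t)=t^3-6t^2+11t-6$ is an arbitrary example of my own choosing. The characteristic polynomial of the companion matrix recovers $p$:

```python
import numpy as np

# Illustrative example (my choice, not from the question):
# p(t) = t^3 - 6t^2 + 11t - 6, so a = (a_0, a_1, a_2) = (-6, 11, -6).
a = np.array([-6.0, 11.0, -6.0])
n = len(a)

# Build A_p = [e_2 | e_3 | ... | e_n | -a].
A_p = np.zeros((n, n))
A_p[1:, :-1] = np.eye(n - 1)   # columns e_2, ..., e_n
A_p[:, -1] = -a                # last column is -a

# np.poly returns the characteristic polynomial's coefficients,
# highest degree first; here that is [1, -6, 11, -6], matching p.
print(np.poly(A_p))
```

Note the ordering convention: `np.poly` lists coefficients from $t^n$ down to the constant term, the reverse of the vector $\bf a$ above.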
My attempt: I tried to prove the backwards direction because I had a little bit of an idea on what to do...
Because $x, Ax, \ldots, A^{n-1}x$ is a basis for $\mathbb C^n$, these $n$ vectors span $\mathbb C^n$ and are linearly independent. Hence, we may form the $n\times n$ matrix $$S = [x \;|\; Ax \;|\; \cdots \;|\; A^{n-1}x],$$ which, by the equivalent conditions for a nonsingular matrix, is invertible because its columns are linearly independent. So $S^{-1}$ exists.
Questions: How can I show that $S^{-1}AS = A_p$, i.e. that $A = SA_pS^{-1}$? My method might be a lost cause unfortunately, so I apologize if this question is somewhat dumb. I could probably show it by multiplying out $S^{-1}AS$ and showing it equals $A_p$, but I'm not entirely sure what $S^{-1}$ looks like in terms of the columns I've defined.
Thus, the more important question: what does the inverse of $S$ look like exactly, given its columns? Is it even possible to say?
You need to notice that by Cayley-Hamilton you have $p_A(A)=0$, that is $$\tag{1} A^n+a_{n-1}A^{n-1}+\cdots+a_1A+a_0I=0, $$ from where you get $$ A^n=-a_{n-1}A^{n-1}-\cdots-a_1A-a_0I.$$
Now compute that $$AS=\begin{bmatrix}Ax&A^2x&\cdots&A^{n}x\end{bmatrix}=SA_p.$$ Indeed, for $k<n$ the $k$th column of $SA_p$ is $Se_{k+1}=A^kx$, and the last column is $-a_0x-a_1Ax-\cdots-a_{n-1}A^{n-1}x=A^nx$ by $(1)$.
Since $S$ is invertible and $SA_p=AS$, we get that $SA_pS^{-1}=A$.
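As a numerical sanity check of the identity $AS=SA_p$ (a sketch assuming NumPy; the dimension, seed, and the use of a random matrix are my choices, relying on the fact that a random $x$ is almost surely a cyclic vector for a random $A$):

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen arbitrarily
n = 4

# A random x is almost surely a cyclic vector for a random A;
# this is an assumption of the experiment, not part of the proof.
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# S = [x | Ax | A^2 x | ... | A^{n-1} x]
S = np.column_stack([np.linalg.matrix_power(A, k) @ x for k in range(n)])

# Companion matrix of p_A: np.poly(A) gives [1, a_{n-1}, ..., a_1, a_0].
coeffs = np.poly(A).real       # real matrix => real characteristic polynomial
a = coeffs[1:][::-1]           # reorder to (a_0, a_1, ..., a_{n-1})
A_p = np.zeros((n, n))
A_p[1:, :-1] = np.eye(n - 1)
A_p[:, -1] = -a

print(np.allclose(A @ S, S @ A_p))                  # AS = S A_p
print(np.allclose(np.linalg.inv(S) @ A @ S, A_p))   # so S^{-1} A S = A_p
```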
For the converse you need to assume that $A=SA_pS^{-1}$ for some invertible matrix $S$. Take $x=Se_1$. You can quickly check that $A_p^ke_1=e_{k+1}$, for $k=0,\ldots,n-1$. So $$ e_1,A_pe_1,A_p^2e_1,\ldots,A_p^{n-1}e_1 $$ are precisely the standard basis vectors $e_1,\ldots,e_n$, and in particular are linearly independent. If $$c_0x+c_1Ax+c_2A^2x+\cdots+c_{n-1}A^{n-1}x=0,$$ then, since $A^kx=SA_p^ke_1$, we have $$ c_0Se_1+c_1SA_pe_1+c_2SA_p^2e_1+\cdots+c_{n-1}SA_p^{n-1}e_1=0; $$ multiplying on the left by $S^{-1}$ we get $$ c_0e_1+c_1A_pe_1+c_2A_p^2e_1+\cdots+c_{n-1}A_p^{n-1}e_1=0, $$ which forces $c_0=c_1=\cdots=c_{n-1}=0$. So $x,Ax,\ldots,A^{n-1}x$ are linearly independent, and being $n$ vectors in $\mathbb C^n$ they form a basis.
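The converse can be sanity-checked numerically as well. The sketch below (assuming NumPy; the polynomial $t^4-1$ and the random $S$ are arbitrary choices of mine) builds $A=SA_pS^{-1}$, takes $x=Se_1$, and confirms that $[x\,|\,Ax\,|\,\cdots\,|\,A^{n-1}x]$ coincides with $S$, hence is invertible:

```python
import numpy as np

rng = np.random.default_rng(1)  # seed chosen arbitrarily
n = 4

# A hypothetical companion matrix, here for p(t) = t^4 - 1,
# i.e. a = (a_0, a_1, a_2, a_3) = (-1, 0, 0, 0).
a = np.array([-1.0, 0.0, 0.0, 0.0])
A_p = np.zeros((n, n))
A_p[1:, :-1] = np.eye(n - 1)
A_p[:, -1] = -a

S = rng.standard_normal((n, n))   # a random matrix is almost surely invertible
A = S @ A_p @ np.linalg.inv(S)
x = S[:, 0]                       # x = S e_1

# K = [x | Ax | ... | A^{n-1} x]; the argument above gives K = S columnwise,
# since A^k x = S A_p^k e_1 = S e_{k+1}.
K = np.column_stack([np.linalg.matrix_power(A, k) @ x for k in range(n)])
print(np.allclose(K, S))          # K equals S up to rounding, so it is invertible
```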