Eigenvectors are the vectors that, when multiplied by the matrix, give the same result as multiplying them by a constant (the eigenvalue): $$ A \vec{v} = \lambda \vec{v} $$ If you gather the eigenvectors as the columns of a matrix $Q$, and place the corresponding eigenvalues on the diagonal of a matrix $\Lambda$, you can write this as: $$ A Q = Q \Lambda \\ A = Q \Lambda Q^{-1} $$
My question is: is $Q$ guaranteed to be invertible?
You ought to be a bit careful about saying "all the eigenvectors". There are infinitely many of them, since any nonzero scalar multiple of an eigenvector is again an eigenvector. But a maximal, linearly independent subset of them works.
Say $A$ is $n\times n$. If $A$ has $n$ distinct eigenvalues, then the collection of $n$ eigenvectors that make up the columns of $Q$ must necessarily be linearly independent, so $Q$ is invertible.
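As an illustration of the distinct-eigenvalue case, here is a quick numerical check (the particular matrix is my own choice, and `np.linalg.eig` returns exactly the $Q$ described above, with eigenvectors as columns):

```python
import numpy as np

# An example 2x2 matrix with two distinct eigenvalues (2 and 3).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are corresponding eigenvectors -- the Q from the question.
eigvals, Q = np.linalg.eig(A)

# Distinct eigenvalues, so Q is invertible and A = Q Λ Q^{-1} holds.
Lam = np.diag(eigvals)
print(np.allclose(A, Q @ Lam @ np.linalg.inv(Q)))  # True
```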
If $A$ doesn't have $n$ distinct eigenvalues, then it may still be possible to find $n$ linearly independent eigenvectors. But there is no guarantee.
For instance, consider the matrix $$ \pmatrix{1&1\\0&1} $$ It has $1$ as its only eigenvalue, and it's impossible to find $2$ linearly independent eigenvectors: the only eigenvectors lie in the span of $\left(\begin{smallmatrix}1\\0\end{smallmatrix}\right)$. So in this case you cannot find such a $Q$.
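You can verify this failure computationally: the eigenvectors for $\lambda = 1$ form the null space of $A - I$, and that null space is only $1$-dimensional (a sketch, using NumPy's rank computation):

```python
import numpy as np

# The defective matrix from the example above.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0  # its only eigenvalue

# Eigenvectors for λ = 1 are the null space of A - λI = [[0,1],[0,0]].
# Its rank is 1, so the null space is 1-dimensional: only one linearly
# independent eigenvector, not the 2 needed for an invertible Q.
print(np.linalg.matrix_rank(A - lam * np.eye(2)))  # 1
```

By the rank-nullity theorem, a rank of $1$ for a $2\times 2$ matrix means the eigenspace has dimension $2 - 1 = 1$.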