Can we find/express all matrices $\bf P$ such that $${\bf P}^2={\bf P}+{\bf I}\,?$$
Own work:
The eigenvalues must then satisfy
$$\lambda^2-\lambda-1=0$$ In other words: $$\lambda_1 = \frac{1-\sqrt{5}}2\\\lambda_2 = \frac{1+\sqrt{5}}2$$
So as long as we construct a matrix with any combination of such eigenvalues, we shall be fine.
$${\bf P =SDS}^{-1},\\ {\bf D} = \text{diag}(\text{AnyCombinationOf}\{\lambda_1,\lambda_2\})$$ for any invertible matrix $\bf S$. In my experiments this does give numerically reasonable results for many such generated $\bf P$ of sizes $2$ to $6$. These numerical experiments do not, however, rule out the existence of matrices $\bf P$ that do not fit this description.
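A minimal sketch of such an experiment (NumPy; the seed and variable names are illustrative, not from the original post):

```python
import numpy as np

# Build P = S D S^{-1} with D a random mix of the two roots of
# x^2 - x - 1 and S a random (almost surely invertible) matrix,
# then check that P^2 - P - I vanishes up to rounding error.
rng = np.random.default_rng(0)
l1 = (1 - np.sqrt(5)) / 2
l2 = (1 + np.sqrt(5)) / 2

for n in range(2, 7):  # sizes 2..6, as in the question
    D = np.diag(rng.choice([l1, l2], size=n))
    S = rng.standard_normal((n, n))
    P = S @ D @ np.linalg.inv(S)
    print(n, np.linalg.norm(P @ P - P - np.eye(n)))  # residual ~1e-14
```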
Does this description or representation capture all possible such matrices, or do we miss some? How can this be proved?
Yes, every such matrix will fit that description. In particular, if $P^2 - P - I = 0$, then the minimal polynomial of $P$ must divide $x^2 - x - 1 = (x - \lambda_1)(x - \lambda_2)$. So, the minimal polynomial of $P$ has distinct (i.e. non-repeating) roots. It follows that $P$ is indeed diagonalizable.
The only tricky fact here is that a matrix is diagonalizable if and only if its minimal polynomial has non-repeating roots. For a reference, see Chapter 3 of Horn and Johnson's *Matrix Analysis*.
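As a concrete sanity check (my own example, not part of the argument above), the Fibonacci matrix $Q = \begin{pmatrix}1&1\\1&0\end{pmatrix}$ satisfies $Q^2 = Q + I$, and its eigenvalues are exactly $\lambda_1$ and $\lambda_2$:

```python
import numpy as np

# The Fibonacci matrix Q satisfies Q^2 = Q + I, and its eigenvalues
# are (1 + sqrt(5))/2 and (1 - sqrt(5))/2, as the argument predicts.
Q = np.array([[1, 1],
              [1, 0]])
print(np.array_equal(Q @ Q, Q + np.eye(2, dtype=int)))  # True
print(np.linalg.eigvals(Q))  # approx [1.618..., -0.618...]
```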
For a more interesting proof, you can show that any vector $x$ can be decomposed as $$ x = \frac{1}{\sqrt{5}}[(\lambda_2I - P)x - (\lambda_1 I - P)x]. $$ Then argue that $(\lambda_1I - P)x$ and $(\lambda_2 I - P)x$ are, whenever nonzero, eigenvectors of $P$ (with eigenvalues $\lambda_2$ and $\lambda_1$, respectively).
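A quick numerical check of this decomposition (a sketch; $P$ is generated as in the question, and the key identities are $\lambda_1+\lambda_2 = 1$, $\lambda_1\lambda_2 = -1$, and $\lambda_2-\lambda_1 = \sqrt{5}$):

```python
import numpy as np

# For P with P^2 = P + I and a generic x, v1 = (l1*I - P) x should be an
# eigenvector for l2, v2 = (l2*I - P) x an eigenvector for l1, and
# (v2 - v1) / sqrt(5) should recover x.
rng = np.random.default_rng(1)
l1 = (1 - np.sqrt(5)) / 2
l2 = (1 + np.sqrt(5)) / 2

n = 4
D = np.diag([l1, l1, l2, l2])   # both eigenvalues present
S = rng.standard_normal((n, n))
P = S @ D @ np.linalg.inv(S)    # built as in the question

x = rng.standard_normal(n)
v1 = (l1 * np.eye(n) - P) @ x
v2 = (l2 * np.eye(n) - P) @ x

print(np.linalg.norm(P @ v1 - l2 * v1))            # ~0
print(np.linalg.norm(P @ v2 - l1 * v2))            # ~0
print(np.linalg.norm(x - (v2 - v1) / np.sqrt(5)))  # ~0
```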