$A$ is not given explicitly, but it satisfies the following:
$A\begin{bmatrix}1\\1\\0\end{bmatrix}=\begin{bmatrix}1\\1\\0\end{bmatrix},A\begin{bmatrix}1\\0\\1\end{bmatrix}=-\begin{bmatrix}1\\0\\1\end{bmatrix}, A\begin{bmatrix}0\\1\\1\end{bmatrix}=2\begin{bmatrix}0\\1\\1\end{bmatrix}$
Is it possible to find an invertible matrix $P$ such that $P^{-1}AP=D=\begin{bmatrix}1&0&0\\0&-1&0\\0&0&2\end{bmatrix}$?
Do the vectors in those equations have something to do with $P$? Are they the columns of $P$? If so, why?
I'll try to give more insight and intuition than the other answers provided thus far. The key here is to have a very strong grasp of the following fact, which is easy to verify by direct computation (of course it holds for general matrices, but it's helpful to make it concrete here):
$$\begin{bmatrix}a&b&c\\d&e&f\\g&h&i\end{bmatrix}\begin{bmatrix}x\\y\\z\end{bmatrix}=x\begin{bmatrix}a\\d\\g\end{bmatrix}+y\begin{bmatrix}b\\e\\h\end{bmatrix}+z\begin{bmatrix}c\\f\\i\end{bmatrix}$$
That is, a matrix-vector product is a linear combination of the matrix's columns, weighted by the entries of the vector.
This has a wide range of consequences, one of them being that we can reveal the columns of $A$ by simply multiplying $A$ against each of the column vectors $(1,0,0)^T$, $(0,1,0)^T$ and $(0,0,1)^T$.
Suppose we form the matrix $P$ from the three column vectors in the question. It's easy to see that $P(1,0,0)^T = (1,1,0)^T$. Furthermore, (multiplying both sides on the left by $P^{-1}$) we have $(1,0,0)^T = P^{-1} (1,1,0)^T$.
[If you think about it, $P$ acts as a change of basis matrix converting from "basis-space" to the standard basis, and $P^{-1}$ is the reverse transformation, taking a vector and returning its components with respect to the basis.]
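To see this concretely, here is a small numerical sketch (using NumPy; the matrix $P$ is assumed to have the three given eigenvectors as its columns, in the order they appear in the question):

```python
import numpy as np

# P's columns are the three vectors from the question, in order.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]], dtype=float)

e1 = np.array([1.0, 0.0, 0.0])

# Multiplying P by the first standard basis vector picks out
# P's first column...
print(P @ e1)  # the column (1, 1, 0)^T

# ...and applying P^{-1} to that column sends it back to e1
# (solve(P, v) computes P^{-1} v without forming the inverse).
print(np.linalg.solve(P, P @ e1))  # back to (1, 0, 0)^T
```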
Now let's compute $P^{-1} A P$. By the prior observation we can find this matrix one column at a time, by taking, say, $x=(1,0,0)^T$ and computing $(P^{-1}AP)x$. By associativity this is the same as $P^{-1}(A(Px))$. We have $Px=(1,1,0)^T$; the hypothesis on $A$ gives $A(1,1,0)^T=(1,1,0)^T$; and we saw above that $P^{-1}(1,1,0)^T=(1,0,0)^T$.
So since we computed that $(P^{-1}AP)(1,0,0)^T = (1,0,0)^T$, we in fact know the first column of the matrix $P^{-1}AP$, and it indeed matches the first column of $D$.
Now, can you compute the other two columns of $P^{-1}AP$ using these ideas?
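If you want to sanity-check the whole argument numerically, here is a sketch (using NumPy). Since $A$ is not given explicitly, it is reconstructed here as $PDP^{-1}$, which is one matrix satisfying the three eigenvector equations in the question:

```python
import numpy as np

# Eigenvectors as columns of P, matching eigenvalues on the
# diagonal of D -- the ordering must agree between the two.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]], dtype=float)
D = np.diag([1.0, -1.0, 2.0])

# Reconstruct a matrix A satisfying the given equations:
# A p_i = lambda_i p_i for each column p_i of P.
A = P @ D @ np.linalg.inv(P)

# Check the eigenvector equations from the question.
print(A @ np.array([1.0, 1.0, 0.0]))  # equals  1 * (1, 1, 0)^T
print(A @ np.array([1.0, 0.0, 1.0]))  # equals -1 * (1, 0, 1)^T
print(A @ np.array([0.0, 1.0, 1.0]))  # equals  2 * (0, 1, 1)^T

# And conjugating A by P recovers D, column by column.
print(np.linalg.inv(P) @ A @ P)
```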