I have a system
$$ \begin{cases} x_1'(t) = -x_2(t) \\ x_2'(t) = -x_1(t) \end{cases} $$
I have found the linearly independent solutions
$$ \begin{split} \vec{x}(t) &= e^{\lambda_1 t} \vec{v}_1 = e^t \begin{pmatrix}-1 \\ 1\end{pmatrix}, \\ \vec{x}(t) &= e^{\lambda_2 t} \vec{v}_2 = e^{-t} \begin{pmatrix}1 \\ 1\end{pmatrix}. \end{split} $$
and the general solution
$$ \vec{x}(t) = c_1 e^t \begin{pmatrix}-1 \\ 1\end{pmatrix} + c_2 e^{-t} \begin{pmatrix}1 \\ 1\end{pmatrix} $$
Is it correct that both the linearly independent solutions and the general solution are all named $\vec{x}(t)$, or should the linearly independent solutions be named $\vec{x}_1(t)$ and $\vec{x}_2(t)$?
Now I have to find the solution for which $x_1(0) = -1$ and $x_2(0) = 1$. How can I do this? I know I have to choose the correct values of $c_1$ and $c_2$ in order to fulfil these conditions, but I don't know where to substitute $x_1(0) = -1$ and $x_2(0) = 1$.
In my system I have $x_1'(t)$ and $x_2'(t)$, but these are not vector functions, although I have only found vector functions as solutions. Is it because my general solution $\vec{x}(t)$ consists of the component functions $x_1(t) = -c_1 e^t + c_2 e^{-t}$ and $x_2(t) = c_1 e^t + c_2 e^{-t}$? In that case I just have to find $c_1$ and $c_2$ such that $x_1(0) = -c_1 e^0 + c_2 e^0 = -c_1 + c_2 = -1$ and $x_2(0) = c_1 e^0 + c_2 e^0 = c_1 + c_2 = 1$, resulting in $c_1 = 1$ and $c_2 = 0$. Is that right?
Apart from some confusing notation with the indices $(1,2)$ (better to replace them with, e.g., $(a,b)$, since $x_1, x_2$ already denote the components), you've got everything right. About the initial conditions: evaluating the general solution at $t = 0$ gives a linear system for the coefficients,
$$ \vec{x}(t) = c_1 e^t \begin{pmatrix}-1 \\ 1\end{pmatrix} + c_2 e^{-t} \begin{pmatrix}1 \\ 1\end{pmatrix} \quad \Longrightarrow \quad \vec{x}(0) = c_1 \begin{pmatrix}-1 \\ 1\end{pmatrix} + c_2 \begin{pmatrix}1 \\ 1\end{pmatrix} = \begin{pmatrix}-1 & 1\\ 1 &1 \end{pmatrix}\begin{pmatrix}c_1 \\ c_2\end{pmatrix} $$
so
$$ \begin{pmatrix}-1 & 1\\ 1 &1 \end{pmatrix}\begin{pmatrix}c_1 \\ c_2\end{pmatrix} = \begin{pmatrix} x_1(0) \\ x_2(0) \end{pmatrix} = \begin{pmatrix} -1 \\ 1 \end{pmatrix} \quad \Longrightarrow \quad \begin{pmatrix}c_1 \\ c_2\end{pmatrix} = \begin{pmatrix}-1 & 1\\ 1 & 1 \end{pmatrix}^{-1} \begin{pmatrix} -1 \\ 1 \end{pmatrix} $$
The inverse of an almost orthogonal matrix is almost its transpose: here the coefficient matrix $M$ is symmetric and $M^2 = 2I$, so $M^{-1} = \tfrac{1}{2}M$. Therefore:
$$ \begin{pmatrix}c_1 \\ c_2\end{pmatrix} = \frac{1}{2} \begin{pmatrix}-1 & 1\\ 1 & 1 \end{pmatrix} \begin{pmatrix} -1 \\ 1 \end{pmatrix} \quad \Longrightarrow \quad \begin{pmatrix}c_1 \\ c_2\end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} $$
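If you want to double-check this numerically, here is a short NumPy sketch (the names `M`, `A`, `x0` are mine, not from the problem statement): it solves the $2\times 2$ system for $(c_1, c_2)$ and verifies that the resulting $\vec{x}(t)$ satisfies $\vec{x}' = A\vec{x}$ at a sample time.

```python
import numpy as np

# Columns of M are the eigenvectors v1 = (-1, 1) and v2 = (1, 1).
M = np.array([[-1.0, 1.0],
              [ 1.0, 1.0]])

x0 = np.array([-1.0, 1.0])        # initial condition (x1(0), x2(0))

c = np.linalg.solve(M, x0)        # solve M c = x(0)
print(c)                          # [1. 0.], i.e. c1 = 1, c2 = 0

# M is symmetric with M @ M = 2 I, so its inverse is M / 2.
assert np.allclose(np.linalg.inv(M), M / 2)

# Check that x(t) = c1 e^t v1 + c2 e^{-t} v2 satisfies x' = A x,
# where A is the coefficient matrix of the system x1' = -x2, x2' = -x1.
A = np.array([[ 0.0, -1.0],
              [-1.0,  0.0]])
t = 0.7                           # arbitrary sample time
x    = c[0] * np.exp(t) * M[:, 0] + c[1] * np.exp(-t) * M[:, 1]
xdot = c[0] * np.exp(t) * M[:, 0] - c[1] * np.exp(-t) * M[:, 1]
assert np.allclose(xdot, A @ x)
```

With $c_2 = 0$ the particular solution reduces to $\vec{x}(t) = e^t \begin{pmatrix}-1 \\ 1\end{pmatrix}$, which indeed equals $(-1, 1)^T$ at $t = 0$.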