The exercise is: show that, for every $\theta \in \mathbb{R}$, the matrix $$ A = \begin{pmatrix} \cos(\theta) & \sin(\theta) \\ \sin(\theta) & -\cos(\theta) \end{pmatrix} $$ always has an eigenvector in $\mathbb{R}^2$.
The use of determinants isn't allowed; I may only use the definition of eigenvector and eigenvalue, and (if needed) the fact that eigenvectors corresponding to distinct eigenvalues are linearly independent.
I don't fully understand the problem, because I can't see how it is possible that this matrix sends some vector to a scalar multiple of itself.
AFTERTHOUGHT: it occurs to me that there is an argument that does not use determinants, although the concept is implicit. As you can easily confirm, we have a matrix $A$ such that $\color{red}{A^2 = I}.$ Now, if you are willing to accept the proposition that every square matrix has an eigenvalue (possibly complex), then we can write $$ Av = \lambda v $$ for $v$ a nonzero column vector, possibly complex as well. Then $$ v = Iv = A^2 v = A(\lambda v) = \lambda Av = \lambda^2 v. $$ So, in fact, $\lambda = \pm 1.$ This argument also works for matrices that are not reflections or symmetric; consider $$ A = \left( \begin{array}{cc} 0 & 7 \\ \frac{1}{7} & 0 \end{array} \right). $$

Also, note that possession of an eigenvalue is not automatic in infinite dimension. In the space of one-sided infinite sequences with, say, complex entries, the right-shift operator does not have any eigenvector at all. See http://en.wikipedia.org/wiki/Shift_operator#Sequences
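If you want to convince yourself numerically that $A^2 = I$ holds for every $\theta$, and also for the non-symmetric example, here is a quick sanity check (not part of the argument itself, just pure-Python arithmetic with a hand-rolled 2x2 product):

```python
import math

def matmul2(M, N):
    # product of two 2x2 matrices given as nested lists
    return [[M[0][0]*N[0][0] + M[0][1]*N[1][0], M[0][0]*N[0][1] + M[0][1]*N[1][1]],
            [M[1][0]*N[0][0] + M[1][1]*N[1][0], M[1][0]*N[0][1] + M[1][1]*N[1][1]]]

def is_identity(M, tol=1e-12):
    # check M is the 2x2 identity up to floating-point noise
    return all(abs(M[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(2) for j in range(2))

# A(theta) from the problem squares to the identity for every theta tried
for theta in [0.0, 0.3, math.pi/4, 2.0, -1.7]:
    c, s = math.cos(theta), math.sin(theta)
    A = [[c, s], [s, -c]]
    assert is_identity(matmul2(A, A))

# the non-symmetric example with entries 7 and 1/7 also squares to the identity
B = [[0.0, 7.0], [1.0/7.0, 0.0]]
assert is_identity(matmul2(B, B))
```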
ORIGINAL: People seem confused. This matrix gives a REFLECTION. The determinant is $-1$ and the trace is $0,$ so, no matter what $\theta$ might be, the characteristic polynomial is $$ \lambda^2 - 1 $$ and the eigenvalues are $1$ and $-1.$ There is no counterexample.
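The trace and determinant claims are easy to confirm by direct computation (again just a numeric sanity check, not part of the proof):

```python
import math

# for A = [[cos t, sin t], [sin t, -cos t]]:
# trace = cos t - cos t = 0, det = -cos^2 t - sin^2 t = -1, for every t
for theta in [0.0, 0.5, math.pi/3, 2.2, -4.0]:
    c, s = math.cos(theta), math.sin(theta)
    trace = c + (-c)
    det = c*(-c) - s*s
    assert abs(trace) < 1e-12
    assert abs(det + 1.0) < 1e-12
```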
It saves a little writing if you write the matrix as $$ \left( \begin{array}{cc} a & b \\ b & -a \end{array} \right) $$ with the understanding that $$ a^2 + b^2 = 1. $$ So, the $+1$ eigenvector is a column vector sent to the zero vector by $$ \left( \begin{array}{cc} a -1 & b \\ b & -a -1 \end{array} \right). $$ Think about it. With $a^2 + b^2 = 1,$ why is the matrix immediately above singular? What is the actual eigenvector in terms of $a,b?$
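If you want to check a guess for that eigenvector numerically: the candidate $v = (b,\, 1-a)$ is my own suggestion (it degenerates to the zero vector only when $a = 1,\ b = 0$, in which case $(1+a,\, b)$ works instead), and this sketch confirms that the matrix above sends it to zero:

```python
import math

# candidate +1 eigenvector (b, 1 - a) -- my guess, not stated in the answer;
# it is the zero vector only when a = 1, b = 0 (theta = 0), so skip theta = 0 here
for theta in [0.1, 1.0, 2.5, -0.8]:
    a, b = math.cos(theta), math.sin(theta)
    x, y = b, 1.0 - a
    # apply the matrix (a-1  b; b  -a-1) and confirm both components vanish
    r1 = (a - 1.0)*x + b*y
    r2 = b*x + (-a - 1.0)*y
    assert abs(r1) < 1e-12
    assert abs(r2) < 1e-12
```

The second component vanishes precisely because $b^2 - (1 - a^2) = a^2 + b^2 - 1 = 0$, which is the answer to "why is the matrix singular."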
Similarly, the $-1$ eigenvector is a column vector sent to the zero vector by $$ \left( \begin{array}{cc} a +1 & b \\ b & -a +1 \end{array} \right). $$ Again, why is this singular? Oh, and since the matrix is symmetric, eigenvectors with different eigenvalues are perpendicular to each other.
You can tell that a matrix is a reflection if it can be written as $\color{red}{I - 2 v v^T},$ where the letter $v$ refers to a column vector of length $1.$ Oh, in the other order, $v^T v$ is the (squared) length of $v$: it is a 1 by 1 matrix with entry $v \cdot v.$ In the order used above, $v v^T$ is a symmetric, rank one, positive semidefinite matrix; furthermore, its trace is exactly $1.$ So, the determinant of $\color{red}{I - 2 v v^T}$ is $-1$ and its trace is $n-2,$ where $n$ is the dimension. In dimension 2, you need only check the trace and determinant to know for sure.
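To see this concretely in dimension 2, here is a small sketch that builds $I - 2vv^T$ for a unit vector $v$ and checks the trace-$0$, determinant-$-1$ claims, plus the fact that the result has exactly the $\left(\begin{smallmatrix} a & b \\ b & -a \end{smallmatrix}\right)$ shape used above (the function name `householder2` is my own label):

```python
import math

def householder2(v):
    # I - 2 v v^T for a unit column vector v = (v0, v1); assumes v0^2 + v1^2 = 1
    v0, v1 = v
    return [[1.0 - 2.0*v0*v0, -2.0*v0*v1],
            [-2.0*v0*v1, 1.0 - 2.0*v1*v1]]

phi = 0.7
v = (math.cos(phi), math.sin(phi))   # any unit vector
R = householder2(v)

# trace = n - 2 = 0 and determinant = -1, as claimed for dimension n = 2
trace = R[0][0] + R[1][1]
det = R[0][0]*R[1][1] - R[0][1]*R[1][0]
assert abs(trace) < 1e-12
assert abs(det + 1.0) < 1e-12

# and R has the (a b; b -a) shape: symmetric, with opposite diagonal entries
assert abs(R[0][1] - R[1][0]) < 1e-12
assert abs(R[0][0] + R[1][1]) < 1e-12
```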