I have an $n \times n$ Hermitian matrix $H_{n\times n}$ with only off-diagonal entries; in fact all the entries of $H$ are real (a very special case). My job is to find a matrix $O_{n\times n}$ that anti-commutes with $H$,
$O H O^{-1} = -H$, with $O^{2} = I$, so that $O$ is a unitary matrix.
There are two cases, (i) when $n$ is odd, (ii) when $n$ is even.
(i) If $H$ has nonzero entries only at positions where the sum of the indices is odd ($i+j$ odd), then we can find $O$,
e.g. $$H_{1} =\begin{bmatrix}0_{1,1} & a_{1,2} & 0_{1,3} \\a_{2,1} & 0_{2,2} & b_{2,3} \\ 0_{3,1} & b_{3,2} & 0_{3,3} \end{bmatrix}$$
$$O = \begin{bmatrix}\sigma_{z} & 0 \\ 0 & 1\end{bmatrix} = \operatorname{diag}(1,-1,1),$$ where $\sigma_{z}$ is the Pauli matrix.
If we also have an element at the even-sum sites $(1,3)$ and $(3,1)$, say $c$, then $$H_{2} =\begin{bmatrix}0_{1,1} & a_{1,2} & c_{1,3} \\a_{2,1} & 0_{2,2} & b_{2,3} \\ c_{3,1} & b_{3,2} & 0_{3,3} \end{bmatrix}$$ and then we can't find $O$.
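For concreteness, here is a quick numerical check of both examples (a sketch in numpy; the values $a=1$, $b=2$, $c=3$ are arbitrary placeholders chosen for illustration):

```python
import numpy as np

a, b, c = 1.0, 2.0, 3.0  # arbitrary placeholder values

# H1: nonzero entries only where i + j is odd (1-based indices)
H1 = np.array([[0, a, 0],
               [a, 0, b],
               [0, b, 0]])

# The proposed O = diag(sigma_z, 1) = diag(1, -1, 1)
O = np.diag([1.0, -1.0, 1.0])

print(np.allclose(O @ H1 @ np.linalg.inv(O), -H1))  # True: O anticommutes with H1

# H2: extra entry c at the even-sum sites (1,3) and (3,1)
H2 = np.array([[0, a, c],
               [a, 0, b],
               [c, b, 0]])

print(np.allclose(O @ H2 @ np.linalg.inv(O), -H2))  # False: this O no longer works
```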
The story is the same for case (ii), even $n$.
Is there a way to show this?
(My background is in physics and I'm not very good with maths; I apologize for that.) Either a sketch of a solution or a good starting point would help me.
I don't know if this will help you, but here is a necessary and sufficient condition for the existence of the matrix $O$: such an $O$ exists if and only if, for every eigenvalue $\lambda$ of $H$, $-\lambda$ is also an eigenvalue with the same multiplicity, i.e. $\dim E_{\lambda} = \dim E_{-\lambda}$, where $E_{\lambda}$ denotes the eigenspace of $\lambda$. The proof also provides a way of computing $O$ explicitly.
Let's first prove that the condition is necessary.
Assume such an $O$ exists and let $\lambda$ be a nonzero eigenvalue of $H$ with eigenvector $v \neq 0$, i.e. $Hv = \lambda v$. Then $HOv = -OHv = -\lambda Ov$, and $Ov \neq 0$ because $O$ is invertible, so $Ov$ is an eigenvector of $H$ for the eigenvalue $-\lambda$. As for the dimensions, notice that the restriction $O_{| E_{\lambda}} : E_{\lambda} \rightarrow E_{-\lambda}$ is an isomorphism between the finite dimensional vector spaces $E_{\lambda}$ and $E_{-\lambda}$ (its inverse is $O_{| E_{-\lambda}}$, since $O^2 = I$), so $\dim E_{\lambda} = \dim E_{-\lambda}$.
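Concretely, this gives a cheap numerical test of the necessary condition. A sketch in numpy, using the matrices $H_1$ and $H_2$ from the question with all entries set to $1$ (an arbitrary choice):

```python
import numpy as np

def spectrum_symmetric(H, tol=1e-9):
    # The sorted spectrum of a Hermitian H must be antisymmetric:
    # eigenvalues come in (lambda, -lambda) pairs, zeros included.
    ev = np.sort(np.linalg.eigvalsh(H))
    return np.allclose(ev, -ev[::-1], atol=tol)

# H1 from the question (a = b = 1): eigenvalues {-sqrt(2), 0, sqrt(2)}
H1 = np.array([[0., 1., 0.],
               [1., 0., 1.],
               [0., 1., 0.]])
print(spectrum_symmetric(H1))  # True: an O can exist

# H2 from the question (a = b = c = 1): eigenvalues {2, -1, -1}
H2 = np.array([[0., 1., 1.],
               [1., 0., 1.],
               [1., 1., 0.]])
print(spectrum_symmetric(H2))  # False: no O exists, matching the question
```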
Now, let's prove that this is sufficient.
Let $\{\lambda_1, -\lambda_1, ..., \lambda_r, -\lambda_r\}$ be the set of distinct nonzero eigenvalues of $H$. We have $$\Bbb C^n = E_{\lambda_1} \oplus E_{-\lambda_1} \oplus ... \oplus E_{\lambda_r} \oplus E_{-\lambda_r} \oplus \ker H$$ since $H$ is diagonalizable (we omit $\ker H$ from the direct sum if it is equal to $\{0\}$).
For each $\lambda_i$, let $\{v^1_{\lambda_i}, ..., v^{k_i}_{\lambda_i}\} \subset \Bbb C^n$ be a basis for the eigenspace $E_{\lambda_i}$ and let $\{w^1_{\lambda_i}, ..., w^{k_i}_{\lambda_i}\} \subset \Bbb C^n$ be a basis for the eigenspace $E_{-\lambda_i}$; both bases have $k_i$ elements because $\dim E_{\lambda_i} = \dim E_{-\lambda_i}$ by assumption.
Define $O : \Bbb C^n \rightarrow \Bbb C^n$ to be the linear map such that $$Ov = v \ \forall v \in \ker H \\ Ov^j_{\lambda_i} = w^j_{\lambda_i} \\ Ow^j_{\lambda_i} = v^j_{\lambda_i}$$ for all $i = 1, ..., r$ and all $j = 1, ..., k_i$.
Such a map is well defined, linear, invertible, and satisfies $O \circ O = I$.
Let $O_E \in \Bbb C^{n \times n}$ be the matrix of $O$ with respect to the canonical basis $\{e_1, ..., e_n\}$ of $\Bbb C^n$. Then $O_E$ is invertible and satisfies $O_E^2 = I$.
Furthermore, $HO_E = -O_EH$. Indeed, it suffices to check the equality on a basis of $\Bbb C^n$, and it is easy to check on the basis of $\Bbb C^n$ consisting of the vectors $v^j_{\lambda_i}$, $w^j_{\lambda_i}$ above together with any basis of $\ker H$, since those are eigenvectors of $H$ and $O$ is defined through them.
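This construction can be carried out numerically. Below is a sketch of one possible implementation in numpy (the function name and tolerance are my own choices): after diagonalizing $H$, the matrix of $O$ in the eigenbasis is just the permutation that swaps each $\lambda$-eigenvector with a $(-\lambda)$-eigenvector. On $\ker H$ it swaps kernel basis vectors in pairs rather than acting as the identity, which satisfies $O^2 = I$ just as well.

```python
import numpy as np

def anticommuting_involution(H, tol=1e-9):
    # Diagonalize the Hermitian matrix H; eigh returns eigenvalues in
    # ascending order and an orthonormal eigenbasis as columns of V.
    ev, V = np.linalg.eigh(H)
    # Necessary condition: the sorted spectrum must be antisymmetric.
    if not np.allclose(ev, -ev[::-1], atol=tol):
        return None  # no such O exists
    # In the eigenbasis, O is the permutation pairing the i-th eigenvector
    # (eigenvalue ev[i]) with the (n-1-i)-th one (eigenvalue -ev[i]).
    n = H.shape[0]
    P = np.eye(n)[::-1]  # reversal (exchange) permutation matrix
    return V @ P @ V.conj().T

# Example: H1 from the question with a = b = 1
H1 = np.array([[0., 1., 0.],
               [1., 0., 1.],
               [0., 1., 0.]])
O = anticommuting_involution(H1)
print(np.allclose(O @ H1, -H1 @ O))   # True: O H = -H O
print(np.allclose(O @ O, np.eye(3)))  # True: O^2 = I
```

Because `eigh` returns an orthonormal eigenbasis, this $O$ is moreover unitary (and Hermitian), which matches the requirement in the question that $O$ be unitary with $O^2 = I$.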
Other necessary conditions
If $H$ is any $\Bbb C^{n \times n}$ matrix satisfying $HO = -OH$ for an invertible $\Bbb C^{n \times n}$ matrix $O$, then $OHO^{-1} = -H$, so $$\det{H} = \det(-H) = (-1)^n \det{H},$$ so either $\det H$ is zero or $n$ is even.
Furthermore, $tr(OHO^{-1}) = tr(O^{-1}OH) = tr(H)$ by cyclicity of the trace, while $tr(OHO^{-1}) = tr(-H) = -tr(H)$, so $tr(H) = 0$.
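Both conditions are cheap sanity checks before attempting to construct $O$. For instance, the question's $H_2$ with $a = b = c = 1$ (an arbitrary choice) already fails the determinant condition:

```python
import numpy as np

H2 = np.array([[0., 1., 1.],
               [1., 0., 1.],
               [1., 1., 0.]])
n = H2.shape[0]

print(np.isclose(np.trace(H2), 0.0))                    # True: tr(H) = 0 holds
print(n % 2 == 0 or np.isclose(np.linalg.det(H2), 0.))  # False: n is odd but det(H2) = 2
```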
Application to the $3 \times 3$ hermitian case
For any $3 \times 3$ hermitian matrix $$H = \begin{pmatrix}a & b & c \\ \overline{b} & a & d \\ \overline{c} & \overline{d} & a \end{pmatrix}, \ a \in \Bbb R$$ we can show that the characteristic polynomial of $H$ is given by $$p_H(x) = x^3 - tr(H) x^2 + ((a^2 - \lvert d \rvert^2) + (a^2 - \lvert c \rvert^2) + (a^2 - \lvert b \rvert^2)) x - \det H$$
As mentioned, the relation $HO = -OH$ with $O$ invertible implies that $H$ has trace zero and, since $n = 3$ is odd, determinant zero.
Assuming $H$ satisfies those two conditions (for real entries, $tr(H) = 3a$ and, once $a = 0$, $\det H = 2bcd$, so this is equivalent to $a = 0$ and at least one of $b, c, d$ being zero, as Kolja mentioned), the polynomial becomes $$p_H(x) = x(x^2 - (\lvert d \rvert^2 + \lvert c \rvert^2 + \lvert b \rvert^2))$$ The spectrum $\{0, \pm\sqrt{\lvert b \rvert^2 + \lvert c \rvert^2 + \lvert d \rvert^2}\}$ is symmetric about $0$, so the condition of our result is always satisfied, and in that case Kolja showed explicitly the existence of the matrix $O$.
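As a quick check (with placeholder values $b = 2$, $d = 3$ and $a = c = 0$), the spectrum is indeed $\{0, \pm\sqrt{\lvert b \rvert^2 + \lvert d \rvert^2}\}$, symmetric about $0$:

```python
import numpy as np

b, d = 2.0, 3.0  # placeholder values
H = np.array([[0., b, 0.],
              [b, 0., d],
              [0., d, 0.]])

ev = np.sort(np.linalg.eigvalsh(H))
print(ev)                                  # approx [-3.606, 0, 3.606]
print(np.isclose(ev[-1], np.hypot(b, d)))  # True: largest eigenvalue is sqrt(b^2 + d^2)
```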
Application to the tridiagonal case
Let $H$ be an $n \times n$ real tridiagonal Toeplitz matrix $$H = \begin{pmatrix}a & b & & & \\ c & a & b & & \\ & c & a & \ddots & \\ & & \ddots & \ddots & b \\ & & & c & a \end{pmatrix}, \ a, b, c \in \Bbb R$$
The eigenvalues of such a matrix are given by $$a + 2 \sqrt{bc} \cos \left(\frac{k \pi}{n+1}\right), k = 1, ..., n$$
If $b, c \neq 0$, then the cosines above are distinct, so all the eigenvalues are distinct. In that case, if we assume $a = 0$ (so that $tr(H) = 0$ as needed), the eigenvalues $2 \sqrt{bc} \cos\left(\frac{k \pi}{n+1}\right)$ come in $\pm$ pairs, because $\cos\left(\frac{(n+1-k)\pi}{n+1}\right) = -\cos\left(\frac{k\pi}{n+1}\right)$; hence the condition of our result is always satisfied.
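A quick numerical check of the eigenvalue formula and of the spectral symmetry for $a = 0$ (a sketch; $n = 5$ and $b = c = 1$ are arbitrary choices, and $b = c$ keeps $H$ symmetric so that `eigvalsh` applies):

```python
import numpy as np

n, b, c = 5, 1.0, 1.0
H = b * np.eye(n, k=1) + c * np.eye(n, k=-1)  # tridiagonal, a = 0 on the diagonal

ev = np.sort(np.linalg.eigvalsh(H))
k = np.arange(1, n + 1)
formula = np.sort(2 * np.sqrt(b * c) * np.cos(k * np.pi / (n + 1)))

print(np.allclose(ev, formula))    # True: matches the closed form
print(np.allclose(ev, -ev[::-1]))  # True: spectrum symmetric about 0
```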
I don't think there is an easy result for the general case involving only the positions of the nonzero entries of the matrix.