Given $A\in\mathbb{R}^{n\times n}$,$B\in\mathbb{R}^{n\times m}$ and $C\in\mathbb{R}^{m\times n}$ such that $\det(j\omega I-A)\neq0$, find all real frequencies $\omega$ such that
$$\det(C(j\omega I-A)^{-1}B)=0.$$
Here $j=\sqrt{-1}$ and $I$ is an identity matrix of an appropriate size.
If $m=n$, then
\begin{align} \det(C(j\omega I-A)^{-1}B)=\det(BC(j\omega I-A)^{-1})=\det(BC)\det(j\omega I-A)^{-1}=0, \end{align} where the first equality uses $\det(XY)=\det(YX)$ for the square matrices $X=C(j\omega I-A)^{-1}$ and $Y=B$. Since $\det(j\omega I-A)^{-1}\neq0$, we get $\det(BC)=0$, which doesn't depend on $\omega$. Thus,
if $\det(BC)=0$, then $$\det(C(j\omega I-A)^{-1}B)=0\text{ for any }\omega\in(-\infty,+\infty),$$
else if $\det(BC)\neq0$, then
$$\det(C(j\omega I-A)^{-1}B)=0\text{ has no solution }\omega\in(-\infty,+\infty).$$
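A quick numerical sanity check of this $m=n$ identity (a sketch with NumPy; the matrices and test frequencies are arbitrary):

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

for w in (0.5, 1.0, 2.0):  # a few arbitrary test frequencies
    # det(C (jwI - A)^{-1} B) should equal det(BC) / det(jwI - A)
    lhs = np.linalg.det(C @ np.linalg.inv(1j * w * np.eye(n) - A) @ B)
    rhs = np.linalg.det(B @ C) / np.linalg.det(1j * w * np.eye(n) - A)
    assert np.isclose(lhs, rhs)
```

For generic random matrices $\det(BC)\neq0$, so the determinant is nonzero at every tested frequency, consistent with the "no solution" case above.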
If $m\neq n$, the identity $\det(C(j\omega I-A)^{-1}B)=\det(BC(j\omega I-A)^{-1})$ no longer holds, so this argument breaks down and I couldn't find an answer.
What you are trying to calculate are also called the transmission zeros (although you are limiting yourself to those with zero real part). They can also be calculated using the following generalized eigenvalue problem. Including the feedthrough matrix $D$ as well, this problem can be written as
$$ \begin{bmatrix} A & B \\ C & D \end{bmatrix} v = \lambda \begin{bmatrix} I & 0 \\ 0 & 0 \end{bmatrix} v, $$
with $(A,B,C,D)$ the matrices associated with the following linear time invariant state space model
\begin{align} \dot{x}(t) &= A\,x(t) + B\,u(t), \\ y(t) &= C\,x(t) + D\,u(t), \end{align}
which has the associated transfer function
$$ G(s) = C \left(s\,I - A\right)^{-1} B + D. $$
However, if you prefer, you can just set $D=0$; I only included $D$ to generalize my answer slightly.
When solving for $\lambda$, the generalized eigenvalue problem is equivalent to
$$ \det\left( \begin{bmatrix} A - \lambda\,I & B \\ C & D \end{bmatrix} \right) = 0. \tag{1} $$
With the help of the Schur complement the matrix inside the determinant can also be written as
$$ \begin{bmatrix} A - \lambda\,I & B \\ C & D \end{bmatrix} = \underbrace{ \begin{bmatrix} I & 0 \\ C \left(A - \lambda\,I\right)^{-1} & I \end{bmatrix} }_{U} \underbrace{ \begin{bmatrix} A - \lambda\,I & 0 \\ 0 & C\left(\lambda\,I - A\right)^{-1}B+D \end{bmatrix} }_{M} \underbrace{ \begin{bmatrix} I & \left(A - \lambda\,I\right)^{-1}B \\ 0 & I \end{bmatrix} }_{V}. $$
For $\lambda$ not an eigenvalue of $A$ the matrices $U$ and $V$ are well defined, square and always full rank (due to the block triangular structure and the identity matrices along the diagonal). Therefore $(1)$ can only hold if $\det(M) = 0$, and because of the block diagonal structure of $M$ we have $\det(M) = \det(A - \lambda\,I)\,\det(C\left(\lambda\,I - A\right)^{-1}B+D)$. Since $\det(A - \lambda\,I)\neq0$ by assumption, $(1)$ is equivalent to $\det(C\left(\lambda\,I - A\right)^{-1}B+D) = 0$.
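The Schur-complement factorization above is easy to verify numerically (a sketch with NumPy; the matrices and the test value of $\lambda$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 2
lam = 0.7 + 0.3j  # arbitrary test value, generically not an eigenvalue of A
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = rng.standard_normal((m, m))

Ainv = np.linalg.inv(A - lam * np.eye(n))  # (A - lambda I)^{-1}
U = np.block([[np.eye(n), np.zeros((n, m))],
              [C @ Ainv, np.eye(m)]])
M = np.block([[A - lam * np.eye(n), np.zeros((n, m))],
              [np.zeros((m, n)), C @ np.linalg.inv(lam * np.eye(n) - A) @ B + D]])
V = np.block([[np.eye(n), Ainv @ B],
              [np.zeros((m, n)), np.eye(m)]])

# U M V should reproduce the block matrix [A - lambda I, B; C, D]
P = np.block([[A - lam * np.eye(n), B], [C, D]])
assert np.allclose(U @ M @ V, P)
```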
As a side note, I haven't really found a justification for why the eigenvalues of $A$ don't also solve $(1)$, even though the structure of $M$ would suggest so. Probably the interaction with the singularities in $U$ and $V$ cancels them out, but I might be wrong. Still, this hopefully illustrates that solving $(1)$ does find the $\lambda$ such that $\det(C\left(\lambda\,I - A\right)^{-1}B+D) = 0$. To answer your question, which only allows $\lambda = j\,\omega$, one should then keep only the solutions of $(1)$ with zero real part.
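The whole procedure can be sketched with SciPy (the system matrices here are arbitrary random stand-ins for a concrete model; `scipy.linalg.eig` returns infinite eigenvalues for the singular right-hand matrix, and those have to be discarded):

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
n, m = 4, 2  # state dimension and (square) input/output dimension, arbitrary
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, n))
D = np.zeros((m, m))

# Generalized eigenvalue problem (1): [A B; C D] v = lambda [I 0; 0 0] v
P = np.block([[A, B], [C, D]])
Q = np.block([[np.eye(n), np.zeros((n, m))],
              [np.zeros((m, n)), np.zeros((m, m))]])

lam = eig(P, Q, right=False)
zeros = lam[np.isfinite(lam)]  # keep only the finite generalized eigenvalues

# Each finite eigenvalue should make det(C (lam I - A)^{-1} B + D) vanish
for z in zeros:
    G = C @ np.linalg.inv(z * np.eye(n) - A) @ B + D
    assert abs(np.linalg.det(G)) < 1e-6
```

For the transmission zeros on the imaginary axis, filter `zeros` further with something like `zeros[np.abs(zeros.real) < tol]`.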