Let $(U,V)$ be a pair of random variables. I am interested in the following quantity: \begin{align} \mathsf{E}[ \log ( \mathsf{P}[U=a \mid V] ) \mid U=a] \end{align} where $a$ is some fixed constant. For simplicity, assume $U$ is a discrete random variable.
Since $\log$ is concave, Jensen's inequality gives \begin{align} \mathsf{E}[ \log ( \mathsf{P}[U=a \mid V] ) \mid U=a] &\le \log ( \mathsf{E}\left[ \mathsf{P}[U=a \mid V] \mid U=a\right] ) \\ &=\log ( \mathsf{P}\left[U=a \mid U=a\right] )\\ &=\log ( 1 )=0 \end{align}
Using Jensen's inequality we can also arrive at the following lower bound \begin{align} \mathsf{E}[ \log ( \mathsf{P}[U=a \mid V] ) \mid U=a] &= \mathsf{E}[ \log ( \mathsf{E}[1_{ \{U=a \}} \mid V] ) \mid U=a]\\ &\ge \mathsf{E}[ \mathsf{E}[ \log ( 1_{ \{U=a \}} ) \mid V]\mid U=a]\\ &= \mathsf{E}[ \log ( 1_{ \{U=a \}}) \mid U=a]\\ &=0. \end{align}
Using the two bounds we conclude that \begin{align} \mathsf{E}[ \log ( \mathsf{P}[U=a \mid V] ) \mid U=a]=0. \end{align}
However, this conclusion cannot be right in general, so one of the steps above must be flawed, but I cannot find the error.
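A quick numerical sanity check supports my suspicion. The following Python sketch (using a hypothetical joint pmf for binary $U$, $V$) evaluates the quantity by direct enumeration and gets a strictly negative value:

```python
import math

# Hypothetical joint pmf p(u, v) for binary U, V (illustrative numbers only)
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
a = 0

p_U = p[(a, 0)] + p[(a, 1)]                       # P(U = a)
p_V = {v: p[(0, v)] + p[(1, v)] for v in (0, 1)}  # marginal pmf of V

# E[log P(U=a | V) | U=a] = sum_v P(V=v | U=a) * log P(U=a | V=v)
val = sum((p[(a, v)] / p_U) * math.log(p[(a, v)] / p_V[v]) for v in (0, 1))
print(val)  # strictly negative, so the two "bounds" cannot both be right
```

With these numbers the value is $0.8\log 0.8 + 0.2\log 0.2 \approx -0.500$, not $0$.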
Trace through your argument under the simplifying assumption that $V$ is a constant random variable. You'll find that the following two assertions are false:
$$\log(\mathsf{E}\left[ \mathsf{P}[U=a \mid V] \mid U=a\right] ) =\log ( \mathsf{P}\left[U=a \mid U=a\right] )\tag1$$ fails since the LHS equals $\log \mathsf{P}[U=a]$ when $V$ is constant, which differs from the RHS, $\log(1)=0$, whenever $\mathsf{P}[U=a]<1$; and $$\mathsf{E}[ \mathsf{E}[ \log ( 1_{ \{U=a \}} ) \mid V]\mid U=a] = \mathsf{E}[ \log ( 1_{ \{U=a \}}) \mid U=a]\tag2$$ is also false, since the LHS equals $-\infty$ when $V$ is constant and $0<\mathsf{P}[U=a]<1$, while the RHS equals $0$.
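To see the failure of $(1)$ concretely with a non-constant $V$ as well, here is a short Python sketch (the joint pmf is a hypothetical choice) computing the inner expectation $\mathsf{E}[\mathsf{P}[U=a\mid V]\mid U=a]$ directly:

```python
import math

# Hypothetical joint pmf p(u, v) for binary U and V (illustrative numbers)
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
a = 0

p_U = p[(a, 0)] + p[(a, 1)]                       # P(U = a)
p_V = {v: p[(0, v)] + p[(1, v)] for v in (0, 1)}  # marginal pmf of V

# E[P(U=a | V) | U=a] = sum_v P(V=v | U=a) * P(U=a | V=v)
inner = sum((p[(a, v)] / p_U) * (p[(a, v)] / p_V[v]) for v in (0, 1))
print(inner)            # ~0.68, not P(U=a | U=a) = 1
print(math.log(inner))  # a valid Jensen upper bound, but strictly below 0
```

So in this example the legitimate Jensen step yields the upper bound $\log(0.68)\approx-0.386$, not $\log(1)=0$.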