In Priors for Infinite Networks (Neal, 1996), the author considers a simple one-hidden-layer neural network defined by \begin{align}h &= \tanh (a + Ux) \\ f &= b + Vh\end{align} where $f(x)$ is the output.
I'm interested in the term $V(x) = \Bbb E[h_j(x)^2]$. The paper only says it is finite, so that the CLT can be applied. Suppose $x$ is a fixed input and the entries of $a$ and $U$ are i.i.d. zero-mean normal weights, so that $z=a + Ux$ is normally distributed with zero mean. Since $\tanh$ is odd and $z$ is symmetric about zero, $\Bbb E[\tanh z]=0$, and the question becomes: what is $\operatorname{Var}(\tanh z)=\Bbb E[\tanh^2 z]$?
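For concreteness, here is a quick Monte Carlo sketch of the setup (the input $x$, the dimensions, and the unit weight variances are illustrative assumptions, not from the paper): it checks that $z = a + Ux$ is zero-mean normal with variance $\sigma_a^2 + \sigma_u^2\lVert x\rVert^2$, and that $\Bbb E[\tanh z]\approx 0$ while $\Bbb E[\tanh^2 z]$ is finite (it is bounded by $1$, which is all the CLT argument needs).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the paper): a fixed 3-dimensional input,
# unit-variance weights, and many independent draws of (a, U).
x = np.array([0.3, -1.2, 0.7])
sigma_a, sigma_u = 1.0, 1.0
n_samples = 500_000

a = rng.normal(0.0, sigma_a, n_samples)               # bias draws
U = rng.normal(0.0, sigma_u, (n_samples, x.size))     # weight-row draws
z = a + U @ x                                         # one pre-activation per draw

# z ~ N(0, sigma_a^2 + sigma_u^2 * ||x||^2)
sigma2 = sigma_a**2 + sigma_u**2 * np.dot(x, x)
print(z.var(), sigma2)                                # should agree closely

# tanh is odd and z is symmetric, so E[tanh z] = 0 and
# Var(tanh z) = E[tanh^2 z], which is bounded by 1 and hence finite.
print(np.tanh(z).mean(), (np.tanh(z)**2).mean())
```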
Maybe there's a nice integration trick using trig identities?
Related question: Integral (Tanh and Normal)
Let $z\sim{\cal N}(0,\sigma^2)$. We wish to evaluate $$\Bbb E[\tanh^2z]=\int_{-\infty}^\infty\frac{e^{-z^2/2\sigma^2}}{\sigma\sqrt{2\pi}}\tanh^2z\,dz=\frac1\sigma\sqrt{\frac2\pi}\int_0^\infty e^{-z^2/2\sigma^2}\tanh^2z\,dz.$$ Since $\frac d{dz}(z-\tanh z)=\tanh^2z$, integrating by parts gives \begin{align}\int_0^\infty e^{-z^2/2\sigma^2}\tanh^2z\,dz&=\left.e^{-z^2/2\sigma^2}(z-\tanh z)\right\vert_0^\infty+\frac1{\sigma^2}\int_0^\infty ze^{-z^2/2\sigma^2}(z-\tanh z)\,dz\\&=\frac1{\sigma^2}\left(\int_0^\infty z^2e^{-z^2/2\sigma^2}\,dz-\int_0^\infty ze^{-z^2/2\sigma^2}\tanh z\,dz\right),\end{align} the boundary term vanishing because $z-\tanh z$ grows only linearly while the Gaussian factor decays. The first integral is a classic and equals $\sigma^3\sqrt{\pi/2}$. For the second integral, let $u=z^2$ so \begin{align}\int_0^\infty ze^{-z^2/2\sigma^2}\tanh z\,dz&=\frac12\int_0^\infty e^{-u/2\sigma^2}\tanh\sqrt u\,du.\end{align} This is the Laplace transform of $\tanh\sqrt u$, evaluated at $s=1/2\sigma^2$, which has no elementary closed form. Expanding $\tanh\sqrt u=1-2\sum_{n=1}^\infty(-1)^{n-1}e^{-2n\sqrt u}$ and applying ${\cal L}\left[e^{-a\sqrt t}\right](s)=\frac1s-\frac a2\sqrt{\frac\pi{s^3}}\,e^{a^2/4s}\operatorname{erfc}\left(\frac a{2\sqrt s}\right)$ term by term (the divergent constant parts cancel once the series is telescoped), the most we can write is \begin{align}\Bbb E[\tanh^2z]&=1-\frac1{\sigma^3\sqrt{2\pi}}{\cal L}\left[\tanh\sqrt t\right](1/2\sigma^2)=1-2\sum_{n=0}^\infty(-1)^n(f(n+1)-f(n))\end{align} where $f(x)=xe^{2\sigma^2x^2}\operatorname{erfc}(\sigma x\sqrt2)$; note $f(0)=0$.
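The series can be checked numerically against direct quadrature of the original integral. A short sketch: it uses `scipy.special.erfcx` ($\operatorname{erfcx}(y)=e^{y^2}\operatorname{erfc}(y)$) so that $f(x)=x\,\operatorname{erfcx}(\sigma x\sqrt2)$ can be evaluated without overflow, and the alternating series' terms decay like $O(1/n^3)$, so a few hundred terms suffice.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import erfcx  # erfcx(y) = exp(y**2) * erfc(y), overflow-safe

def expected_tanh_sq_quad(sigma):
    """E[tanh^2 z] for z ~ N(0, sigma^2), by direct numerical quadrature."""
    integrand = lambda z: np.exp(-z**2 / (2 * sigma**2)) * np.tanh(z)**2
    val, _ = quad(integrand, 0, np.inf)
    return np.sqrt(2 / np.pi) * val / sigma

def expected_tanh_sq_series(sigma, terms=1000):
    """Same quantity via 1 - 2 * sum_{n>=0} (-1)^n (f(n+1) - f(n))."""
    # f(x) = x * exp(2 sigma^2 x^2) * erfc(sqrt(2) sigma x) = x * erfcx(sqrt(2) sigma x)
    f = lambda x: x * erfcx(np.sqrt(2) * sigma * x)
    n = np.arange(terms)
    return 1 - 2 * np.sum((-1.0)**n * (f(n + 1) - f(n)))

for sigma in (0.5, 1.0, 2.0):
    print(sigma, expected_tanh_sq_quad(sigma), expected_tanh_sq_series(sigma))
```

The two columns agree to quadrature accuracy, and the value approaches $1$ as $\sigma\to\infty$, as expected since $\tanh^2$ saturates at $1$.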