Let $N(y|x,\delta t)$ be the normal distribution pdf with mean $x$ and variance $\delta t$, where $\delta t$ should be understood as arbitrarily small. I'm seeking to approximate the following pdf:
$$ f(y;x,\delta t) = N(y|x,\delta t) e^{-\delta t/2} \frac{\cosh(y)}{\cosh(x)} $$
by another normal pdf $N(y|\mu,\sigma^2)$, where the approximation should be exact to order $\delta t$, and $\mu$ and $\sigma^2$ should be functions of $x$ and $\delta t$ only. One can verify that $f$ is indeed a properly normalized pdf, and if one plots it for arbitrary values of $x$ and small enough $\delta t$, it indeed looks very much like $N(y|x,\delta t)$, only slightly shifted. I don't know how to proceed to obtain the analytical expression of the approximation I'm seeking.
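As a quick numerical sanity check of the normalization (a Python sketch; the values of $x$ and $\delta t$ are arbitrary illustrative choices):

```python
import math

def f(y, x, dt):
    """Target density: N(y|x, dt) * exp(-dt/2) * cosh(y) / cosh(x)."""
    gauss = math.exp(-(y - x) ** 2 / (2 * dt)) / math.sqrt(2 * math.pi * dt)
    return gauss * math.exp(-dt / 2) * math.cosh(y) / math.cosh(x)

# Trapezoidal quadrature over a window many standard deviations wide;
# f decays like a Gaussian, so the truncation error is negligible.
x, dt = 1.0, 1e-3
lo, hi, n = x - 1.0, x + 1.0, 20000
h = (hi - lo) / n
total = 0.5 * (f(lo, x, dt) + f(hi, x, dt))
total += sum(f(lo + i * h, x, dt) for i in range(1, n))
total *= h
print(total)  # ≈ 1.0
```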
I've found the answer. The key idea is the Laplace approximation: take the Taylor expansion of $L \equiv \log(f)$ around the maximum of $f$ (I'll call the point at which the maximum is attained $y_0$, and the value of the maximum $f(y_0)$). Keeping the expansion up to second order, one is effectively approximating $f$ by a normal distribution with mean $\mu=y_0$ and variance
$$ \sigma^2 = \Biggl[-\frac{\partial^2 L}{\partial y^2}\bigg\rvert_{y_0} \Biggr]^{-1}$$
To see this, write the Taylor expansion of $L$ around $y_0$ up to second order
$$ L(y) \sim L(y_0) + \frac{\partial L}{\partial y}\bigg\rvert_{y_0}(y-y_0) + \frac{1}{2}\frac{\partial^2 L}{\partial y^2}\bigg\rvert_{y_0}(y-y_0)^2 $$
The first-derivative term vanishes because $y_0$ is a maximum of $L$. Taking that into account and exponentiating to undo the logarithm, we get
$$ f(y) \sim f(y_0) \exp\Bigl( \frac{1}{2}\frac{\partial^2 L}{\partial y^2}\bigg\rvert_{y_0}(y-y_0)^2 \Bigr)$$
The factor $f(y_0)$ is a constant: it doesn't affect the shape of $f$, only its normalization. Comparing the expression above with the normal pdf
$$ N(y|\mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Bigl( -\frac{(y-\mu)^2}{2\sigma^2} \Bigr)$$
we can immediately read off the expressions for $\mu$ and $\sigma^2$ given above.
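This recipe can be sketched numerically: find the maximizer of $L$ by Newton's method and invert the negative curvature. A minimal illustration (the helper `laplace_normal`, the finite-difference step `h`, and the test density are my own choices, not part of the derivation):

```python
import math

def laplace_normal(logf, y_init, h=1e-5, iters=50):
    """Laplace approximation: Newton-iterate to the mode of exp(logf),
    then set sigma^2 = [-L''(y0)]^{-1}, using central finite differences."""
    y = y_init
    for _ in range(iters):
        d1 = (logf(y + h) - logf(y - h)) / (2 * h)
        d2 = (logf(y + h) - 2 * logf(y) + logf(y - h)) / h ** 2
        y -= d1 / d2  # Newton step toward the stationary point
    return y, -1.0 / d2

# Sanity check on a case where the answer is known exactly: the
# log-density of N(3, 0.25) (Laplace is exact for Gaussians).
mu, var = laplace_normal(lambda y: -(y - 3.0) ** 2 / (2 * 0.25), y_init=2.0)
print(mu, var)  # ≈ 3.0, 0.25
```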
Returning to our concrete example, we first need to find the maximum of $f$, which we do by finding the point at which $\frac{\partial L}{\partial y}=0$. Let us first write $L$
$$ L = -\frac{(y-x)^2}{2\delta t} + \log(\cosh(y)) -\frac{1}{2}\log(2\pi\delta t) -\frac{\delta t}{2} - \log(\cosh(x)) $$
$$ \frac{\partial L}{\partial y} = -\frac{(y-x)}{\delta t} + \tanh(y) $$
$$ \frac{(y_0-x)}{\delta t} - \tanh(y_0) = 0 $$
This is a fixed-point equation for $y_0$:
$$ y_0 = x + \delta t \tanh(y_0) $$
Since $y_0 = x + O(\delta t)$, we have $\tanh(y_0) = \tanh(x) + O(\delta t)$, and substituting back gives, to first order in $\delta t$,
$$ y_0 = x + \delta t \tanh(x) $$
(For small $x$, where $\tanh(x) \approx x$, this reduces to $y_0 \approx x(1+\delta t)$.)
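The stationarity condition itself can be checked numerically, independently of any expansion (a Python sketch; the values of $x$ and $\delta t$ are arbitrary illustrative choices):

```python
import math

x, dt = 1.5, 1e-3
# Fixed-point iteration of the stationarity condition
# (y0 - x)/dt = tanh(y0), i.e. y0 = x + dt*tanh(y0); the map is a
# contraction for small dt, so it converges in a few iterations.
y0 = x
for _ in range(50):
    y0 = x + dt * math.tanh(y0)

# dL/dy = -(y - x)/dt + tanh(y) should vanish at y0,
# and the mode should sit within O(dt) of x.
dL = -(y0 - x) / dt + math.tanh(y0)
print(y0, dL)  # dL ≈ 0
```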
Now, to calculate the variance, the second derivative is
$$ \frac{\partial^2 L}{\partial y^2} = -\frac{1}{\delta t} + \mathrm{sech}^2(y) $$
Therefore, using $\mathrm{sech}^2(y_0) = \mathrm{sech}^2(x) + O(\delta t)$,
$$ \sigma^2 = \Biggl[-\frac{\partial^2 L}{\partial y^2}\bigg\rvert_{y_0} \Biggr]^{-1} = \frac{\delta t}{1 - \delta t\,\mathrm{sech}^2(y_0)} = \delta t + \delta t^2\, \mathrm{sech}^2(x) + O(\delta t^3) $$
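A quick numerical check of this expansion (illustrative values of $x$ and $\delta t$; the mode $y_0$ is taken from the fixed-point iteration of the stationarity condition rather than any closed form):

```python
import math

x, dt = 1.0, 1e-3
# Mode from the stationarity condition y0 = x + dt*tanh(y0).
y0 = x
for _ in range(50):
    y0 = x + dt * math.tanh(y0)

c = 1.0 / math.cosh(y0) ** 2   # sech^2(y0)
exact = dt / (1.0 - dt * c)    # exact inverse curvature [-L'']^{-1}
series = dt + dt ** 2 * c      # two-term geometric-series expansion
print(exact, series)           # agree to O(dt^3)
```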
Keeping the expressions to order $\delta t$, we have found
$$ \begin{align} \mu &= y_0 = x + \delta t \tanh(x) \\ \sigma^2 &= \delta t \end{align} $$
And to summarize
$$ f(y;x,\delta t) = N(y|x,\delta t)\, e^{-\delta t/2}\, \frac{\cosh(y)}{\cosh(x)} \approx N\bigl(y\,\big|\,x+\delta t \tanh(x),\,\delta t\bigr) $$
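Finally, one can verify numerically that the approximation improves as $\delta t \to 0$ (a Python sketch; the grid of evaluation points and the values of $x$ and $\delta t$ are illustrative choices, and the mode is computed from the stationarity condition):

```python
import math

def f(y, x, dt):
    """Target density: N(y|x, dt) * exp(-dt/2) * cosh(y) / cosh(x)."""
    gauss = math.exp(-(y - x) ** 2 / (2 * dt)) / math.sqrt(2 * math.pi * dt)
    return gauss * math.exp(-dt / 2) * math.cosh(y) / math.cosh(x)

def normal_pdf(y, mu, var):
    return math.exp(-(y - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

x = 1.0
errs = []
for dt in (1e-2, 1e-3, 1e-4):
    # Mode from the stationarity condition, by fixed-point iteration.
    mu = x
    for _ in range(50):
        mu = x + dt * math.tanh(mu)
    # Peak-normalized max error on a grid of +/- 2 standard deviations.
    peak = normal_pdf(mu, mu, dt)
    err = max(abs(f(mu + s * math.sqrt(dt), x, dt)
                  - normal_pdf(mu + s * math.sqrt(dt), mu, dt)) / peak
              for s in (-2.0, -1.0, 0.0, 1.0, 2.0))
    errs.append(err)
print(errs)  # errors shrink roughly linearly with dt
```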