How do I solve something like:
$$f(x) = \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^\infty e^{\frac{-(x - y/2)^2}{2}}f(y)\:\mathrm{d}y$$
for $f(x)$?
Is there also a general formula that this falls under? The closest thing I found was the Fredholm integral equation, but I'm not sure its standard solution techniques apply here.
In case it helps, here's my motivation for this problem: I'm trying to find the stationary distribution of a continuous-state, discrete-time Markov process. I came up with this transition density:
$$P(s_i = x | s_{i-1} = y) = p(x,y) = \frac{1}{\sqrt{2 \pi}} e^{\frac{-(x - y/2)^2}{2}}$$
The idea is that the transition always pulls the state back toward $0$: the next state is Normal with mean $\mu = y/2$ and unit variance. I then tried to solve for the stationary distribution as follows:
$$\pi_x = \int_{-\infty}^{\infty}p(x, y)\pi_y \:\mathrm{d}y$$
At this point, I'm stuck. (I got the original equation by setting $\pi_x = f(x)$ for clarity.) This might deserve its own question, but is there a standard technique for solving for stationary distributions?
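In case it's useful, here's how I've been exploring the chain numerically (a minimal sketch assuming NumPy; the simulation sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Run many independent copies of the chain s_i ~ Normal(s_{i-1}/2, 1)
# and look at the empirical distribution after a burn-in.
n_chains, n_steps = 100_000, 50
x = rng.normal(size=n_chains)  # arbitrary initial distribution
for _ in range(n_steps):
    x = x / 2 + rng.normal(size=n_chains)

# The empirical mean stays near 0 and the variance appears to settle
# at a fixed value, so a stationary distribution seems to exist.
print(x.mean(), x.var())
```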
First of all, notice that your equation is linear and homogeneous, hence it either has only the zero solution, or infinitely many solutions which you can obtain by scaling and taking linear combinations. Of course, in your case you are presumably only interested in probability distributions; still, the equation has the form $f = Af$ where $A$ is not contractive (and in fact has an eigenvalue of $1$), so I'm not sure Fredholm theory offers general methods for such problems, unless you know how to incorporate the normalization $\int f = 1$ in a smart way.
This issue arises in every problem of finding a stationary distribution, so I would suggest looking for probabilistic methods rather than methods from analysis. For example, your Markov chain can be represented as $$ x_{n+1} = \frac12 x_n + \xi_n $$ where the $\xi_n$ are iid standard Gaussian random variables. Now, can you express $x_{n+1}$ in terms of $x_0$ and $\xi_0,\dots,\xi_n$? What is the limit as $n\to\infty$, and does the distribution of $x_0$ matter? Another idea: whenever you have linear Gaussian dynamics, try taking $f$ Gaussian with unknown parameters and check whether you can choose the parameters so that the equation holds.
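For completeness, here is a sketch of where the hint leads (nothing beyond the recursion above is used). Iterating $x_{n+1} = \frac12 x_n + \xi_n$ gives
$$x_{n+1} = \frac{1}{2^{n+1}}\,x_0 + \sum_{k=0}^{n} \frac{1}{2^{n-k}}\,\xi_k.$$
The first term vanishes as $n\to\infty$ whatever $x_0$ is, while the sum is Gaussian with mean $0$ and variance $\sum_{j=0}^{n} 4^{-j} \to \frac{1}{1 - 1/4} = \frac43$. Hence $x_n$ converges in distribution to $\mathcal N(0, 4/3)$ regardless of the initial distribution, and one can check invariance directly: if $y \sim \mathcal N(0, 4/3)$, then $\frac12 y + \xi$ is Gaussian with mean $0$ and variance $\frac13 + 1 = \frac43$ again.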
Update: to address your last question, everything depends on the source of the problem. If your integral equation comes from a probabilistic problem (as in the OP's case), you can do at least two things:
Forget the probabilistic underpinning and just try to solve the equation by general methods. As in the theory of Fredholm equations, you could try a Laplace/Fourier transform to get a nicer equation and solve that. However, as I said, one drawback of this approach is that an invariant-distribution equation always has infinitely many solutions, which makes perfect sense from the probabilistic perspective, but I'm not sure general methods for integral equations are fine-tuned to this setting.
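To illustrate the transform idea on the OP's equation (a sketch): the Fourier transform of a distribution is its characteristic function $\varphi(t) = \mathbb{E}\, e^{itx}$, and stationarity of $x_{n+1} = \frac12 x_n + \xi_n$ translates into
$$\varphi(t) = \varphi(t/2)\, e^{-t^2/2}.$$
Iterating $n$ times gives $\varphi(t) = \varphi(t/2^n)\exp\bigl(-\tfrac{t^2}{2}\bigl(1 + \tfrac14 + \dots + \tfrac{1}{4^{n-1}}\bigr)\bigr)$, and letting $n \to \infty$ (using $\varphi(t/2^n) \to \varphi(0) = 1$ by continuity) yields $\varphi(t) = e^{-2t^2/3}$, the characteristic function of $\mathcal N(0, 4/3)$. Note how the condition $\varphi(0) = 1$ singles out the solution normalized to be a probability distribution.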
Try to solve it using probabilistic methods: not only is this more "natural", so that you have intuition for what's going on at each step, but, as you may have noticed, such a solution can be much faster and simpler (compare with the one outlined in DanielV's answer).
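For instance, the Gaussian guess can be checked directly against the stationarity equation by numerical quadrature (a quick sketch assuming SciPy; $\mathcal N(0, 4/3)$ is the candidate produced by the probabilistic argument above):

```python
import numpy as np
from scipy import integrate, stats

# Candidate stationary density: N(0, 4/3), the distribution suggested
# by unrolling x_{n+1} = x_n / 2 + xi_n.
f = stats.norm(loc=0, scale=np.sqrt(4 / 3)).pdf

def rhs(x):
    # (Af)(x) = \int p(x, y) f(y) dy, where p(x, y) is the N(y/2, 1)
    # density evaluated at x, as in the OP's transition density.
    val, _ = integrate.quad(
        lambda y: stats.norm.pdf(x, loc=y / 2, scale=1) * f(y),
        -np.inf, np.inf)
    return val

for x in (-2.0, 0.0, 0.7, 3.0):
    assert abs(rhs(x) - f(x)) < 1e-7  # f is (numerically) a fixed point
print("N(0, 4/3) is invariant for p(x, y)")
```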
Note that you can always try to reduce a linear integral equation $f = \int Kf$ to the problem of finding an invariant distribution of some Markov chain, but there is no guarantee such a chain exists: $K$ must be a stochastic kernel. For example, for the kernel $xy/2$ you would first have to normalize it to integrate to $1$. In your original case everything worked out because the source of the problem was probabilistic. Finally, if your question is how to obtain a dynamic representation via iid $\xi$ given a stochastic kernel: such a representation always exists, but the proof of this fact is not constructive, so in general you just have to look for one.