Here is a cute problem; I am looking for a straightforward proof.
Suppose you are in a car driving on a one-way street, and its speed is a random function of time. You are given the probability distribution of the car's speed at any given time; this distribution is stationary in time and independent of position.
Now consider an observer sitting at a fixed position on the road, measuring your speed at the moment you cross that position. What is the probability distribution of your speed as measured by this observer?
If the probability density function of the speed at a given time is $f(v)$, with average value $\bar v\equiv \int_0^\infty v f(v)\, dv$, the answer (the probability density function $g(v)$ of the speed as measured by the observer at a fixed position) is $$ g(v) = \frac{v}{\bar v} f(v). $$ In other words, the distribution seen by the observer is skewed toward faster speeds: each speed $v$ is reweighted by the factor $v/\bar v$.
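Before looking for a proof, one can check the claim numerically. Below is a minimal Monte Carlo sketch (not from the question itself): it assumes one particularly simple stationary process, a speed that is redrawn i.i.d. from $f$ at the start of each unit time step and held constant in between, and it records the speed of the step during which the car crosses a fixed position. The helper `crossing_speed` and the choice $f(v)=e^{-v}$ (so $\bar v = 1$ and the predicted observer mean is $\int_0^\infty v^2 e^{-v}\,dv / \bar v = 2$) are mine, purely for illustration.

```python
import random

def crossing_speed(x_obs, draw_speed, dt=1.0):
    """Speed of the car during the time step in which it crosses x_obs.

    The car's speed is redrawn i.i.d. every dt seconds and held constant
    in between (a simple example of a stationary speed process).
    """
    x = 0.0
    while True:
        v = draw_speed()
        if x + v * dt >= x_obs:   # this step carries the car past x_obs
            return v
        x += v * dt

random.seed(0)
draw = lambda: random.expovariate(1.0)   # f(v) = e^{-v}, time-average speed 1

# Many independent trips past a fixed, distant observer.
samples = [crossing_speed(200.0, draw) for _ in range(20000)]
mean_obs = sum(samples) / len(samples)

# Length-biased prediction: observer's mean = E[v^2]/E[v] = 2/1 = 2,
# twice the time-average speed of 1.
print(mean_obs)
```

The observer's empirical mean speed comes out close to $2$, not $1$, exactly as the reweighting $g(v) = v f(v)/\bar v$ predicts for the exponential distribution.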
Since the solution looks simple and general, I think there should be a straightforward derivation. I am looking for one that is both clear and simple (accessible to someone who understands probability but has not taken a formal measure-theoretic course; think of scientists and engineers).
Edit: Let me clarify some things that came up in the comments. For the application I have in mind, the car's speed is a continuous-time stationary stochastic process (I'm hoping this detail isn't needed for the proof, but just in case). The speed can be a continuous or discontinuous function of time. The position is not random, and the statement of the problem holds for any fixed position. Since, as Henry mentioned in the comments, this really looks like it should be a simple change of variables, I am looking for a derivation that is both straightforward and general.
You are more likely to be at a spot where the car is fast, since the car covers longer stretches of road while it is moving faster (assuming your location is chosen uniformly along the route).
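This intuition can be turned into a one-line heuristic computation (a sketch, not a measure-theoretic proof). Over a long time window of length $T$, the time spent at speeds in $(v, v+dv)$ is approximately $T f(v)\,dv$, so the distance covered at those speeds is $v\,T f(v)\,dv$. A fixed position on the road falls in a stretch traversed at speed near $v$ with probability proportional to the length of road covered at that speed: $$ g(v)\,dv \;=\; \frac{v\,T f(v)\,dv}{\int_0^\infty u\,T f(u)\,du} \;=\; \frac{v}{\bar v}\, f(v)\,dv. $$ The factor $T$ cancels, which is why the answer does not depend on the window or on the particular position chosen.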