Proving that a Brownian motion on a plane can be reduced to a line (2D to 1D)


I have a Brownian motion on a plane and would like to find the expected time at which it first hits a set of parallel lines, i.e. the hitting time. I understand that I can reduce the problem to a single dimension to simplify the calculation, and that after the reduction I should be left with a one-dimensional Brownian motion, but how can I prove that this reduction is valid?

Reading online, I saw that Brownian motion is invariant under rotations (although I have no proof of this), and that one can possibly use an orthogonal projection (I am not sure whether that is correct or applicable here; it may have been specific to hitting times of circles). Can someone illustrate how this works?
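For intuition (not a proof), here is a minimal simulation sketch of the projection idea. The assumption is that the parallel lines all share a common unit normal vector `n_hat` (chosen arbitrarily below); projecting the 2D increments onto that normal gives increments that are again Gaussian with the same variance, which is the heuristic behind the 1D reduction:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-3
n_steps = 200_000

# Assumed setup: the parallel lines are {x : x . n_hat = const},
# with an arbitrary unit normal n_hat.
theta = 0.7
n_hat = np.array([np.cos(theta), np.sin(theta)])

# 2D Brownian increments: each coordinate is N(0, dt), independent.
dW = rng.normal(0.0, np.sqrt(dt), size=(n_steps, 2))

# Orthogonal projection onto the normal direction: a linear combination
# a1*dW1 + a2*dW2 with a1**2 + a2**2 = 1 is again N(0, dt), so the
# projected process behaves like a standard 1D Brownian motion.
dB = dW @ n_hat

# Empirical check: the variance of the projected increments should be
# close to dt, matching a 1D Brownian motion.
print(np.var(dB))  # close to dt = 1e-3
```

The distance from the walker to any of the lines depends only on the projected coordinate, which is why the hitting time of the line family equals the hitting time of the corresponding level by the projected 1D process.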