I would appreciate some hints or guidance towards solving the following exercise:
Let $\left\{ S\left(j\right)\thinspace:\thinspace j=0,1,\ldots\right\}$ be a simple random walk on the integers started at $S\left(0\right)=0$. Show that as $n\uparrow\infty$ :$$\frac{1}{n^{2}}\min\left\{ j\thinspace:\thinspace\left|S\left(j\right)\right|=n\right\} \overset{d}{\longrightarrow}\min\left\{ t\geq0\thinspace:\thinspace\left|B\left(t\right)\right|=1\right\}$$ where $\left\{ B\left(t\right)\thinspace:\thinspace t\geq0\right\}$ is a $1$-dimensional Brownian motion.
Basically I have no good ideas on where to start, so I would really appreciate some solid pointers or, even better, a guide towards a solution. I'm willing to offer a bounty should that help.
I will follow the outline that I described in my first comment. Let's start with the random walk. Fix $\alpha \in \mathbf{R}$ and consider the process $(M_j)$ defined by $$M_j = (\cosh{\alpha})^{-j}\cosh{(\alpha S_j)}$$ We want to show that $(M_j)$ is a martingale with respect to the natural filtration $(\mathcal{F}_j)$ generated by the steps of the walk, i.e. $\mathcal{F}_j = \sigma(X_1,\ldots, X_j)$ and $S_j = X_1 + \cdots + X_j$, where $X_1, X_2, \ldots$ is an IID sequence with $P\{X_i = 1\} = P\{X_i = -1\} = \frac{1}{2}$, and $S_0 = 0$. Using that $S_{j-1}$ is $\mathcal{F}_{j-1}$-measurable and $X_j$ is independent of $\mathcal{F}_{j-1}$: \begin{align}E[M_j\mid \mathcal{F}_{j-1}] =& (\cosh{\alpha})^{-j}E[\cosh{(\alpha S_j)}\mid \mathcal{F}_{j-1}]\\ =& \frac{1}{2}(\cosh{\alpha})^{-j}E[\exp{(\alpha S_j)}+\exp{(-\alpha S_j)}\mid \mathcal{F}_{j-1}]\\ =&\frac{1}{2}(\cosh{\alpha})^{-j}E[\exp{(\alpha S_{j-1} + \alpha X_j)}+\exp{(-\alpha S_{j-1} -\alpha X_j)}\mid \mathcal{F}_{j-1}]\\ =&\frac{1}{2}(\cosh{\alpha})^{-j}\left(\exp{(\alpha S_{j-1})} E[\exp{(\alpha X_j)}]+\exp{(-\alpha S_{j-1})} E[\exp{(-\alpha X_j)}]\right)\\ =&\frac{1}{2}(\cosh{\alpha})^{-j}\left(\exp{(\alpha S_{j-1})}\cosh{\alpha}+\exp{(-\alpha S_{j-1})}\cosh{\alpha}\right)\\ =&(\cosh{\alpha})^{-(j-1)}\cosh{(\alpha S_{j-1})} = M_{j-1} \end{align}
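As a quick sanity check (not part of the proof), the martingale property forces $E[M_j] = E[M_0] = 1$ for every $j$, which can be verified by Monte Carlo simulation; the parameter values below are arbitrary:

```python
import math
import random

def mean_M(alpha, steps, trials, seed=0):
    """Monte Carlo estimate of E[M_j] for M_j = cosh(alpha)^(-j) * cosh(alpha * S_j),
    where S_j is a simple random walk with P(X_i = +1) = P(X_i = -1) = 1/2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(1 if rng.random() < 0.5 else -1 for _ in range(steps))
        total += math.cosh(alpha * s) / math.cosh(alpha) ** steps
    return total / trials

# The martingale property implies E[M_j] = E[M_0] = 1 for every j,
# so this estimate should be close to 1.
print(mean_M(alpha=0.3, steps=25, trials=100_000))
```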
We want to stop this process at the hitting time $\tau_n:=\inf{\{j: \lvert S_j\rvert = n\}}$, which is a stopping time; it is moreover almost surely finite for each $n$. I will not prove these facts here. Fix $n$. Doob's optional stopping theorem gives $$E[M_{\tau_n\wedge j}] = E[M_0] = 1$$ Since $\tau_n$ is a.s. finite, $M_{\tau_n\wedge j} \rightarrow M_{\tau_n}$ a.s. as $j\rightarrow\infty$. Furthermore, $0 \leq M_{\tau_n\wedge j} \leq \cosh{(\alpha n)}$ (take $\alpha \geq 0$; note $\lvert S_{\tau_n\wedge j}\rvert \leq n$ and $\cosh{\alpha}\geq 1$). Since this bound is a constant, hence integrable, the dominated convergence theorem applies and gives $$E[M_{\tau_n}] = 1$$ Since $\tau_n$ is a.s. finite, $\lvert S_{\tau_n}\rvert = n$. This implies $M_{\tau_n} = (\cosh{\alpha})^{-\tau_n}\cosh{(\alpha S_{\tau_n})} = (\cosh{\alpha})^{-\tau_n}\cosh{(\alpha n)}$ because the hyperbolic cosine is an even function. Hence $$E\left[\left(\frac{1}{\cosh{\alpha}}\right)^{\tau_n}\right] = \frac{1}{\cosh{(\alpha n)}}$$ Now, given $\lambda \geq 0$, choose $\alpha \geq 0$ such that $\exp{(-\frac{\lambda}{n^2})} = \frac{1}{\cosh{\alpha}}$, i.e. $\alpha = \cosh^{-1}{(e^{\frac{\lambda}{n^2}})}$. Then we get $$E[e^{-\lambda\frac{\tau_n}{n^2}}] = \frac{1}{\cosh{(n\cosh^{-1}{(e^{\frac{\lambda}{n^2}})})}}$$
For the inverse hyperbolic cosine it doesn't matter which of the two values (positive or negative) you take, since it sits inside another, even, hyperbolic cosine function. So we have found the Laplace transform (the moment generating function evaluated at $-\lambda$) of $\frac{\tau_n}{n^2}$. I will cheat here a bit and use an online tool to compute the limit of $\frac{1}{\cosh{(n\cosh^{-1}{(e^{\frac{\lambda}{n^2}})})}}$ as $n\rightarrow\infty$. The answer is $$\lim_{n\rightarrow\infty}\frac{1}{\cosh{(n\cosh^{-1}{(e^{\frac{\lambda}{n^2}})})}} = \frac{1}{\cosh{(\sqrt{2\lambda})}}$$
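This convergence can also be checked numerically; a small sketch (the choice $\lambda = 1$ is arbitrary):

```python
import math

def phi_n(lam, n):
    """Exact value of E[exp(-lam * tau_n / n^2)] from the optional stopping
    identity: 1 / cosh(n * acosh(exp(lam / n^2)))."""
    return 1.0 / math.cosh(n * math.acosh(math.exp(lam / n ** 2)))

def phi_limit(lam):
    """The claimed limit 1 / cosh(sqrt(2 * lam))."""
    return 1.0 / math.cosh(math.sqrt(2.0 * lam))

# phi_n(1.0, n) should approach phi_limit(1.0) as n grows.
for n in (10, 100, 1000):
    print(n, phi_n(1.0, n), phi_limit(1.0))
```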
So next we look at the hitting time for Brownian motion. The procedure is exactly the same. Start with the process $X_t = \cosh{(\alpha B_t)}\exp{(-\alpha^2\frac{t}{2})}$, which is a martingale, and go through the exact same steps as above: optional stopping at $\sigma := \inf{\{t\geq 0: \lvert B_t \rvert = 1 \}}$ gives $E[X_\sigma] = 1$, and since $\lvert B_\sigma\rvert = 1$ and the hyperbolic cosine is even, $X_\sigma = \cosh{(\alpha)}\exp{(-\alpha^2\frac{\sigma}{2})}$, so $E[e^{-\alpha^2\frac{\sigma}{2}}] = \frac{1}{\cosh{\alpha}}$. Substituting $\lambda = \frac{\alpha^2}{2}$, i.e. $\alpha = \sqrt{2\lambda}$, we obtain $$E[e^{-\lambda\sigma}] = \frac{1}{\cosh{(\sqrt{2\lambda})}}$$
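This Laplace transform can be sanity-checked by simulation, approximating $\sigma$ by the scaled walk hitting time $\tau_n/n^2$ for a moderate $n$; the parameter values below are arbitrary:

```python
import math
import random

def mc_laplace_sigma(lam, n=30, trials=5_000, seed=1):
    """Monte Carlo estimate of E[exp(-lam * sigma)] using the approximation
    sigma ~ tau_n / n^2, where tau_n is the first j with |S_j| = n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s, j = 0, 0
        while abs(s) < n:
            s += 1 if rng.random() < 0.5 else -1
            j += 1
        total += math.exp(-lam * j / n ** 2)
    return total / trials

# For lam = 1 the estimate should be close to 1 / cosh(sqrt(2)) ≈ 0.459.
print(mc_laplace_sigma(1.0))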
So the Laplace transform of the scaled hitting time of the random walk converges pointwise on $[0,\infty)$ to the Laplace transform of the hitting time of Brownian motion. Since the limit function is itself the Laplace transform of a random variable (in particular, it is continuous at $\lambda = 0$), the continuity theorem for Laplace transforms yields convergence in distribution. This completes the proof.
P.S. For the computation of the limit I made use of the following:
https://www.wolframalpha.com/input/?i=limit+1%2Fcosh%28n*sqrt%28lambda%29*arccosh%28exp%281%2Fn^2%29%29%29+as+n-%3Einfinity
P.P.S. Here is a computation of the limit. It is enough to show $$\lim_{x\rightarrow\infty}x\cosh^{-1}{(\exp{(\frac{\lambda}{x^2})})} = \sqrt{2\lambda}$$
First substitute $u = \frac{\lambda}{x^2}$. Then, the above limit is equal to
$$\lim_{u\rightarrow 0}\sqrt{\frac{\lambda}{u}}\cosh^{-1}{(e^u)}$$ L'Hopital's rule applies here, since both $\cosh^{-1}{(e^u)}$ and $\sqrt{u}$ tend to $0$ as $u\rightarrow 0^+$. Differentiating both the numerator and the denominator with respect to $u$ (and doing some algebraic manipulations) we get $$\lim_{u\rightarrow 0}\sqrt{\frac{\lambda}{u}}\cosh^{-1}{(e^u)} = 2\sqrt{\lambda}\lim_{u\rightarrow 0}\sqrt{u + \frac{u}{e^{2u}-1}} = 2\sqrt{\lambda}\cdot\frac{1}{\sqrt{2}} = \sqrt{2\lambda}$$
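The two expressions in this computation can be compared numerically; a quick sketch with an arbitrary $\lambda = 3$, where both should approach $\sqrt{2\lambda} = \sqrt{6}$:

```python
import math

def before_lhopital(lam, u):
    """sqrt(lam / u) * acosh(exp(u)): the expression before l'Hopital."""
    return math.sqrt(lam / u) * math.acosh(math.exp(u))

def after_lhopital(lam, u):
    """2 * sqrt(lam) * sqrt(u + u / (e^(2u) - 1)): the derivative-ratio form.
    expm1(2*u) computes e^(2u) - 1 accurately for small u."""
    return 2.0 * math.sqrt(lam) * math.sqrt(u + u / math.expm1(2.0 * u))

# Both expressions should converge to sqrt(2 * lam) as u -> 0+.
for u in (1e-2, 1e-4, 1e-6):
    print(u, before_lhopital(3.0, u), after_lhopital(3.0, u), math.sqrt(6.0))
```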