I am really stuck on this question:
Let $\{S_t\}$ and $\{S'_t\}$ be two stochastic processes, satisfying \begin{equation} dS_t = S_t ( \sigma_t \,dB_t + r_t \,dt), \quad dS'_t = S'_t (\sigma'_t \,dB_t + r_t \,dt), \end{equation} where $\sigma$, $\sigma'$ and $r$ are adapted processes, and $B$ is a common standard Brownian motion.
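For reference, applying Itô's formula to $\log S_t$ (assuming enough integrability for the stochastic integrals to be defined) gives the explicit solution \begin{equation} S_t = S_0 \exp\left( \int_0^t \sigma_s \,dB_s + \int_0^t \Big( r_s - \tfrac{1}{2}\sigma_s^2 \Big) \,ds \right), \end{equation} and similarly for $S'_t$ with $\sigma'$ in place of $\sigma$.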
$1.$ Assuming that the processes $\sigma$ and $\sigma'$ are both constant, prove that for each $t>0$, $S_t$ and $S'_t$ are perfectly correlated (correlation $=1$) if and only if $\sigma = \sigma'$.
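For part $1$, here is the moment computation I would start from, assuming for this step that $r$ is deterministic, so that the multiplicative factors $S_0 e^{\int_0^t r_s\,ds}$ and $S'_0 e^{\int_0^t r_s\,ds}$ cancel from the correlation. The lognormal moments $\mathbb{E}[e^{\lambda B_t}] = e^{\lambda^2 t/2}$ give \begin{equation} \operatorname{Corr}(S_t, S'_t) = \frac{e^{\sigma \sigma' t} - 1}{\sqrt{\left( e^{\sigma^2 t} - 1 \right) \left( e^{(\sigma')^2 t} - 1 \right)}}, \end{equation} so the question would reduce to showing that this ratio equals $1$ exactly when $\sigma = \sigma'$; is that the intended route?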
$2.$ Assuming that $r=0$ and that $\sigma$ and $\sigma'$ are bounded, prove that if $S_0 = S'_0>0$ and $S_1$ and $S'_1$ are perfectly correlated, then $\int_{0}^{1} (\sigma_t - \sigma'_t )^2 \,dt =0$ a.s.
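For part $2$, with $r = 0$ the explicit solution above specializes to the stochastic exponentials \begin{equation} S_1 = S_0 \exp\left( \int_0^1 \sigma_t \,dB_t - \tfrac{1}{2}\int_0^1 \sigma_t^2 \,dt \right), \qquad S'_1 = S_0 \exp\left( \int_0^1 \sigma'_t \,dB_t - \tfrac{1}{2}\int_0^1 (\sigma'_t)^2 \,dt \right), \end{equation} and, since $\sigma$ and $\sigma'$ are bounded, Novikov's condition makes both true martingales with finite second moments, so in particular $\mathbb{E}[S_1] = \mathbb{E}[S'_1] = S_0$.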
$3.$ Without imposing any conditions on $\sigma$ and $\sigma'$, suppose that they are different, i.e. $\mathbb{P} \big( \int_0^t ( \sigma_s - \sigma'_s )^2 \,ds =0 \big) <1$. Can $S_t$ and $S'_t$ be perfectly correlated for each $t>0$?
(I only know that two non-degenerate, square-integrable random variables $X$ and $Y$ are perfectly correlated if and only if $X = aY + b$ a.s. for some $a>0$ and $b \in \mathbb{R}$. However, I don't know how to use this fact.)
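As a numerical sanity check on part $1$, here is a quick Monte Carlo sketch I tried (the parameter values are arbitrary choices of mine; it assumes constant $\sigma$, $\sigma'$, $r$ and uses the explicit solution above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary constant parameters (part 1 setting): vols, rate, start, horizon.
sigma, sigma_p, r, S0, t = 0.2, 0.4, 0.05, 1.0, 1.0

# Exact solution of each SDE driven by the SAME Brownian value B_t:
#   S_t = S0 * exp(sigma * B_t + (r - sigma^2 / 2) * t)
B = rng.standard_normal(1_000_000) * np.sqrt(t)
S = S0 * np.exp(sigma * B + (r - 0.5 * sigma**2) * t)
Sp = S0 * np.exp(sigma_p * B + (r - 0.5 * sigma_p**2) * t)

# Sample correlation: strictly below 1 because sigma != sigma_p.
print(np.corrcoef(S, Sp)[0, 1])

# Closed-form correlation for constant coefficients; equals 1 iff sigma == sigma_p.
num = np.exp(sigma * sigma_p * t) - 1
den = np.sqrt((np.exp(sigma**2 * t) - 1) * (np.exp(sigma_p**2 * t) - 1))
print(num / den)
```

The two printed values agree to within Monte Carlo error and are strictly below $1$, consistent with the claim in part $1$.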