Questions regarding Brownian Motion:
Question 1a: Prove that $E\left(exp\left( iuB_{t}\right)\right)=exp\left(-\dfrac{1}{2}u^{2}t\right)$
Let $B_t$ be a Brownian motion on $\mathbb{R}^{n}$ with $B_0 = 0$, and write $E=E^{0}$ (expectation started from the origin).
What we know:
$B_t$ is a Gaussian process, so for any times $t_1,\dots,t_k$ the vector $Z=\left(B_{t_{1}},\dots,B_{t_{k}}\right)\in\mathbb{R}^{nk}$ is Gaussian.
This means that there exists a vector $M\in\mathbb{R}^{nk}$ and a non-negative definite matrix $C=[c_{jm}]\in\mathbb{R}^{nk\times nk}$ such that
$E^{x}\left[ \exp\left( i\sum ^{nk}_{j=1}u_{j}Z_{j}\right) \right] =\exp\left( -\dfrac {1}{2}\sum _{j,m}u_{j}c_{jm}u_{m}+i\sum _{j}u_{j}M_{j}\right)$ (equation 1)
where $C$ is the covariance matrix and $M$ is the mean vector.
From this, the second term on the right-hand side of equation 1 vanishes: since $E=E^{0}$, we have $M_{j}=E^{0}\left[B_{t}\right]=0$, and so
$E^{0}\left[ \exp\left( iuB_{t}\right) \right] = \exp\left( -\dfrac {1}{2}u^{2}\operatorname{Var}(B_{t}) \right)$, where $\operatorname{Var}(B_{t})=t$.
Indeed, $\operatorname{Var}(B_{t})=E\left[ \left( B_{t}-E\left[ B_{t}\right] \right) ^{2}\right]$; expanding this and using $E\left[B_{t}\right]=0$ gives $\operatorname{Var}(B_{t})=E\left[ B_{t}^{2}\right]$, and from the definition of $B_t$ with transition density:
Fix $x\in \mathbb{R}^{n}$ and define:
$p\left( t,x,y\right) =\left( 2\pi t\right) ^{-\dfrac {n}{2}}\cdot exp\left( -\dfrac {\left| x-y\right| ^{2}}{2t}\right) $
we obtain a variance of $t$.
Please let me know if this is incorrect!
Question 1b: Prove that $E\left[B^{4}_{t}\right]=3t^{2}$ by using the power series expansion of the exponential function on both sides of equation 1. I have no issues proving this with Itô's lemma; however, I get stuck when using the power series.
$\sum ^{\infty }_{k=0}\dfrac {i^{k}}{k!}E\left[ B_{t}^{k}\right] u^{k}=\sum ^{\infty }_{k=0}\dfrac {1}{k!}\left( -\dfrac {t}{2}\right) ^{k}u^{2k}$
But how do I solve for $E\left[B^{2k}_{t}\right]$ in general by matching coefficients of $u^{2k}$?
Question 1c Given (2.2.2)
Now this I am completely stuck on. Please let me know if I have done anything wrong in my proofs, and please provide answers where I am completely stuck. Thank you very much for all the help.


Question 1a.
The required complexity of this question really depends on the construction of Brownian motion you were given in your lectures, or in the book you're following.
If you define Brownian motion using the standard characterisations (sometimes referred to as the Lévy construction, though this can be confused with Lévy's martingale characterisation), then you have as a given that when started from $0$
$$B_t \sim N(0,t).$$
In which case, you are really just trying to prove that the characteristic function is given by
$$ \mathbf E\left[ \exp(i u X)\right] = \exp \left( - \frac{\sigma^2}{2} u^2\right), \qquad X \sim N(0,\sigma^2).$$
Personally, I think this is simpler than invoking the machinery of Gaussian processes; in particular, proving the above property is simpler than proving the fact you rely on about linear combinations of Gaussians.
I do, however, appreciate that you may have been taught to use a Gaussian process starting point for describing Brownian motion.
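Tangentially, the claimed characteristic function is easy to sanity-check numerically. The sketch below is only illustrative (the helper `normal_cf_real` and its discretisation parameters are my own choices, not from any text): it integrates $\mathbf E[\cos(uX)]$ for $X \sim N(0,t)$ with a trapezoid rule, noting that the imaginary part $\mathbf E[\sin(uX)]$ vanishes by symmetry.

```python
import math

def normal_cf_real(u, var, n=4001):
    """Trapezoid-rule approximation of E[cos(uX)] for X ~ N(0, var).

    E[sin(uX)] = 0 by symmetry, so this is the whole characteristic
    function. The integral is truncated at 10 standard deviations,
    where the Gaussian tail is negligible."""
    lim = 10.0 * math.sqrt(var)
    h = 2.0 * lim / (n - 1)
    total = 0.0
    for i in range(n):
        x = -lim + i * h
        weight = 0.5 if i in (0, n - 1) else 1.0
        density = math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)
        total += weight * math.cos(u * x) * density
    return total * h

# Agreement with exp(-t u^2 / 2) for a few (u, t) pairs.
for u, t in [(0.5, 1.0), (1.0, 2.0), (2.0, 0.5)]:
    assert abs(normal_cf_real(u, t) - math.exp(-0.5 * t * u * u)) < 1e-8
```

The trapezoid rule is very accurate here because the integrand and all of its derivatives are essentially zero at the truncation points.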
Question 1b.
For this we note that by definition of the exponential function, and linearity of expectation
$$ \mathbf E \left[ \exp(i u X) \right] = 1 + iu \mathbf E[X] + \frac{(iu)^2}{2!} \mathbf E[X^2] + \frac{(iu)^3}{3!}\mathbf E[X^3] + \cdots,$$ and hence
$$\mathbf E[X^n] = (-i)^n\left( \frac{d^n}{du^n}\mathbf E[ \exp(iu X)]\right)_{\big| u = 0}.$$
Therefore, given that $\mathbf E[\exp(i u B_t)] = \exp( - \frac12 tu^2)$ it suffices to calculate
$$\mathbf E[B_t^4] = \left( \frac{d^4}{du^4} \exp\left( - \frac12 t u^2 \right) \right)_{\big| u = 0}$$
The actual calculation is slightly tedious. Denoting $f(u) = \exp( - \frac12 t u^2)$, then
\begin{align*} f^{(1)}(u) & = -tu\, f(u) \\ f^{(2)}(u) & = (-t + t^2 u^2) f(u) \\ f^{(3)}(u) & = ( 3t^2u - t^3u^3) f(u) \\ f^{(4)}(u) & = (3t^2 - 6t^3u^2 + t^4 u^4)f(u). \end{align*} Substituting $u = 0$ into the final line gives $f^{(4)}(0) = 3t^2$.
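The value $3t^2$ can also be confirmed by computing the fourth moment directly from the Gaussian density by quadrature. This is purely a numerical cross-check; the function `gaussian_moment` and its parameters are illustrative, not from the original argument.

```python
import math

def gaussian_moment(p, t, n=4001):
    """Trapezoid-rule approximation of
    E[B_t^p] = (2*pi*t)^(-1/2) * integral of x^p * exp(-x^2/(2t)) dx,
    truncated at 10 standard deviations."""
    lim = 10.0 * math.sqrt(t)
    h = 2.0 * lim / (n - 1)
    total = 0.0
    for i in range(n):
        x = -lim + i * h
        weight = 0.5 if i in (0, n - 1) else 1.0
        total += weight * (x ** p) * math.exp(-x * x / (2.0 * t))
    return total * h / math.sqrt(2.0 * math.pi * t)

t = 1.7
assert abs(gaussian_moment(2, t) - t) < 1e-6           # E[B_t^2] = t
assert abs(gaussian_moment(4, t) - 3 * t ** 2) < 1e-6  # E[B_t^4] = 3 t^2
```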
Question 1c. In your post, you include two equations, one of which comes after the line "Given (2.2.2)", and the second is labeled (2.2.2). I would suggest that the former is the more useful equation in this context. That is, we will use the fact that
$$ \mathbf E[ B_t^{2k}] = \frac{1}{\sqrt{2t\pi}}\int x^{2k} e^{-x^2/(2t)} dx$$
We will use induction (you will need to check the base case), and consider $\mathbf E[B_t^{2(k+1)}]$. We will apply integration by parts with $u = x^{2k + 1}, \, dv = x e^{-x^2/(2t)}\,dx$
\begin{align*} \mathbf E[B_t^{2(k+1)}] & = \frac{1}{\sqrt{2t\pi}}\int x^{2(k+1)} e^{-x^2/(2t)} dx \\ & = \frac{1}{\sqrt{2t\pi}}\int x^{2k+1} \left( x e^{-x^2/(2t)}\right) dx \\ & = \frac{1}{\sqrt{2t\pi}} \left( \left[ - x^{2k+1}te^{-x^2/(2t)} \right]_{x = -\infty}^\infty + (2k+1)t \int x^{2k} e^{-x^2/(2t)} dx \right) \\ & = \frac{(2k+1)t}{\sqrt{2t\pi} } \int x^{2k} e^{-x^2/(2t)} dx \\ & = (2k+1)t \mathbf E[B_t^{2k}]. \end{align*} Given that $\mathbf E[B_t^0] = 1, \, \mathbf E[B_t^2] = t$ then $$\mathbf E[B_t^{2k}] = \begin{cases} 1 & \text{if k = 0,}\\ t^k \prod_{j=0}^{k-1} (2j+1) & \text{if k $\geq$ 1}. \end{cases}$$
The product term in the above is also known as the double factorial, and in this notation the above can be written as
$$\mathbf E[B_t^{2k}] = t^k (2k-1)!!,$$
where we note the slightly counter intuitive, though correct, fact that $(-1)!! = 1$ covers the case that $k = 0$.
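As a final cross-check, the closed form $t^k(2k-1)!!$ can be compared against the recursion $\mathbf E[B_t^{2(k+1)}] = (2k+1)t\,\mathbf E[B_t^{2k}]$ derived above. The helper names `double_factorial` and `even_moment` below are my own, introduced only for this sketch.

```python
import math

def double_factorial(m):
    """m!! = m * (m - 2) * ... down to 1 or 2, with (-1)!! = 1 by convention."""
    result = 1
    while m > 1:
        result *= m
        m -= 2
    return result

def even_moment(k, t):
    """E[B_t^{2k}] built up from E[B_t^0] = 1 via the recursion
    E[B_t^{2(j+1)}] = (2j + 1) * t * E[B_t^{2j}]."""
    moment = 1.0
    for j in range(k):
        moment *= (2 * j + 1) * t
    return moment

# The recursion and the double-factorial closed form agree,
# including the k = 0 case covered by (-1)!! = 1.
t = 2.0
for k in range(6):
    assert math.isclose(even_moment(k, t), t ** k * double_factorial(2 * k - 1))
```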