Chernoff's inequality states that, for any random variable $y$, $$ \mathbf{P}(y\geq t) \leq \exp(-\phi_y^*(t)),\\ \text{where } \phi_y^*(t) = \sup\limits_{\lambda \geq 0}\, \lambda t-\log({\bf E}(\exp(\lambda y))). $$
Now, in the case of a normal distribution with mean $0$ and variance $\sigma^2$, the book says $\log({\bf E}(\exp(\lambda y))) = {\lambda^2\sigma^2}/{2}$.
I do not understand this step. How did they derive the expression for ${\bf E}(\exp(\lambda y))$?
Let $X$ be a standard Gaussian random variable, i.e. with mean $0$, variance $1$, and density $f(x)=\frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}}$. Completing the square in the exponent gives $$ \mathbb{E}[e^{\lambda X}] = \int_\mathbb{R} e^{\lambda x} \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}} dx = e^{\frac{\lambda^2}{2}} \int_\mathbb{R} \frac{1}{\sqrt{2\pi}} e^{-\frac{\lambda^2}{2} +\lambda x -\frac{x^2}{2}} dx\\ = e^{\frac{\lambda^2}{2}} \int_\mathbb{R} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}(x -\lambda)^2} dx = e^{\frac{\lambda^2}{2}} \int_\mathbb{R} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2} y^2} dy = e^{\frac{\lambda^2}{2}},$$ where I used the change of variables $y=x-\lambda$ and the fact that a Gaussian density integrates to one: $$\int_\mathbb{R} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2} y^2} dy =1.$$ If instead $X$ has variance $\sigma^2$, recall that it can be written as $X=\sigma Z$, where $Z$ is a standard Gaussian; therefore $$\mathbb{E}[e^{\lambda X}] = \mathbb{E}[e^{\lambda\sigma Z}] = e^{\frac{\lambda^2\sigma^2}{2}}$$ by the previous calculation, and taking logarithms gives the book's expression $\log({\bf E}(\exp(\lambda X))) = \lambda^2\sigma^2/2$.
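If you want to convince yourself numerically, here is a quick sanity check of the identity $\mathbb{E}[e^{\lambda X}] = e^{\lambda^2\sigma^2/2}$ via Monte Carlo (the particular values of $\lambda$ and $\sigma$ below are arbitrary choices, not from the book):

```python
import math
import numpy as np

rng = np.random.default_rng(0)
lam, sigma = 0.7, 1.5  # arbitrary test values

# Monte Carlo estimate of E[exp(lambda * X)] for X ~ N(0, sigma^2)
x = rng.normal(0.0, sigma, size=1_000_000)
mc = np.exp(lam * x).mean()

# Closed-form moment generating function: exp(lambda^2 * sigma^2 / 2)
closed_form = math.exp(lam**2 * sigma**2 / 2)
print(mc, closed_form)  # the two should agree to a couple of decimal places
```

With a million samples the estimate typically lands within a fraction of a percent of the closed form, which matches the derivation above.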