Problem: Let $\mu_n$ be the Gaussian distribution with mean zero and variance $1/n$ on $\mathbb{R}$. Show that $\mu_n$ satisfies a large deviation principle with rate $1/n$ and rate function $x^2/2$.
Definition: The sequence of probability measures $\{\mu_n\}_{n\geq 1}$ is said to satisfy a large deviation principle with rate $\epsilon_n$ and rate function $I$ if for all $\Gamma\in\mathscr{B}$, $$-\inf_{x\in\Gamma^\circ} I(x)\leq \liminf_{n\rightarrow\infty}\epsilon_n\log\mu_n(\Gamma) \leq \limsup_{n\rightarrow\infty}\epsilon_n\log\mu_n(\Gamma)\leq -\inf_{x\in\overline{\Gamma}}I(x),$$ where $\Gamma^\circ$ is the interior of $\Gamma$ and $\overline{\Gamma}$ is the closure of $\Gamma$.
Since the Gaussian measures are defined on $\mathbb{R}$, I believe that if we verify the definition for every interval $(a,b]$, $a,b\in\mathbb{R}$, then the proof is done.
Both of my questions most likely stem from my not understanding the definition. My first question: if $\Gamma^\circ$ contains the number $0$, then $\inf_{x\in\Gamma^\circ}I(x)=\inf_{x\in\Gamma^\circ}x^2/2=0$, so the left-hand inequality requires $\liminf_{n\rightarrow\infty} \epsilon_n\log \mu_n(\Gamma)\geq 0$. But $\log\mu_n(\Gamma)<0$ whenever $\mu_n(\Gamma)<1$, so it seems the inequality could hold only when $\Gamma=\mathbb{R}$.
My second question is about obtaining a lower bound for $\liminf_{n\rightarrow\infty} \epsilon_n\log \mu_n(\Gamma)$ (and, similarly, an upper bound for the $\limsup$). Since the erf function is not pleasant to work with, I assumed a reasonable lower bound would be \begin{align} \liminf_{n\rightarrow\infty} \epsilon_n\log \mu_n(\Gamma)&\geq \liminf_{n\rightarrow\infty} \dfrac{1}{n}\log \left[\sqrt{\dfrac{n}{2\pi}}\, e^{-x_\ast^2 n/2}|\Gamma|\right]\\ &=\liminf_{n\rightarrow\infty} \dfrac{1}{n}\left[\log\sqrt{\dfrac{n}{2\pi}}+\log e^{-x_\ast^2 n/2}+\log |\Gamma| \right]\\ &=-x_\ast^2/2, \end{align} where $|\Gamma|=b-a$ is the length of the interval and $x_\ast=\sup_{x\in\Gamma}|x|$, since the Gaussian density strictly decreases as we move away from $0$. So this comes back to the first question, since $-x_\ast^2/2\leq 0$.
I would appreciate any help in understanding the definition and the problem.
The easy way to prove this is to note that $N(0,1/n)$ is the distribution of the average of $n$ i.i.d. $N(0,1)$ random variables, and apply Cramér's theorem.
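Concretely, Cramér's theorem identifies the rate function as the Legendre transform of the cumulant generating function of a single $N(0,1)$ variable $Z$:
$$\Lambda(\lambda)=\log \mathbb{E}\, e^{\lambda Z}=\frac{\lambda^2}{2},\qquad I(x)=\sup_{\lambda\in\mathbb{R}}\big(\lambda x-\Lambda(\lambda)\big)=\sup_{\lambda\in\mathbb{R}}\Big(\lambda x-\frac{\lambda^2}{2}\Big)=\frac{x^2}{2},$$
with the supremum attained at $\lambda=x$.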
To prove this "by hand", let's first do the lower bound. Let $x^*=\inf_{x \in \Gamma^\circ} |x|$. Then for any $\delta>0$ there is a point $x_\delta \in \Gamma^\circ$ with $x^* \le |x_\delta| < x^*+\delta$, and, since $\Gamma^\circ$ is open, an $r>0$ such that $(x_\delta-r,x_\delta+r) \subset \Gamma$.
This means that there are points in $\Gamma^\circ$ arbitrarily close to either $-x^*$ or $x^*$, and there must then be open intervals around all of those points as well. To understand why this seemingly weird technicality is required, consider the case where $\Gamma$ is just an open interval, in which case the infimum isn't attained in $\Gamma$, so we can only draw an open interval around a nearby point while staying in $\Gamma$.
Denoting this interval by $J_\delta$, you have $\mu_n(\Gamma) \geq \mu_n(J_\delta)$ and now you can estimate the right side. Pay attention to the special case $x^*=0$, where $\mu_n(\Gamma) \to 1$.
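Here is a sketch of how that estimate can be finished. Write $c$ for the center of $J_\delta=(c-r,c+r)$, so that $|c|\le x^*+\delta$. The Gaussian density on $J_\delta$ is minimized at the endpoint farthest from $0$, whose absolute value is at most $|c|+r$, so
$$\mu_n(J_\delta)\ \geq\ 2r\sqrt{\frac{n}{2\pi}}\, e^{-n(|c|+r)^2/2},\qquad \frac{1}{n}\log\mu_n(J_\delta)\ \xrightarrow[n\to\infty]{}\ -\frac{(|c|+r)^2}{2}\ \geq\ -\frac{(x^*+\delta+r)^2}{2}.$$
Letting $\delta,r\to 0$ gives $\liminf_{n\to\infty}\frac{1}{n}\log\mu_n(\Gamma)\geq -(x^*)^2/2=-\inf_{x\in\Gamma^\circ}I(x)$.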
Now let's do the upper bound. Let $x^*=\inf_{x \in \overline{\Gamma}} |x|$. If $x^*=0$ then the upper bound is trivial, since $\epsilon_n\log\mu_n(\Gamma)\leq 0$ for every $n$ while the right-hand side is $0$. Otherwise, let $J=(-\infty,-x^*] \cup [x^*,\infty)$, so that $\mu_n(\Gamma) \leq \mu_n(J)$. Now apply the bound
$$\int_a^\infty e^{-x^2/2}\, dx \leq \int_a^\infty \frac{x}{a}\, e^{-x^2/2}\, dx = \frac{e^{-a^2/2}}{a}$$
for $a>0$, with $a=\sqrt{n}\,x^*$ after the substitution $y=\sqrt{n}\,x$, to estimate $\mu_n(J)$.
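As a purely numerical sanity check (not part of the proof), one can watch $\frac{1}{n}\log\mu_n((a,b])$ approach $-\inf_{x\in(a,b]}x^2/2$ directly. The interval $(1,2]$ below is an illustrative choice; `erfc` is used for the Gaussian tail to avoid catastrophic cancellation near $1$:

```python
import math

def tail(x):
    # Q(x) = P(Z > x) for Z ~ N(0,1); erfc stays accurate for large x
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def mu_n(a, b, n):
    # mu_n((a,b]) for N(0, 1/n); assumes 0 < a < b so both tails are usable
    s = math.sqrt(n)
    return tail(a * s) - tail(b * s)

a, b, n = 1.0, 2.0, 1000
val = math.log(mu_n(a, b, n)) / n
print(val)  # close to -a^2/2 = -0.5
```

For much larger $n$ the tail probability underflows double precision (roughly $n \gtrsim 1400$ when $a=1$), so $n=1000$ is near the practical limit of this direct check.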