I'm trying to prove a claim given in Vershynin's book *High-Dimensional Probability*.
Notation: $g$ is a random vector, $\sim$ means "is distributed as", $N(\mu, \sigma^2)$ is the normal distribution with mean $\mu$ and variance $\sigma^2$, $t \in \mathbb{R}$, and $\mathbb{P}$ denotes probability.
The author says:
Given $g \sim N(0, I_{d \times d})$, a standard Gaussian vector in $d$ dimensions, \begin{equation} \forall t \ge 0: \mathbb{P} \{ | \|g \|_2^2 - d | \ge t \} \le 2 \; \mathrm{exp}\left( - \frac{t^2}{C_1 d + C_2 t} \right) \end{equation} for some absolute constants $C_1$ and $C_2$.
The claim then extends to random matrices:
Given a random matrix $\Gamma \in \mathbb{R}^{d \times k}$ with i.i.d. $N(0,1)$ entries, for any $v \in \mathbb{R}^k$ with $\| v\|_2 = 1$ (normalized), \begin{equation} \forall t \ge 0: \mathbb{P}\{ | \| \Gamma v \|_2^2 - d | \ge t\} \le 2 \; \mathrm{exp}\left( - \frac{t^2}{C_1 d + C_2 t} \right) \end{equation}
I think if I can prove the first claim, the second part is simple, since $\Gamma v \sim N(0, I_{d \times d})$: a centered Gaussian vector is invariant under rotations.
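As a numerical sanity check of that reduction (not a proof; the variable names below are mine), each entry of $\Gamma v$ is $\sum_j \Gamma_{ij} v_j$, a linear combination of i.i.d. $N(0,1)$ variables with $\sum_j v_j^2 = 1$, hence $N(0,1)$, and the entries are independent across rows. So $\|\Gamma v\|_2^2$ should match $\|g\|_2^2$ in distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 5, 3, 200_000

v = rng.standard_normal(k)
v /= np.linalg.norm(v)                  # normalize so that ||v||_2 = 1

Gamma = rng.standard_normal((n, d, k))  # n i.i.d. copies of Gamma
x = Gamma @ v                           # n samples of Gamma v, shape (n, d)
sq = (x ** 2).sum(axis=1)               # n samples of ||Gamma v||_2^2

g = rng.standard_normal((n, d))
sq_ref = (g ** 2).sum(axis=1)           # n samples of ||g||_2^2

# Both should have mean close to d and variance close to 2d (chi-squared_d)
print(sq.mean(), sq_ref.mean())
print(sq.var(), sq_ref.var())
```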
Write $\|g\|_2^2=\sum_{i=1}^d g_i^2$ and note that the $g_i^2$ are i.i.d. with expectation $1$.
Right tail
Chernoff's bound yields, for $s\geq 0$,
$$P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\geq t\right)\leq e^{-st} \left[E\left(e^{s(g_1^2-1)}\right)\right]^d$$
Note that $\displaystyle E(e^{s(g_1^2-1)}) = \frac{e^{-s}}{\sqrt{2\pi}}\int e^{-x^2(\frac 12 -s)} dx$ which is finite only if $s<\frac 12$, and in that case a substitution proves that $\displaystyle E(e^{s(g_1^2-1)}) = \frac{e^{-s}}{\sqrt{1-2s}}$ hence $$P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\geq t\right)\leq e^{-st} \frac{e^{-sd}}{(1-2s)^{d/2}}$$ This is minimized in $s$ for $s=\frac 12 \frac t{t+d}$ thus $$P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\geq t\right)\leq e^{-t/2}\left(1+\frac td \right)^{d/2}$$
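A quick Monte Carlo check of the moment generating function identity $E(e^{s(g_1^2-1)}) = e^{-s}/\sqrt{1-2s}$, and of the fact that the optimizer $s=\frac 12 \frac t{t+d}$ always lies in $[0,\frac 12)$ so it is admissible (approximate only, up to sampling error):

```python
import numpy as np

rng = np.random.default_rng(1)
g = rng.standard_normal(2_000_000)

# Compare the empirical MGF of g^2 - 1 with the closed form e^{-s}/sqrt(1-2s)
# (s kept below 1/4 so the Monte Carlo estimate has finite variance)
for s in (0.05, 0.1, 0.2):
    empirical = np.exp(s * (g ** 2 - 1)).mean()
    closed_form = np.exp(-s) / np.sqrt(1 - 2 * s)
    print(s, empirical, closed_form)

# The minimizer s = t/(2(t+d)) is always < 1/2, e.g.:
t, d = 7.0, 3.0
s_star = 0.5 * t / (t + d)
print(s_star)
```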
This is not exactly the bound you're looking for, and it is tighter as $t\to \infty$ provided that $C_2>2$. To get the wanted bound, note that $$e^{-t/2}\left(1+\frac td \right)^{d/2} = \exp\left(-t^2\left[\frac 1{2t}-\frac{d}{2t^2}\log\left(1+\frac td\right) \right] \right)$$ and $$\frac 1{2t}-\frac{d}{2t^2}\log\left(1+\frac td\right) = \frac 1{2t}\left(1-\frac dt \log\left(1+\frac td\right)\right) $$
Numerically, I observe (and this inequality is referenced on Wikipedia) that $$\forall x\geq 0,\; 1-\frac 1x \log\left(1+ x\right)\geq \frac{1}{2\left(1+\frac 1x \right)}$$ hence $$\frac 1{2t}\left(1-\frac dt \log\left(1+\frac td\right)\right)\geq \frac{1}{4t\left(1+\frac dt \right)} = \frac{1}{4t+4d}$$ thus finally
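The logarithm inequality above can also be checked on a dense grid (a sanity check, not a proof; a proof would compare derivatives of both sides):

```python
import numpy as np

# Check 1 - log(1+x)/x >= 1/(2(1 + 1/x)) over several orders of magnitude
x = np.logspace(-4, 4, 1_000_000)
lhs = 1 - np.log1p(x) / x          # log1p for accuracy at small x
rhs = 1 / (2 * (1 + 1 / x))
print((lhs >= rhs).all())
```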
$$P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\geq t\right)\leq \exp\left(-\frac{t^2}{4t+4d} \right)$$
Left tail
Since $$P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\leq -t\right) = P\left(\sum_{i=1}^d [-g_i^2+E(g_i^2)]\geq t\right)$$ and $\displaystyle \sum_{i=1}^d [-g_i^2+E(g_i^2)] \leq d$ we have $$t>d \implies P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\leq -t\right) = 0$$ so it suffices to get an upper bound when $t\leq d$.
Chernoff's bound (optimized over $s$ as before, now at $s=\frac 12 \frac t{d-t}$) yields $$P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\leq -t\right)\leq e^{t/2}\left(1-\frac td \right)^{d/2}$$ and numerically (note the minus sign on the right-hand side; both sides are negative here) $$\forall x\in [0,1),\; 1+\frac 1x \log\left(1- x\right)\leq -\frac{1}{2\left(1+\frac 1x \right)}$$ thus $$t\leq d \implies P\left(\sum_{i=1}^d [g_i^2-E(g_i^2)]\leq -t\right)\leq \exp\left(-\frac{t^2}{4t+4d} \right)$$
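The left-tail inequality $1+\frac 1x \log(1-x)\leq -\frac{1}{2(1+\frac 1x)}$ for $x\in(0,1)$ admits the same kind of grid check (again a sanity check rather than a proof):

```python
import numpy as np

# Check 1 + log(1-x)/x <= -1/(2(1 + 1/x)) on (0, 1)
x = np.linspace(1e-4, 1 - 1e-4, 1_000_000)
lhs = 1 + np.log1p(-x) / x         # log1p(-x) = log(1-x), accurate near 0
rhs = -1 / (2 * (1 + 1 / x))
print((lhs <= rhs).all())
```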
Finally, $$P\left(|\|g\|_2^2-d|\geq t\right)\leq 2\exp\left(-\frac{t^2}{4t+4d} \right)$$
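To close the loop, a Monte Carlo illustration of the final two-sided bound with the explicit constants $C_1 = C_2 = 4$ obtained above (sample sizes and test values of $t$ are my choices):

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 10, 500_000
sq = (rng.standard_normal((n, d)) ** 2).sum(axis=1)   # samples of ||g||_2^2

# Empirical tail probability vs. the derived bound 2 exp(-t^2 / (4t + 4d))
for t in (1.0, 5.0, 10.0, 20.0):
    empirical = (np.abs(sq - d) >= t).mean()
    bound = 2 * np.exp(-t ** 2 / (4 * t + 4 * d))
    print(t, empirical, bound)
```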