Consider a random variable $X \ge 0$ which takes values in an interval $[0, b]$, and further $$ \text{P}(X \ge t) \le C \exp\left(\frac{-t^{2}}{B}\right), \quad \forall t \ge 0, $$ for given constants $C \gg 1$ and $B >0$.
Since $X$ takes values in the bounded interval $[0, b]$, it is a sub-Gaussian variable, and by Hoeffding's lemma its variance proxy can be upper bounded by $(b-0)^{2}/4 = O\left(b^{2}\right)$, based only on the length of the interval.
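As a quick numerical sanity check (mine, not part of the question), the sketch below verifies the Hoeffding MGF bound $\mathbb{E}[e^{s(X-\mathbb{E}[X])}] \le e^{s^{2}b^{2}/8}$ on a grid of $s$ values, for a two-point distribution on $\{0, b\}$, which is the extremal case among distributions supported on $[0, b]$; the function name is hypothetical.

```python
import math

# Check Hoeffding's lemma numerically: a variable supported on [0, b] is
# sub-Gaussian with variance proxy b^2/4, i.e.
#   E[exp(s*(X - E[X]))] <= exp(s^2 * b^2 / 8)  for all s.
# We test the extremal two-point case X in {0, b} with P[X = b] = p.

def mgf_centered(p, b, s):
    """E[exp(s*(X - mu))] for X = b w.p. p and X = 0 w.p. 1 - p."""
    mu = p * b
    return (1 - p) * math.exp(-s * mu) + p * math.exp(s * (b - mu))

b = 2.0
for p in (0.1, 0.5, 0.9):
    for s in (-3.0, -1.0, -0.1, 0.1, 1.0, 3.0):
        assert mgf_centered(p, b, s) <= math.exp(s**2 * b**2 / 8) + 1e-12
print("Hoeffding MGF bound holds on the grid")
```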
Q1: First, a clarification on the definition: if we temporarily ignore the fact that $X$ is bounded (while still using $X \ge 0$), is the above tail bound enough to conclude that $X$ is sub-Gaussian? (E.g., does the value of $C$ matter?)
Q2: Using the tail bound, is it possible to get a better upper bound on the variance proxy? In particular, I saw a claim that, based on the above tail bound, the moments of $X-\mathbb{E}[X]$ can be upper bounded by those of a Gaussian with variance $O(B \sqrt{\log{C}})$. Is that true?
Edit: To bound all moments of $X-\mathbb{E}[X]$ by those of a Gaussian with variance $\gamma$, it suffices to show that $X-\mathbb{E}[X]$ is sub-Gaussian with variance proxy $\gamma > 0$, i.e., that $\mathbb{E}[e^{s(X-\mathbb{E}[X])}] \le e^{s^{2}\gamma/2}$ for all $s \in \mathbb{R}$. Motivated by Michael's answer, which gives an upper bound on the variance $\sigma^{2}$ of $X$, we could put the question this way: is there a straightforward connection between $\gamma$ and $\sigma^{2}$? I see a related question here: Bound variance proxy of a subGaussian random variable by its variance
Considering both bounds, we know that: $$P[X > t] \leq \left\{ \begin{array}{ll} \min[1,C e^{-t^2/B}] &\mbox{ if $t \in [0,b)$} \\ 0 & \mbox{ if $t\geq b$} \end{array} \right. $$ This is the tightest possible bound, since we can consider a random variable $W$ with $P[W>t]$ given exactly by the right-hand side of the above inequality. Notice that $C e^{-t^2/B} \geq 1$ whenever $t \in [0, \sqrt{B\log(C)}]$. For simplicity, assume that $b \geq \sqrt{B\log(C)}$.
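To make the extremal variable $W$ concrete, here is a small sketch (mine, with hypothetical helper names, not from the answer) that implements its tail and samples it by inverting the CDF, assuming $b \geq \sqrt{B\log(C)}$:

```python
import math
import random

# The extremal variable W has tail P[W > t] = min(1, C*exp(-t^2/B)) for
# t in [0, b) and P[W > t] = 0 for t >= b. Its CDF is 0 on
# [0, sqrt(B*log(C))], then 1 - C*exp(-t^2/B), with an atom of mass
# C*exp(-b^2/B) at t = b.

def tail(t, B, C, b):
    """Exact tail P[W > t] of the extremal variable W."""
    return 0.0 if t >= b else min(1.0, C * math.exp(-t * t / B))

def sample_W(B, C, b, rng=random):
    """Inverse-CDF sample of W (assumes b >= sqrt(B*log(C)))."""
    u = rng.random()                       # u ~ Uniform(0, 1)
    # Solve 1 - C*exp(-t^2/B) = u  =>  t = sqrt(B*log(C/(1-u)))
    t = math.sqrt(B * math.log(C / (1.0 - u)))
    return min(t, b)                       # atom at b of mass C*exp(-b^2/B)
```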
We know that, for general nonnegative random variables $Y$, we have $Y=\int_{0}^{\infty} 1\{Y>t\}dt$ and hence $E[Y]=\int_{0}^{\infty}P[Y>t]dt$. Thus,
\begin{align} E[X^2] &= \int_{0}^{\infty} P[X^2>t]\,dt \quad \mbox{[since $X^2 \geq0$]}\\ &= \int_0^{\infty} P[X > \sqrt{t}]\,dt \quad \mbox{[since $X \geq 0$]}\\ &=\int_0^{B\log(C)} \underbrace{P[X>\sqrt{t}]}_{\leq 1}\,dt + \int_{B\log(C)}^{b^2}\underbrace{P[X>\sqrt{t}]}_{\leq Ce^{-t/B}}\,dt \quad \mbox{[since $P[X>b]=0$]} \\ &\leq B\log(C) - CBe^{-t/B}\Big|_{B\log(C)}^{b^2}\\ &=B\log(C) + B - CBe^{-b^2/B} \end{align} This is the best upper bound on $E[X^2]$, since it holds with equality for the random variable $W$ defined above. A simpler bound is then $E[X^2] \leq B\log(C) + B$, and this holds regardless of the value of $b$. (It even holds when $0\leq b < \sqrt{B\log(C)}$, since reducing the value of $b$ cannot increase the bound.)
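The closed form above can be cross-checked numerically (this check is mine, not part of the answer) by integrating the exact tail $t \mapsto \min(1, Ce^{-t/B})$ of $W^2$ over $[0, b^2]$ with a midpoint rule:

```python
import math

# Cross-check of E[W^2] = B*log(C) + B - C*B*exp(-b^2/B):
# integrate the exact tail of W^2, i.e. t -> min(1, C*exp(-t/B)),
# over [0, b^2] (the tail is 0 beyond b^2), assuming b >= sqrt(B*log(C)).

def e_w2_numeric(B, C, b, n=200_000):
    """Midpoint-rule approximation of int_0^{b^2} min(1, C*exp(-t/B)) dt."""
    h = b * b / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        total += min(1.0, C * math.exp(-t / B)) * h
    return total

def e_w2_closed(B, C, b):
    """Closed form derived in the answer."""
    return B * math.log(C) + B - C * B * math.exp(-b * b / B)

B, C, b = 1.0, 10.0, 3.0
assert abs(e_w2_numeric(B, C, b) - e_w2_closed(B, C, b)) < 1e-4
# The closed form also respects the simpler bound B*log(C) + B:
assert e_w2_closed(B, C, b) <= B * math.log(C) + B
```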
In particular, $\sigma=\sqrt{\mathrm{Var}(X)} \leq \sqrt{E[X^2]} \leq \sqrt{B + B\log(C)}$.