Given $n$ independent random variables $X_1, X_2, \ldots, X_n$ taking values in the range $[0,1]$, the Chernoff bound can be stated as follows. Define the random variable $X = X_1 + X_2 + \ldots + X_n - \left\langle X_1 \right\rangle - \left\langle X_2 \right\rangle - \ldots - \left\langle X_n \right\rangle$. Then the moment generating function satisfies $\left\langle e^{tX} \right\rangle \leq e^{c\cdot nt^2}$ for all $t>0$, where $c$ is a universal constant (Hoeffding's lemma gives $c = 1/8$). Here $\left\langle \cdot \right\rangle$ denotes expectation.
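As a quick sanity check of the stated bound, here is a small sketch (my own illustration, not part of the question) that verifies the moment-generating-function inequality with the Hoeffding constant $c = 1/8$ for i.i.d. Bernoulli($p$) variables, where the MGF of the centered sum can be computed exactly:

```python
import math

def mgf_centered_bernoulli_sum(n, p, t):
    """Exact MGF of X = sum_i (X_i - p) for X_i ~ Bernoulli(p) i.i.d.

    By independence the MGF factorizes, so it is the n-th power of the
    single-variable MGF  p*e^{t(1-p)} + (1-p)*e^{-tp}.
    """
    per_var = p * math.exp(t * (1 - p)) + (1 - p) * math.exp(-t * p)
    return per_var ** n

# Hoeffding's lemma gives <e^{tX}> <= e^{n t^2 / 8} for [0,1]-valued
# independent variables, i.e. the constant c = 1/8 in the question.
n = 20
for p in (0.1, 0.5, 0.9):
    for t in (0.1, 0.5, 1.0, 2.0):
        lhs = mgf_centered_bernoulli_sum(n, p, t)
        rhs = math.exp(n * t * t / 8)
        assert lhs <= rhs, (p, t, lhs, rhs)
```

For $p = 1/2$ the per-variable MGF is $\cosh(t/2)$, and $\cosh(t/2) \leq e^{t^2/8}$ is exactly the tight case of Hoeffding's lemma, so the constant $1/8$ cannot be improved in general.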
Is there a converse to this in the following sense? Consider $n$ random variables $X_1, X_2, \ldots, X_n$ distributed according to a joint probability distribution $P(x_1, x_2, \ldots, x_n)$. Define the random variable $X = X_1 + X_2 + \ldots + X_n - \left\langle X_1 \right\rangle - \left\langle X_2 \right\rangle - \ldots - \left\langle X_n \right\rangle$, where the expectation is taken with respect to $P$. If it is known that $\left\langle e^{tX} \right\rangle \leq e^{d\cdot nt^2}$ for some constant $d$ and all $t>0$, is $P$ necessarily close (in, say, total variation distance) to a product distribution $Q_1(x_1)Q_2(x_2)\ldots Q_n(x_n)$?
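To motivate why such a converse is plausible, here is a sketch (again my own illustration, with an arbitrary choice $d = 1$) showing that a maximally correlated distribution violates the hypothesized bound: if all $X_i$ are equal to a single Bernoulli($1/2$) variable, then $X = n(X_1 - 1/2)$ and $\left\langle e^{tX} \right\rangle = \cosh(nt/2) \approx e^{n^2 t^2/8}$ for small $t$, which exceeds $e^{d\cdot nt^2}$ once $n > 8d$:

```python
import math

# Fully correlated case: X_1 = X_2 = ... = X_n ~ Bernoulli(1/2),
# so X = n*(X_1 - 1/2) and the MGF is cosh(n*t/2) exactly.
def mgf_fully_correlated(n, t):
    return math.cosh(n * t / 2)

d = 1.0   # arbitrary constant for illustration
n = 100   # n >> 8d, so the sub-Gaussian bound should fail
t = 0.1
lhs = mgf_fully_correlated(n, t)       # cosh(5) ~ 74.2
rhs = math.exp(d * n * t * t)          # e^1 ~ 2.72
assert lhs > rhs  # the bound e^{d*n*t^2} is violated
```

So the MGF condition does rule out at least the extreme non-product distributions; the question is whether it forces closeness to a product distribution in a quantitative sense.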
If there is a known result using some other characterisation of the Chernoff bound (for example, moment bounds instead of the moment generating function), that would be good as well.