Let $X_i \sim \text{Bernoulli}(p)$ i.i.d., $i = 1, \ldots, 100$. Give an approximation for the smallest interval that will contain $\left(\sum X_i\right)^2$ with 68 percent probability.
I know that 68% of the data should live within 1 standard deviation $\sigma$ of the mean, but I don't know how to apply that to $\left(\sum X_i\right)^2$.
Also, I'm not quite sure what kind of values the interval could even take on. I was thinking that $\sum X_i$ could take on values $0, 1, 2, \ldots, 100$, so $\left(\sum X_i\right)^2$ could take on values $0, 1^2, 2^2, 3^2, \ldots, 100^2$. Is this line of thinking correct?
The sum of i.i.d. Bernoulli random variables is Binomial, which can be approximated by a Gaussian.
If we were dealing with a truly continuous distribution, then the square of a Gaussian with nonzero mean and non-unit variance would be a noncentral chi-square scaled by the variance.
However, as you were already contemplating, to bound the square of a discrete nonnegative variable "in probability" (values $0^2, 1^2, \ldots, n^2$), it suffices to bound the original variable (values $0, 1, \ldots, n$), since squaring is monotone on nonnegative values.
Denote $Y \equiv \sum_{i = 1}^n X_i$; then the probability mass function is $$ \Pr\{ Y = k\} = \frac{ n! }{k! \, (n-k)!} p^k (1-p)^{n-k},$$ which can be approximated by $\mathcal{N}(\mu, \sigma^2)$ with mean $\mu = np$ and variance $\sigma^2 = np(1-p)$.
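As a quick numerical sanity check of this approximation, the sketch below compares the exact Binomial pmf to the Gaussian density near the mean, using $n = 100$ and an assumed $p = 0.5$ (the question does not fix $p$):

```python
import math

# Assumed parameters: n = 100 from the question, p = 0.5 chosen for illustration.
n, p = 100, 0.5
mu = n * p                          # mean np
sigma = math.sqrt(n * p * (1 - p))  # standard deviation sqrt(np(1-p))

def binom_pmf(k):
    """Exact Binomial(n, p) probability mass at k."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_pdf(x):
    """Gaussian density with mean mu and variance sigma^2."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# The two should agree closely for k near the mean.
for k in (45, 50, 55):
    print(k, binom_pmf(k), normal_pdf(k))
```

With these parameters the pmf and the density agree to about three decimal places near $k = \mu$.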
The smallest interval for a Gaussian is the symmetric one centered at the mean; an interval centered anywhere along the tail must be wider to capture the same probability mass. The desired 68% is usually understood (in introductory courses) as one standard deviation, $\mu \pm \sigma$: $$np - \sqrt{np(1-p)} \leq \sum_{i = 1}^n X_i \leq np + \sqrt{np(1-p)} \qquad \text{with $\approx 68\%$ probability},$$ and, squaring the (nonnegative) endpoints, $$n^2p^2 + np(1-p) - 2np\sqrt{np(1-p)} \leq \left( \sum_{i = 1}^n X_i \right)^2 \leq n^2p^2 + np(1-p) + 2np\sqrt{np(1-p)}.$$ Next you can factor out the common $np$ to make it slightly shorter, or plug in the numbers $n = 100$ and whatever $p$ is.
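A Monte Carlo sketch can check that this squared interval really captures roughly the right mass, again assuming $p = 0.5$ (not fixed by the question). Note that because the distribution is discrete, the atoms sitting exactly at the endpoints $\mu \pm \sigma$ push the actual coverage somewhat above 68%:

```python
import math
import random

random.seed(0)

# Assumed parameters: n = 100 from the question, p = 0.5 for illustration.
n, p, trials = 100, 0.5, 100_000
mu = n * p
sigma = math.sqrt(n * p * (1 - p))

# Squared endpoints of the mu +/- sigma interval (valid since mu - sigma >= 0 here).
lo, hi = (mu - sigma) ** 2, (mu + sigma) ** 2

hits = 0
for _ in range(trials):
    y = sum(random.random() < p for _ in range(n))  # one draw of Y ~ Binomial(n, p)
    if lo <= y**2 <= hi:
        hits += 1

# Coverage lands a bit above 0.68 because the endpoint atoms are included.
print(hits / trials)
```

For $n = 100$, $p = 0.5$ the interval is $[45^2, 55^2] = [2025, 3025]$, and the empirical coverage comes out around 0.73 rather than exactly 0.68, which is the discreteness effect mentioned above.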