Let $B_i(n)$ be the $i$th bit in the binary expansion of $n$, so that $n=\sum B_i(n)2^i$. Now let $n$ be randomly and uniformly chosen from some large range, and let $E(j)$ be the expected value of $B_j\bigl(n^2\bigr)$, the $j$th bit in the expansion of $n^2$. That is:
$$E(j) = \lim_{M\to\infty} \frac1M \sum_{n=0}^{M-1}B_j\bigl(n^2\bigr)$$
if this limit exists. It is not hard to see that it must exist for any fixed $j$: if $n\equiv m\pmod{2^{j+1}}$ then $n^2\equiv m^2\pmod{2^{j+1}}$, and bit $j$ of $n^2$ depends only on $n^2\bmod 2^{j+1}$, so the function $B_j\bigl(n^2\bigr)$ is completely determined by $n\bmod 2^{j+1}$ and is periodic in $n$ with period at most $2^{j+1}$. In fact we can get rid of the limit by averaging over a single period:
$$E(j) = \frac1{2^{j+1}} \sum_{n=0}^{2^{j+1}-1}B_j\bigl(n^2\bigr)$$
For example, the first few values are $E(0)=\frac12$, $E(1)=0$, $E(2)=\frac14$, and $E(3)=\frac14$.
Numerical evidence suggests that:
$$\lim_{j\to\infty} E(j) = \frac12$$
Is this true?
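This sort of evidence is easy to generate from the finite sum above. Here is a minimal Python sketch (the function name `E` is just for illustration), using exact rational arithmetic so no floating-point error creeps in:

```python
from fractions import Fraction

def E(j):
    """Exact value of E(j): the average of bit j of n^2 over one full period.

    B_j(n^2) depends only on n mod 2^(j+1), so averaging over
    n = 0 .. 2^(j+1)-1 gives the limit exactly.
    """
    period = 2 ** (j + 1)
    return Fraction(sum((n * n >> j) & 1 for n in range(period)), period)

# First few values: 1/2, 0, 1/4, 1/4, then a climb toward 1/2.
for j in range(10):
    print(j, E(j))
```

The values are constant on pairs $j=2i, 2i+1$ (for $j\ge 2$) and creep upward toward $\frac12$.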
According to "Distribution of the figures 0 and 1 in the various orders of binary representation of kth powers of integers", W. Gross and R. Vacca (Mathematics of Computation, April 1968, 22, #102, 423–427), the answer is yes.
On page 423 they define a function $N_k(h)$, which counts the 1 bits in position $h$ of $n^k$ as $n$ runs over one full period, so my $E(j)$ is exactly $N_2(j)\,2^{-(j+1)}$. They then show (page 424) that
$$E(j) = \frac12\left(1 - 2^{-\lfloor j/2\rfloor}\right)$$
except for $j=0$. A similar result holds for arbitrary $k$th powers: for every $k$, the density of 1s in the high-order bit positions of $n^k$ approaches $\frac12$.
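As a sanity check, the closed form can be compared against the direct period average. A small sketch, assuming nothing beyond the formula quoted above:

```python
from fractions import Fraction

# Compare the direct period average of bit j of n^2 with the
# closed form E(j) = (1/2)(1 - 2^(-floor(j/2))), valid for j >= 1.
for j in range(1, 13):
    period = 2 ** (j + 1)
    direct = Fraction(sum((n * n >> j) & 1 for n in range(period)), period)
    closed = (1 - Fraction(1, 2 ** (j // 2))) / 2
    assert direct == closed, (j, direct, closed)
```

The loop passes silently, confirming the formula for $1\le j\le 12$; in particular $\frac12 - E(j) = 2^{-\lfloor j/2\rfloor - 1}$, so the convergence to $\frac12$ is geometric.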