A few days ago I found an interesting limit on the "problems blackboard" of my university: $$\lim_{n\to +\infty}\int_{(0,1)^n}\frac{\sum_{j=1}^n x_j^2}{\sum_{j=1}^n x_j}d\mu = 1.$$ The correct claim, however, is: $$\lim_{n\to +\infty}\int_{(0,1)^n}\frac{\sum_{j=1}^n x_j^2}{\sum_{j=1}^n x_j}d\mu = \frac{2}{3}.$$ In fact, following @Tetis' approach, the substitution $y_j=x_j+\frac{1}{2}$ gives: $$I_n = \int_{(0,1)^n}\frac{\sum_{j=1}^n y_j^2}{\sum_{j=1}^{n}y_j}d\mu = \int_{(-1/2,1/2)^n}\frac{\frac{1}{2}+\frac{2}{n}\sum_{j=1}^n x_j^2+\frac{2}{n}\sum_{j=1}^n x_j}{1+\frac{2}{n}\sum_{j=1}^n x_j}d\mu;$$ now, applying the change of variables $x_j\mapsto -x_j$ and averaging the two (equal) integrals, $$I_n = \int_{(-1/2,1/2)^n}\frac{\frac{1}{2}+\frac{2}{n}\sum_{j=1}^n x_j^2-\frac{4}{n^2}\left(\sum_{j=1}^n x_j\right)^2}{1-\frac{4}{n^2}\left(\sum_{j=1}^n x_j\right)^2}d\mu$$ follows. Since $\int_{(-1/2,1/2)^n}\left(\frac{1}{2}+\frac{2}{n}\sum_{j=1}^n x_j^2\right)d\mu=\frac{1}{2}+\frac{1}{6}=\frac{2}{3}$, we get: $$ I_n-\frac{2}{3}=\int_{(-1/2,1/2)^n} \left(-\frac{1}{2}+\frac{2}{n}\sum_{j=1}^n x_j^2\right)\frac{\frac{4}{n^2}\left(\sum_{j=1}^n x_j\right)^2}{1-\frac{4}{n^2}\left(\sum_{j=1}^n x_j\right)^2}d\mu < 0,$$ $$ \left| I_n-\frac{2}{3}\right|\leq\int_{\sum x_i^2\leq\frac{n}{4}}\left(\frac{1}{2}-\frac{2}{n}\sum_{j=1}^n x_j^2\right)\frac{\frac{4}{n^2}\left(\sum_{j=1}^n x_j\right)^2}{1-\frac{4}{n^2}\left(\sum_{j=1}^n x_j\right)^2}d\mu,$$ $$ \frac{2}{3}-I_n\leq \frac{n^{n/2}}{2^{n+1}} \int_{\sum x_i^2\leq 1}\left(1-\sum_{j=1}^{n}x_j^2\right)\frac{x_1^2}{1-x_1^2}d\mu.$$ This last bound, however, is too crude, since its RHS is $$ \Theta\left(\left(\sqrt{\frac{e\pi}{2}}\right)^n\frac{\log n}{n^{3/2}}\right).$$
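As a quick numerical sanity check of the value $\frac23$ (not part of the argument), here is a small Monte Carlo sketch in Python; the function name, the sample size, and the dimensions tested are arbitrary choices of mine:

```python
import random

def mc_In(n, samples=100_000, seed=0):
    """Monte Carlo estimate of I_n = E[(x_1^2+...+x_n^2)/(x_1+...+x_n)]
    for x_1, ..., x_n independent and uniform on (0, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        xs = [rng.random() for _ in range(n)]
        total += sum(x * x for x in xs) / sum(xs)
    return total / samples

for n in (1, 5, 50):
    # I_1 = 1/2 exactly; the estimates approach 2/3 as n grows
    print(n, mc_In(n))
```

For $n=1$ the integral is $\int_0^1 x\,dx=\frac12$, and the estimates creep up towards, but stay below, $\frac23$, in line with the inequality $I_n<\frac23$ derived above.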
My question now is: what is the asymptotic behaviour of $I_n$?
A second one is: can we prove $I_n\geq\frac{2}{3}-\frac{C}{n}$, for a suitable positive constant $C$, without the Central Limit Theorem?
UPDATE: After a while I came up with a proof of my own. The challenge now is to give the first three terms of the asymptotic expansion, and possibly a continued fraction expansion for $I_n$.
Since I am unsure what exactly @Tetis' three (so far) answers achieve, let me post the asymptotic behaviour that @Byron's approach yields, pushing things one step further.
Using the notation in @Byron's post, one sees that $I_n$ is $n$ times the expectation of $X_1^2/S_n$, where $S_n=\sum\limits_{k=1}^nX_k$. Define $Z_n$ by the identity $$S_n=nE[X]+\sqrt{n}\sigma(X)Z_n,$$ where $\sigma^2(X)$ denotes the variance of every $X_k$. Then $Z_n$ converges in distribution to a standard normal random variable. Using the expansion $1/(1+t)=1-t+t^2+o(t^2)$ when $t\to0$, with $t=\sigma(X)Z_n/(\sqrt{n}E[X])$, one gets $$ n\frac{X_1^2}{S_n}=\frac{X_1^2}{E[X]}\left(1-\frac{\sigma(X)Z_n}{\sqrt{n}E[X]}+\frac{\sigma^2(X)Z_n^2}{nE[X]^2}+o\left(\frac1n\right)\right), $$ hence $$ I_n=nE\left[\frac{X_1^2}{S_n}\right]=\frac1{E[X]}\left(E[X^2]-\frac{\sigma(X)E[X_1^2Z_n]}{\sqrt{n}E[X]}+\frac{\sigma^2(X)E[X_1^2Z_n^2]}{nE[X]^2}+o\left(\frac1n\right)\right). $$ Expanding $Z_n$, one sees that $$ \sigma(X)\sqrt{n}E[X_1^2Z_n]=E[X^3]-E[X^2]E[X], \qquad E[X_1^2Z_n^2]=E[X^2]+o(1), $$ hence, $$ \lim_{n\to\infty}n\cdot\left(I_n-\ell_X\right)=\kappa_X, $$ where $$ \ell_X=\frac{E[X^2]}{E[X]},\qquad\kappa_X=\frac{E[X^2]^2-E[X^3]E[X]}{E[X]^3}. $$ In the case at hand, $E[X^k]=\frac1{k+1}$ for every positive integer $k$, hence $\ell_X=\frac{1/3}{1/2}=\frac23$ and $\kappa_X=\frac{(1/3)^2-(1/4)(1/2)}{(1/2)^3}=\frac{1/9-1/8}{1/8}=-\frac19$, that is, $$ \lim_{n\to\infty}n\cdot\left(I_n-\frac23\right)=-\frac19. $$
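The limit $n\left(I_n-\frac23\right)\to-\frac19$ can also be probed numerically. The following self-contained Monte Carlo sketch in Python is an illustration only; the function name, sample sizes, and dimensions are arbitrary choices of mine:

```python
import random

def mc_In(n, samples=200_000, seed=1):
    """Monte Carlo estimate of I_n over the unit cube (0,1)^n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        xs = [rng.random() for _ in range(n)]
        total += sum(x * x for x in xs) / sum(xs)
    return total / samples

for n in (10, 40):
    # n * (I_n - 2/3) should hover around -1/9 ≈ -0.111
    print(n, n * (mc_In(n) - 2 / 3))
```

With a few hundred thousand samples the statistical error in each estimate of $I_n$ is well below $\frac{1}{9n}$ for these dimensions, so the rescaled differences should sit visibly near $-\frac19$.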