Let $S = \{(x,y)\in \mathbb R^2 : x^2 + y^2 \le 1\}$ be the closed unit disc in $\mathbb R^2$. Let $(X_1, Y_1), (X_2, Y_2)$ be independent, both having uniform distribution over $S$. Let $D$ denote the Euclidean distance between $(X_1, Y_1)$ and $(X_2, Y_2)$. Show that $E(D^2) = 1$.
I think that $f_{X_1,Y_1}(x_1,y_1)= \dfrac{1}{\pi},\text{ if } (x_1,y_1)\in S$, and $D^2= (X_1-X_2)^2+(Y_1-Y_2)^2$. Then I can't proceed.
FOILing results in $$E[D^2]=E[X_1^2]+E[X_2^2]+E[Y_1^2]+E[Y_2^2]-2E[X_1X_2]-2E[Y_1Y_2]=4E[X_1^2]=1,$$ where I used the following obvious-looking facts, which will have to be verified later:

1. $E[X_1^2]=E[X_2^2]=E[Y_1^2]=E[Y_2^2]=\frac14$;
2. $E[X_1X_2]=E[X_1]E[X_2]$ and $E[Y_1Y_2]=E[Y_1]E[Y_2]$;
3. $E[X_1]=E[X_2]=E[Y_1]=E[Y_2]=0$.
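Before verifying these facts one by one, the claimed value $E[D^2]=1$ can be sanity-checked by simulation. Here is a minimal Python sketch (sampling the disc by rejection; the seed and sample size are my own choices):

```python
import random

random.seed(0)

def disc_point():
    # Rejection sampling: uniform on [-1, 1]^2, keep points inside the unit disc.
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

n = 200_000
est = 0.0
for _ in range(n):
    x1, y1 = disc_point()
    x2, y2 = disc_point()
    est += (x1 - x2) ** 2 + (y1 - y2) ** 2
est /= n
print(est)  # close to 1
```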
The pairs $(X_1,Y_1)$ and $(X_2,Y_2)$ are independent by definition. However, $X_i$ and $Y_i$ are not independent for the same $i$. Their joint distribution is defined so that for any measurable set $A\subset \mathbb R^2$ $$P((X_i,Y_i)\in A)=\frac{1}{\pi}\lambda(A\cap C),$$ where $C$ is the unit disc centered at $(0,0)$ and $\lambda$ is the Lebesgue (area) measure on $\mathbb R^2$.
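The lack of independence between $X_i$ and $Y_i$ can be seen concretely: the marginal tail probabilities $P(X_1>0.8)$ and $P(Y_1>0.8)$ are both positive, yet the joint event $\{X_1>0.8,\,Y_1>0.8\}$ is impossible, since $0.8^2+0.8^2>1$. A small Python sketch (the threshold $0.8$, seed, and sample size are arbitrary choices of mine):

```python
import random

random.seed(1)

def disc_point():
    # Rejection sampling from the unit disc.
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

n = 200_000
pts = [disc_point() for _ in range(n)]
p_x = sum(x > 0.8 for x, _ in pts) / n               # marginal tail of X_1
p_y = sum(y > 0.8 for _, y in pts) / n               # marginal tail of Y_1
p_xy = sum(x > 0.8 and y > 0.8 for x, y in pts) / n  # joint tail

print(p_x * p_y, p_xy)  # the product is positive, the joint probability is exactly 0
```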
In order to prove 1. and 3. we need the pdf of, say, $X_1$. I hope that, for symmetry reasons, there is no need to prove that the four random variables we have here all share the same distribution.
By definition
$$f_{X_1}(x)=\lim_{\Delta\rightarrow 0}\frac{P(x\le X_1<x+\Delta)}{\Delta}.$$ Notice that $$P(x\le X_1<x+\Delta)=P((X_1,Y_1)\in T_{\Delta,x}),$$ where $$T_{\Delta,x}=\{(u,v):x\le u<x+\Delta, -\sqrt{1-u^2}\le v \le \sqrt{1-u^2}\}$$
In the figure below $T_{\Delta,x}$ is shown in red:
Considering the definition of the joint distribution of $(X_1,Y_1)$, we have the following approximation:
$$P(x\le X_1<x+\Delta)=P((X_1,Y_1)\in \color {red}{T_{\Delta,x}})\approx \Delta \frac{2}{\pi}\sqrt{1-x^2}$$
since in order for $X_1$ to fall between $x$ and $x+\Delta$, the point $(X_1,Y_1)$ has to fall in the red region. (Note that this approximation could be made absolutely precise by taking the appropriate lower and upper bounds.)
With this the pdf of $X_1$ can be given:
$$f_{X_1}(x)=\begin{cases}\frac{2}{\pi}\sqrt{1-x^2},& \text{ if } -1\le x \le 1,\\ 0,&\text{otherwise.}\end{cases}$$
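This density can also be checked empirically against disc samples: the fraction of points with $X_1$ in some interval $[a,b]$ should match $\int_a^b f_{X_1}$, which has the closed form $\frac1\pi\left(x\sqrt{1-x^2}+\arcsin x\right)\Big|_a^b$. A Python sketch (the interval $[0.2,\,0.5]$, seed, and sample size are arbitrary):

```python
import math
import random

random.seed(2)

def disc_point():
    # Rejection sampling from the unit disc.
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

a, b = 0.2, 0.5
n = 200_000
freq = sum(a <= disc_point()[0] <= b for _ in range(n)) / n

# Antiderivative of (2/pi) * sqrt(1 - x^2):
F = lambda x: (x * math.sqrt(1 - x * x) + math.asin(x)) / math.pi
exact = F(b) - F(a)

print(freq, exact)  # the two numbers agree to a couple of decimal places
```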
Let's check that $f_{X_1}$ is indeed a density function. With the help of Alpha, we get:
$$\frac{2}{\pi}\int_{-1}^{1}\sqrt{1-x^2}dx=1.$$
As for the expectation of $X_1$, we have
$$E[X_1]=\frac{2}{\pi}\int_{-1}^1x\sqrt{1-x^2}dx=0,$$ as Alpha reinforces.
Finally, the expectation of $X_1^2$ is $$E[X_1^2]=\frac{2}{\pi}\int_{-1}^1x^2\sqrt{1-x^2}dx=\frac{2}{\pi}\frac{\pi}{8}=\frac{1}{4},$$ as Alpha reinforces again.
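Instead of Alpha, the three integrals above can also be confirmed numerically, e.g. with a crude midpoint rule. A minimal Python sketch (the number of subintervals is an arbitrary choice):

```python
import math

def midpoint(g, a, b, n=100_000):
    # Midpoint-rule approximation of the integral of g over [a, b].
    h = (b - a) / n
    return h * sum(g(a + (k + 0.5) * h) for k in range(n))

f = lambda x: (2 / math.pi) * math.sqrt(1 - x * x)  # pdf of X_1

total = midpoint(f, -1, 1)                        # should be 1
mean = midpoint(lambda x: x * f(x), -1, 1)        # should be 0
second = midpoint(lambda x: x * x * f(x), -1, 1)  # should be 1/4

print(total, mean, second)
```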