My problem is exactly the same as the one asked here, up to a change of notation for the two order statistics. Reframing the question:
If $\left(X_1,X_2,…,X_n\right)$ are a random sample from Uniform(0,1) distribution, find the correlation between the order statistics $X_{i:n}$ and $X_{j:n}$, $i<j$.
My approach was the same as that of the author of the linked question:
I have used the formula for correlation coefficient as: $$\rho=\frac{E[X_{(i)}X_{(j)}]-E[X_{(i)}]E[X_{(j)}]}{\sqrt{Var(X_{(i)}) Var(X_{(j)})}}$$
And I arrived at the exact same dead-end: I'm unable to compute $E\left[X_{(i)}X_{(j)}\right]$.
What I have tried so far:
The joint pdf of $X_{i:n}$ and $X_{j:n}$ is given by $$f_{i,j:n}\left(x,y\right)=\frac{n!}{(i-1)!(j-i-1)!(n-j)!}x^{i-1}(y-x)^{j-i-1}(1-y)^{n-j}$$ where $0<x<y<1$.
Let $Z=X_{i:n}X_{j:n}$ and $W=X_{j:n}$. Thus, the joint pdf of $Z,W$ is given by $$\begin{align}f_{Z,W}\left(z,w\right)&=f_{i,j:n}\left(\frac{z}{w},w\right)\left|w^{-1}\right|\\ &=\frac{n!}{(i-1)!(j-i-1)!(n-j)!}\left(\frac{z}{w}\right)^{i-1}\left(w-\frac{z}{w}\right)^{j-i-1}\left(1-w\right)^{n-j}\frac{1}{w} \end{align}$$
Finally, we can get the pdf of $Z=XY$ by integrating out $w$. Since $0<x<y<1$ forces $z/w<w$, i.e. $w>\sqrt z$, the limits of integration are $\sqrt z$ to $1$: $$\begin{align}f_Z(z)&=\frac{n!}{(i-1)!(j-i-1)!(n-j)!}\int_{\sqrt z}^1\left(\frac{z}{w}\right)^{i-1}\left(w-\frac{z}{w}\right)^{j-i-1}\left(1-w\right)^{n-j}\frac{1}{w}\, dw\\ &=z^{i-1}\frac{n!}{(i-1)!(j-i-1)!(n-j)!}\int_{\sqrt z}^1\left(\frac{1}{w}\right)^{i}\left(w-\frac{z}{w}\right)^{j-i-1}\left(1-w\right)^{n-j}\, dw\end{align} $$
But I cannot proceed further from here, and I cannot use a computer for this. @wolfies provided a wonderful solution to the linked question and to this one as well, but those solutions rely on software. Any help/suggestion on how I can proceed from here would be much appreciated.
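For checking purposes only (the final answer must be by hand), a quick Monte Carlo simulation gives a numeric target for the product moment; the helper name `mc_product_moment` is just mine:

```python
import random

def mc_product_moment(i, j, n, reps=200_000, seed=1):
    # Monte Carlo estimate of E[X_{i:n} X_{j:n}] for a Uniform(0,1) sample of size n
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        xs = sorted(rng.random() for _ in range(n))
        total += xs[i - 1] * xs[j - 1]  # order statistics are 1-indexed
    return total / reps

print(mc_product_moment(2, 5, 10))  # roughly 12/132 ~ 0.0909 for i=2, j=5, n=10
```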
Look at the other solution, by @Did, for an approach that avoids software. Writing $X:=X_{i:n}$ and $Y:=X_{j:n}$ as you've done, the joint density of $(X,Y)$ is $$ f_{i,j:n}\left(x,y\right)= c(i,j,n)\,x^{i-1}(y-x)^{j-i-1}(1-y)^{n-j}\tag1 $$ using the shorthand $$c(i,j,n):=\frac{n!}{(i-1)!(j-i-1)!(n-j)!}.\tag2$$ The trick is to recall that the density $f_{i,j:n}$ integrates to $1$ for every $i,j,n$. You can compute $E\left[(Y-X)^2\right]$ using this fact, because $$\begin{aligned} E\left[(Y-X)^2\right]&=\iint (y-x)^2 f_{i,j:n}(x,y)\,dx\,dy\\ &=c(i,j,n)\iint (y-x)^2x^{i-1}(y-x)^{j-i-1}(1-y)^{n-j}\,dx\,dy\\ &=c(i,j,n)\iint x^{i-1}(y-x)^{(j+2)-i-1}(1-y)^{(n+2)-(j+2)}\,dx\,dy\\ &=\frac{c(i,j,n)}{c(i,j+2,n+2)} \end{aligned} $$ since the last integrand has the form (1). Assuming you know how to calculate the mean and variance of $X$ and $Y$, you can now compute the variance of $Y-X$ using $$\operatorname{Var}(Y-X)=E\left[(Y-X)^2\right]-[E(Y-X)]^2.$$ But the identity $$ \operatorname{Var}(Y-X)=\operatorname{Var}(X) +\operatorname{Var}(Y)-2\operatorname{Cov}(X,Y) $$ now allows you to solve for $\operatorname{Cov}(X,Y)$.
And if you don't know how to calculate the mean and variance of $X$ and $Y$, just use the same device: The density of $X:=X_{i:n}$ is $$ f_{i:n}(x)=a(i,n)\, x^{i-1}(1-x)^{n-i}\tag3 $$ using the shorthand $$a(i,n):=\frac{n!}{(n-i)!(i-1)!}.\tag4$$ Since the density (3) integrates to $1$, we see $$ \begin{aligned}E(X^p)&=a(i,n)\int x^p x^{i-1}(1-x)^{n-i}\,dx\\ &=a(i,n)\int x^{(i+p)-1}(1-x)^{(n+p)-(i+p)}\,dx=\frac{a(i,n)}{a(i+p,n+p)} \end{aligned} $$ noting that the final integrand has the form (3). Now substitute (4) and obtain, after some algebra, $$E(X)=\frac i{n+1},\qquad E(X^2)=\frac{(i+1)i}{(n+2)(n+1)}$$ and $$\operatorname{Var}(X)=E(X^2)-[E(X)]^2=\frac{i(n+1-i)}{(n+2)(n+1)^2}.$$
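The same factorial bookkeeping is easy to verify numerically; a minimal sketch (the helper names `a` and `moment` are mine):

```python
from math import factorial

def a(i, n):
    # Normalizing constant (4) of the density (3), i.e. of Beta(i, n-i+1)
    return factorial(n) // (factorial(n - i) * factorial(i - 1))

def moment(i, n, p):
    # E[X_{i:n}^p] = a(i,n) / a(i+p, n+p), from the "integrates to 1" trick
    return a(i, n) / a(i + p, n + p)

i, n = 3, 7
assert moment(i, n, 1) == i / (n + 1)
assert moment(i, n, 2) == (i + 1) * i / ((n + 2) * (n + 1))
var = moment(i, n, 2) - moment(i, n, 1) ** 2
assert abs(var - i * (n + 1 - i) / ((n + 2) * (n + 1) ** 2)) < 1e-12
```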