The problem is to find the average shortest distance from the hypotenuse of uniformly random points inside a right-angled triangle.
The distance $d$ is the shortest distance to the hypotenuse from a random point $N = (x_1, y_1)$. I want to find the average (mean) of this shortest distance over uniformly random points inside the triangle.
What I have in mind is to integrate the point-to-line distance formula over the triangle. Let $P = (0,0)$, $Q = (a,0)$, and $R = (a,b)$. Then the slope of the hypotenuse is $m = \frac{b-0}{a-0} = \frac{b}{a}$
The equation of the hypotenuse is $bx - ay = 0$.
The distance $d$ would be $$d = \frac{|b x_1 - a y_1|}{\sqrt{a^2 + b^2}}$$
I thought integrating it over $x$ and $y$ for the given ranges would give me the mean shortest distance:
$$D_{mean} = \int_{x=0}^{a} \int_{y=0}^{b} \frac{|b x - a y|}{\sqrt{a^2+b^2}} \, dy \, dx$$
I am stuck solving it... (assuming my approach is the correct one.)

Take $P$ to be the origin. Then the equation of the hypotenuse is
$ y = \dfrac{b}{a} x , \hspace{15pt} x \in [0, a] $
Now pick a point $(x_1, y_1)$ with $x_1 \in [0, a]$ and $y_1 \in \left[0, \dfrac{b}{a} x_1\right]$; then its perpendicular distance from the hypotenuse is
$d (x_1, y_1) = \dfrac{ \dfrac{b}{a} x_1 - y_1 } {\sqrt{1 + \left(\dfrac{b}{a}\right)^2} } = \dfrac{ b x_1 - a y_1 }{\sqrt{a^2 + b^2}}$
So the average distance is
$ \overline{d} = \displaystyle \dfrac{1}{A} \int_{x_1=0}^{a} \int_{y_1 = 0 }^{\dfrac{b}{a} x_1} \dfrac{ b x_1 - a y_1}{\sqrt{a^2 + b^2}} dy_1 dx_1 $
where $A = \displaystyle \int_{x_1=0}^{a} \int_{y_1 = 0 }^{\dfrac{b}{a} x_1} dy_1 dx_1 = \dfrac{1}{2} a b $
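As a sanity check (not part of the derivation itself), this double integral can be evaluated symbolically, for instance with SymPy; a minimal sketch:

```python
import sympy as sp

# Symbols for the two legs and the integration variables.
a, b, x1, y1 = sp.symbols('a b x1 y1', positive=True)

# Perpendicular distance from (x1, y1) to the hypotenuse b*x - a*y = 0
# (nonnegative for points inside the triangle, so no absolute value needed).
d = (b*x1 - a*y1) / sp.sqrt(a**2 + b**2)

# Area of the triangle, used to normalize the integral.
A = sp.Rational(1, 2) * a * b

# Average distance: integrate over the triangle, then divide by the area.
dbar = sp.integrate(sp.integrate(d, (y1, 0, b*x1/a)), (x1, 0, a)) / A
print(sp.simplify(dbar))
```

The printed result agrees with the closed form derived below by hand.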
Integrating with respect to $y_1$,
$\overline{d} = \dfrac{2}{ab \sqrt{a^2 + b^2}} \displaystyle \int_{x_1=0}^{a} \dfrac{b^2}{2a} x_1^2 dx_1 = \left(\dfrac{b}{a^2 \sqrt{a^2 + b^2}}\right) \left(\dfrac{a^3}{3}\right) =\dfrac{ab}{3 \sqrt{a^2 + b^2}} $
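The closed form can also be checked numerically. A minimal Monte Carlo sketch (NumPy; `mean_distance_mc` is a hypothetical helper name), sampling points uniformly in the triangle $P=(0,0)$, $Q=(a,0)$, $R=(a,b)$:

```python
import numpy as np

def mean_distance_mc(a, b, n=1_000_000, seed=0):
    """Monte Carlo estimate of the mean distance to the hypotenuse."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    v = rng.random(n)
    # Fold pairs with u + v > 1 back into the unit simplex so that
    # P + u*(Q - P) + v*(R - P) is uniform over the triangle.
    flip = u + v > 1
    u[flip], v[flip] = 1 - u[flip], 1 - v[flip]
    x = a * (u + v)   # Q - P = (a, 0), R - P = (a, b)
    y = b * v
    # Distance to the line b*x - a*y = 0; nonnegative inside the triangle.
    d = (b * x - a * y) / np.hypot(a, b)
    return d.mean()

a, b = 3.0, 4.0
exact = a * b / (3 * np.hypot(a, b))   # = 0.8 for a 3-4-5 triangle
print(mean_distance_mc(a, b), exact)
```

With a 3-4-5 triangle the estimate converges to $\frac{3 \cdot 4}{3 \cdot 5} = 0.8$, matching the formula $\overline{d} = \dfrac{ab}{3\sqrt{a^2+b^2}}$.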