Problem related to projection of points onto a line


Let $r_1,\dots,r_n$ be positive reals, and let $a_1=(x_1,y_1),\dots,a_n=(x_n,y_n)$ be points in the plane. Let $a = \frac{r_1a_1+\cdots+r_na_n}{r_1+\cdots+r_n}$, and let $l$ be any line passing through $a$. For $i\in[n]$, denote by $a'_i$ the orthogonal projection of $a_i$ onto $l$. Prove that $a = \frac{r_1a'_1+\cdots+r_na'_n}{r_1+\cdots+r_n}$.

What I have tried: let the equation of the line $l$ be $y = mx + c$. The point $a$ can be written coordinate-wise as $\left(\frac{r_1x_1+\cdots+r_nx_n}{r_1+\cdots+r_n},\ \frac{r_1y_1+\cdots+r_ny_n}{r_1+\cdots+r_n}\right)$. I tried substituting this into the equation of the line and solving, but the computation became very messy and led to no simplification. I also thought of using the projection concept together with the equation of the line, but I am unable to progress further. I suspect the problem is easy, yet I am not able to solve it.

Could anyone please help me to solve this?

Thanks in advance.


There are 2 solutions below.


Assume the equation of the line $l$ is $\textbf{r} = \textbf{a} + \lambda\textbf{d}. $
Since $\textbf{a}'_{i}$ is the projection of $\textbf{a}_i$ onto the line $l$, we have
\begin{align*}
\begin{cases}
(\textbf{a}_i -\textbf{a}'_{i})\cdot \textbf{d} = 0 &\text{(1)}\\
\textbf{a}'_{i} = \textbf{a} + \lambda_i\textbf{d} &\text{(2)}
\end{cases}
\end{align*}
Substituting (2) into (1), we get:
\begin{align*}
(\textbf{a}_i -\textbf{a} - \lambda_i\textbf{d})\cdot\textbf{d} &= 0\\
\implies \lambda_i |\textbf{d}|^2 &= (\textbf{a}_i -\textbf{a})\cdot\textbf{d}\\
\implies \lambda_i &= \frac{(\textbf{a}_i -\textbf{a})\cdot\textbf{d}}{|\textbf{d}|^2} = \frac{(\textbf{a}_i -\textbf{a})\cdot\hat{\textbf{d}}}{|\textbf{d}|},
\end{align*}
where $\hat{\textbf{d}}$ is the unit vector in the direction of $\textbf{d}$. With this expression for $\lambda_i$, (2) gives:
\begin{align*}
\textbf{a}'_{i} &= \textbf{a} + \frac{(\textbf{a}_i -\textbf{a})\cdot\hat{\textbf{d}}}{|\textbf{d}|}\, \textbf{d} = \textbf{a} + \big[(\textbf{a}_i -\textbf{a})\cdot\hat{\textbf{d}}\big]\, \hat{\textbf{d}}\\
\implies r_i\textbf{a}'_{i} &= r_i\textbf{a} + \big[(r_i\textbf{a}_i - r_i\textbf{a})\cdot\hat{\textbf{d}}\big]\, \hat{\textbf{d}}\\
\implies \sum^{n}_{i=1}r_i\textbf{a}'_{i} &= \textbf{a}\sum^{n}_{i=1}r_i + \bigg[\Big(\sum^{n}_{i=1}r_i\textbf{a}_i - \sum^{n}_{i=1}r_i\textbf{a}\Big)\cdot\hat{\textbf{d}}\bigg]\, \hat{\textbf{d}}.
\end{align*}
Since $\textbf{a} = \frac{\sum^{n}_{i=1}r_i\textbf{a}_i}{\sum^{n}_{i=1}r_i}$, we have $\sum^{n}_{i=1}r_i\textbf{a} = \sum^{n}_{i=1}r_i\textbf{a}_i$, so the bracketed term is $(\textbf{0})\cdot\hat{\textbf{d}} = 0$, and therefore
\begin{align*}
\sum^{n}_{i=1}r_i\textbf{a}'_{i} = \sum^{n}_{i=1}r_i\textbf{a}_i
\implies \frac{\sum^{n}_{i=1}r_i\textbf{a}'_{i}}{\sum^{n}_{i=1}r_i} = \frac{\sum^{n}_{i=1}r_i\textbf{a}_{i}}{\sum^{n}_{i=1}r_i} = \textbf{a}.
\end{align*}
$\therefore$ $\textbf{a} = \dfrac{r_1\textbf{a}'_1 + r_2\textbf{a}'_2 + \cdots + r_n\textbf{a}'_n}{r_1 + r_2 + \cdots + r_n}.$
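The derivation can also be checked numerically. Here is a minimal sketch with NumPy (the weights, points, and direction vector are arbitrary choices, not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
r = rng.uniform(0.1, 2.0, size=n)             # positive weights r_i
pts = rng.normal(size=(n, 2))                 # points a_i in R^2
a = r @ pts / r.sum()                         # weighted centroid a

d = np.array([3.0, -1.0])                     # any direction vector d for the line through a
lam = (pts - a) @ d / (d @ d)                 # lambda_i = (a_i - a)·d / |d|^2
proj = a + lam[:, None] * d                   # a'_i = a + lambda_i d

centroid_proj = r @ proj / r.sum()            # weighted centroid of the projections
print(np.allclose(centroid_proj, a))          # True
```

The check confirms both steps of the argument: each residual $a_i - a'_i$ is perpendicular to $d$, and the weighted centroid of the projections coincides with $a$.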


Given $r_1,r_2,\dots, r_n\in\mathbb R^{>0}$, the quotients $\frac{r_i}{r_1+\dots +r_n}$ for $i\in\{1,\dots ,n\}= [n]$ are weights summing to $1$.
Thus $\textbf{a} = \frac{\sum_{i=1}^n r_ia_i}{\sum_{i=1}^n r_i}$ is a convex combination of the $a_i$, i.e., $\textbf{a}$ lies in the convex hull of the $a_i$.

Now perform a (global) translation of all the involved points by $-\textbf{a}$, so that the line $\ell$ passes through the origin; it is then a $1$-dimensional subspace and may be described by a unit vector $u$. The projection onto $\ell$, which is assumed to be orthogonal, is given by $$x\longmapsto x'=(u\cdot x)\, u\,.$$ This follows from taking the scalar product of $\,x=\underbrace{\:x'}_{\;=\,\lambda u}+ \underbrace{x-x'}_{\;\perp\, u}$ with $\,u$: using the perpendicularity, one obtains $\lambda = u\cdot x$.
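The map $x\mapsto (u\cdot x)\,u$ is easy to sanity-check; a minimal NumPy sketch (the function name and the sample vectors are illustrative assumptions):

```python
import numpy as np

def project_onto_line(x, u):
    """Orthogonal projection of x onto the 1-D subspace spanned by the unit vector u."""
    return (u @ x) * u

u = np.array([3.0, 4.0]) / 5.0          # a unit vector
x = np.array([2.0, 7.0])
xp = project_onto_line(x, u)

print(np.isclose((x - xp) @ u, 0.0))    # True: the residual x - x' is perpendicular to u
```

As expected, the residual $x - x'$ is perpendicular to $u$, and applying the projection twice returns the same point.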

Back in the original coordinates, the projections satisfy $$a_i'-\textbf{a} \;=\; \big(u\cdot (a_i-\textbf{a})\big)\, u \quad\implies\; a_i' \;=\; \big(u\cdot (a_i-\textbf{a})\big)\, u \;+\;\textbf{a}\,.$$ Finally, $$\frac{\sum_{i=1}^n r_ia_i'}{\sum_{i=1}^n r_i} \;=\; \bigg(u\cdot \underbrace{\frac{\sum_{i=1}^n r_i\,(a_i-\textbf{a})}{\sum_{i=1}^n r_i}}_{\;=\,0}\bigg)\, u \;+\;\frac{\sum_{i=1}^n r_i\,\textbf{a}}{\sum_{i=1}^n r_i} \;=\;\textbf{a}\,.$$ Remark: $\,$the result does not depend on the given finite cloud of points $\{a_i\}$ being a subset of $\mathbb R^2$; it holds in any inner-product space, including infinite-dimensional ones, without any change to the proof.
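The remark about higher dimensions can be illustrated numerically as well; the same identity holds verbatim in, say, $\mathbb R^5$. A minimal sketch with NumPy (the dimension, weights, and points are arbitrary test data):

```python
import numpy as np

rng = np.random.default_rng(1)
n, dim = 8, 5
r = rng.uniform(0.5, 3.0, size=n)        # positive weights r_i
pts = rng.normal(size=(n, dim))          # points a_i in R^5
a = r @ pts / r.sum()                    # weighted centroid a

u = rng.normal(size=dim)
u /= np.linalg.norm(u)                   # unit vector spanning the translated line

proj = a + ((pts - a) @ u)[:, None] * u  # a'_i = (u·(a_i - a)) u + a
print(np.allclose(r @ proj / r.sum(), a))  # True
```

Nothing in the computation refers to the dimension $2$, mirroring the dimension-free character of the proof.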