Let $r_1, \dots, r_n$ be positive reals, and let $a_1 = (x_1, y_1), \dots, a_n = (x_n, y_n)$ be points. Let $a = \frac{r_1a_1+\cdots+r_na_n}{r_1+\cdots+r_n}$, and let $l$ be any line passing through $a$. For $i \in [n]$, denote by $a'_i$ the projection of $a_i$ onto the line $l$. Prove that $a = \frac{r_1a'_1+\cdots+r_na'_n}{r_1+\cdots+r_n}$.
What I have tried: let the equation of line $l$ be $y = mx + c$. The point $a$ can be written in coordinates as $\left(\frac{r_1x_1+\cdots+r_nx_n}{r_1+\cdots+r_n}, \frac{r_1y_1+\cdots+r_ny_n}{r_1+\cdots+r_n}\right)$. I was thinking of substituting this into the equation of the line and solving, but it did not lead to any simplification; the result became very messy. I also thought of using the projection concept together with the equation of the line, but I am unable to progress further. I think it should be easy, but I am not able to solve it.
Could anyone please help me to solve this?
Thanks in advance.
Assume the line $l$ has equation $\textbf{r} = \textbf{a} + \lambda\textbf{d}$, where $\textbf{d}$ is a direction vector of $l$ (note $l$ passes through $\textbf{a}$ by hypothesis).
Since $\textbf{a}^{'}_{i}$ is the projection of $\textbf{a}_i$ onto the line $l$, we have: \begin{align*} \begin{cases} (\textbf{a}_i -\textbf{a}^{'}_{i})\cdot \textbf{d} = 0\qquad\text{(1)}\\ \textbf{a}^{'}_{i} = \textbf{a} + \lambda_i\textbf{d}\qquad\qquad\text{(2)} \end{cases} \end{align*} Substituting (2) into (1), we get: \begin{align*} &(\textbf{a}_i -\textbf{a} - \lambda_i\textbf{d})\cdot\textbf{d} = 0\\ \implies& \lambda_i |\textbf{d}|^2=(\textbf{a}_i -\textbf{a})\cdot\textbf{d}\\ \implies& \lambda_i = \frac{(\textbf{a}_i -\textbf{a})\cdot\textbf{d}}{|\textbf{d}|^2} =\frac{(\textbf{a}_i -\textbf{a})\cdot\hat{\textbf{d}}}{|\textbf{d}|}, \qquad\text{where $\hat{\textbf{d}} = \textbf{d}/|\textbf{d}|$ is the unit vector in the direction of $\textbf{d}$}. \end{align*} With this expression for $\lambda_i$, (2) gives: \begin{align*} \textbf{a}^{'}_{i} &=\textbf{a} + \frac{(\textbf{a}_i -\textbf{a})\cdot\hat{\textbf{d}}}{|\textbf{d}|}\, \textbf{d} = \textbf{a} + \big[(\textbf{a}_i -\textbf{a})\cdot\hat{\textbf{d}}\big] \hat{\textbf{d}}. \end{align*} Multiplying by $r_i$ and summing over $i$: \begin{align*} \sum^{n}_{i=1}r_i\textbf{a}^{'}_{i} &= \sum^{n}_{i=1}r_i\textbf{a} + \sum^{n}_{i=1}\big[(r_i\textbf{a}_i -r_i\textbf{a})\cdot\hat{\textbf{d}}\big] \hat{\textbf{d}}\\ &= \textbf{a}\sum^{n}_{i=1}r_i +\bigg[\Big(\sum^{n}_{i=1}r_i\textbf{a}_i -\textbf{a}\sum^{n}_{i=1}r_i\Big)\cdot\hat{\textbf{d}}\bigg] \hat{\textbf{d}}\\ &= \sum^{n}_{i=1}r_i\textbf{a}_i +\bigg[\Big(\sum^{n}_{i=1}r_i\textbf{a}_i -\sum^{n}_{i=1}r_i\textbf{a}_i\Big)\cdot\hat{\textbf{d}}\bigg]\hat{\textbf{d}} \qquad\text{since $\textbf{a}\sum^{n}_{i=1}r_i = \sum^{n}_{i=1}r_i\textbf{a}_i$}\\ &= \sum^{n}_{i=1}r_i\textbf{a}_i+\big[\textbf{0}\cdot\hat{\textbf{d}}\big]\hat{\textbf{d}}\\ &= \sum^{n}_{i=1}r_i\textbf{a}_i. \end{align*} Dividing by $\sum^{n}_{i=1}r_i$: \begin{align*} \frac{\sum^{n}_{i=1}r_i\textbf{a}^{'}_{i}}{\sum^{n}_{i=1}r_i} = \frac{\sum^{n}_{i=1}r_i\textbf{a}_{i}}{\sum^{n}_{i=1}r_i} = \textbf{a}. \end{align*} $\therefore$ $\textbf{a} = \dfrac{r_1\textbf{a}^{'}_1+r_2\textbf{a}^{'}_2+\cdots+ r_n\textbf{a}^{'}_n}{r_1 + r_2 + \cdots + r_n}.$
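If you want a quick numerical sanity check of the identity before (or after) working through the algebra, here is a small Python sketch. It draws random weights, points, and a random direction for the line through $\textbf{a}$, then compares $\textbf{a}$ with the weighted average of the projections. (All names here — `project`, `pts`, etc. — are just for this illustration, not from the problem statement.)

```python
import random

def project(p, a, d):
    """Orthogonal projection of point p onto the line through a with direction d."""
    t = ((p[0] - a[0]) * d[0] + (p[1] - a[1]) * d[1]) / (d[0]**2 + d[1]**2)
    return (a[0] + t * d[0], a[1] + t * d[1])

random.seed(0)
n = 5
r = [random.uniform(0.1, 2.0) for _ in range(n)]            # positive weights r_i
pts = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(n)]  # points a_i
R = sum(r)

# a = (sum r_i a_i) / (sum r_i), computed coordinate-wise
a = (sum(ri * x for ri, (x, y) in zip(r, pts)) / R,
     sum(ri * y for ri, (x, y) in zip(r, pts)) / R)

# Any line through a: pick a random direction d
d = (random.uniform(-1, 1), random.uniform(-1, 1))

# Project every point onto the line, then take the same weighted average
proj = [project(p, a, d) for p in pts]
wavg = (sum(ri * px for ri, (px, py) in zip(r, proj)) / R,
        sum(ri * py for ri, (px, py) in zip(r, proj)) / R)

# The two should agree up to floating-point error
print(max(abs(a[0] - wavg[0]), abs(a[1] - wavg[1])))
```

The printed difference is on the order of machine precision, consistent with the identity holding for any line through $\textbf{a}$.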