I'm taking a course on teaching, and we were given some geometry questions. Among them was one I couldn't solve.
I'm trying to prove that the angle bisectors of a triangle intersect at a single point, but I need to show it analytically, i.e., using line equations and distances between points. I'm familiar with proofs using synthetic geometry and vectors, but I couldn't manage to do it analytically. I'm not allowed to use the formula for the angle between two lines or for the distance from a point to a line.
Things I've tried:
- The angle bisector theorem.
- Writing the equations of the sides as $Ax+By+C=0$ and then deriving the equation of the angle bisector; this gets quite messy, and I believe there is a much easier way.
Thanks!
Without loss of generality, we can set $A=(0,0)$, $B=(1,0)$ and $C=(a,b)$, with $b>0$. The half-angle formula $\tan(\theta/2)=\sqrt{(1-\cos\theta)/(1+\cos\theta)}$ then gives the slopes $m$ and $m'$ of the angle bisectors through $A$ and $B$: $$ m=\sqrt{{1-a/\sqrt{a^2+b^2}\over1+a/\sqrt{a^2+b^2}}}, \quad m'=-\sqrt{{1-(1-a)/\sqrt{(1-a)^2+b^2}\over1+(1-a)/\sqrt{(1-a)^2+b^2}}}. $$ You can now find the equations of the two lines and their intersection $G$. It remains to show that $CG$ is the bisector of angle $C$. I think the simplest way to do that is to find the intersection $H$ between line $CG$ and the $x$ axis, and check that $AH/BH=AC/BC$: by the converse of the angle bisector theorem, this equality shows that $CH$ (and hence $CG$) bisects angle $C$.
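Before grinding through the algebra by hand, it can be reassuring to check the plan numerically. The sketch below (my own helper, not part of the answer) follows the steps above: it computes the slopes $m$ and $m'$ from the half-angle formula, intersects the two bisectors to get $G$, finds the $x$-intercept $H$ of line $CG$, and compares $AH/BH$ with $AC/BC$.

```python
import math

def bisector_ratios(a, b):
    """For triangle A=(0,0), B=(1,0), C=(a,b) with b>0, return
    (AH/BH, AC/BC) where H is the x-intercept of line CG and G is
    the intersection of the bisectors through A and B."""
    # cos of the angles at A and B (angle between a side and the x axis)
    cosA = a / math.hypot(a, b)
    cosB = (1 - a) / math.hypot(1 - a, b)
    # half-angle formula: tan(theta/2) = sqrt((1 - cos)/(1 + cos))
    m  =  math.sqrt((1 - cosA) / (1 + cosA))   # bisector through A
    mp = -math.sqrt((1 - cosB) / (1 + cosB))   # bisector through B
    # intersect y = m x with y = mp (x - 1)
    gx = mp / (mp - m)
    gy = m * gx
    # x-intercept H of the line through C=(a,b) and G=(gx,gy)
    t = b / (b - gy)
    hx = a + t * (gx - a)
    AH, BH = abs(hx), abs(1 - hx)
    AC, BC = math.hypot(a, b), math.hypot(1 - a, b)
    return AH / BH, AC / BC
```

For example, `bisector_ratios(0.3, 0.7)` returns two (numerically) equal ratios, consistent with $CG$ bisecting angle $C$; the same holds for an obtuse configuration such as `bisector_ratios(-0.4, 0.5)`. Of course this only spot-checks the claim; the actual proof requires carrying the symbols through.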