Drawing a triangle with 2 known corners and all side lengths


Assume that there are three points $A$, $B$ and $C$.

All the pairwise distances are known $(|AB|, |AC|, |BC|)$. But none of the coordinates are known. I want to draw a triangle using those points.

Since no coordinates are known, I pick $(0,0)$ for $A$ and $(0, |AB|)$ for $B$.

Now, how can I calculate the coordinates of $C$ using these corners?

I tried:
$C_x = \dfrac{{-|AB|^{2} + {|BC|}^{2} - {|AC|}^{2}}}{2|AB|}$
$C_y = \sqrt{|AC|^2 - {C_x}^2}$

But I think that does not work.

I am using a trilateration algorithm to find the relative positions of the points. When I pick 3 points with their actual coordinates, the algorithm works perfectly.

[Image: Trilateration with actual seed coordinates]

However, when I pick three points and estimate their coordinates using the formula above, what I get is something like this:

[Image: Trilateration with estimated seed coordinates]

I think there is something wrong with the initial coordinates I'm picking.


There are 2 best solutions below

BEST ANSWER

I would do as follows.

Pick the longest side, $c$. Draw $AB$ of length $c$ along the $x$-axis, starting from the origin. Let $C_x = x$. From the two right-angled triangles formed by the height of the triangle we have $$b^2 - x^2 = a^2 - (c-x)^2 = a^2 - c^2 + 2cx - x^2$$ $$2cx = b^2 + c^2 - a^2$$

Then the coordinates of point $C$ are $$C_x = \frac {b^2 + c^2 - a^2}{2c}$$ $$C_y=\sqrt {b^2 - C_x^2}$$

I haven't checked it thoroughly, but it seems I got it right. Oh, and of course, if this is done with arbitrary given numbers $a, b, c$, then you should first check that the triangle inequality holds, or it will just be embarrassing for you when the formulas don't work.
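A minimal Python sketch of this construction, with the triangle-inequality check up front (the function name `triangle_coords` is mine; here $a = |BC|$, $b = |AC|$, $c = |AB|$):

```python
import math

def triangle_coords(a, b, c):
    """Place A = (0, 0) and B = (c, 0) along the x-axis, where c = |AB|
    is the longest side, a = |BC|, b = |AC|. Return coordinates of A, B, C."""
    # Reject side lengths that cannot form a (non-degenerate) triangle.
    if a + b <= c or a + c <= b or b + c <= a:
        raise ValueError("side lengths violate the triangle inequality")
    # C_x = (b^2 + c^2 - a^2) / (2c), C_y = sqrt(b^2 - C_x^2)
    cx = (b**2 + c**2 - a**2) / (2 * c)
    cy = math.sqrt(b**2 - cx**2)
    return (0.0, 0.0), (c, 0.0), (cx, cy)
```

For the 3-4-5 right triangle, `triangle_coords(3, 4, 5)` places $C$ at $(3.2, 2.4)$, and the distances from $C$ back to $A$ and $B$ come out to $4$ and $3$ as required.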


The point $C$ is a point of intersection of the two circles $$x^2+y^2=|AC|^2$$ and $$x^2+(y-|AB|)^2=|BC|^2,$$ which intersect in two points symmetric about the $y$-axis.
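This circle intersection can be solved in closed form: subtracting the two equations eliminates the quadratic terms and gives $C_y$ directly, after which $C_x = \pm\sqrt{|AC|^2 - C_y^2}$. A short Python sketch under this answer's convention ($A=(0,0)$, $B=(0,|AB|)$; the function name `intersect_circles` is mine):

```python
import math

def intersect_circles(d, b, a):
    """Intersect x^2 + y^2 = b^2 (circle about A = (0,0), radius |AC| = b)
    with x^2 + (y - d)^2 = a^2 (circle about B = (0, d), radius |BC| = a).
    Returns the two symmetric candidate positions for C."""
    # Subtracting the circle equations: 2*d*y - d^2 = b^2 - a^2
    y = (b**2 - a**2 + d**2) / (2 * d)
    # Assumes the circles actually intersect (triangle inequality holds).
    x = math.sqrt(b**2 - y**2)
    return (x, y), (-x, y)
```

Either of the two returned points is a valid $C$; they are mirror images across the $y$-axis, which is why the drawn triangle is determined only up to reflection.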