I need to find the launch angle [α] of a projectile, knowing only the initial velocity and the coordinates of the target [$x_*$, $y_*$] that the projectile needs to hit. The coordinates of the start point are also given.
I could have found the angle if I knew the projectile's maximum height [$h_{max}$], or at least its flight range; then [$y_0$] and [$y_*$] would both be zero, which makes the equation a lot easier.
But with this additional parameter of a target point that the projectile needs to hit, it becomes too complicated for me. Could you advise how I can find the angle? Here are all the values I have:
$V_0=3000$ m/s
$x_0=-20$ m
$y_0=0$
$x_*=15000$ m
$y_*=50$ m
α-?

First shift to coordinates relative to the launch point: $\ x=x_*-x_0=15020\ $ m and $\ y=y_*-y_0=50\ $ m. A projectile launched at speed $\ V_0\ $ and angle $\ \alpha\ $ follows the trajectory $$ y=x\tan\alpha-\frac{gx^2}{2V_0^2\cos^2\alpha}. $$ Using $\ 1/\cos^2\alpha=1+\tan^2\alpha\ $, this becomes a quadratic in $\ \tan\alpha\ $, whose solutions are $$ \tan\alpha=\frac{V_0^2\pm\sqrt{V_0^4-g\left(gx^2+2yV_0^2\right)}}{gx}. $$ The minus sign gives the flat trajectory and the plus sign the lofted one; if the expression under the square root is negative, the target is out of reach at that launch speed. Note that the angle does depend on the target's elevation $\ y\ $, not just on its distance, but since $\ y=50\ $ m is tiny compared with $\ x\approx15\ $ km here, you would get almost the same angles by setting $\ y=0\ $, i.e. by solving the flat-ground range problem you already know how to handle.
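As a quick numerical check, here is a minimal Python sketch of solving the quadratic $\ y=x\tan\alpha-gx^2/(2V_0^2\cos^2\alpha)\ $ for $\ \alpha\ $ with your numbers (the function name and structure are my own, not from any standard library):

```python
import math

def launch_angles(v0, x0, y0, xt, yt, g=9.81):
    """Return the two launch angles (radians) that hit (xt, yt) from (x0, y0).

    Solves y = x*tan(a) - g*x^2 / (2*v0^2*cos^2(a)) as a quadratic in tan(a).
    Returns an empty list if the target is out of reach.
    """
    x = xt - x0          # horizontal distance to target
    y = yt - y0          # elevation of target above launch point
    disc = v0**4 - g * (g * x**2 + 2 * y * v0**2)
    if disc < 0:
        return []        # discriminant < 0: target unreachable at speed v0
    root = math.sqrt(disc)
    return [math.atan((v0**2 - root) / (g * x)),   # flat (direct) trajectory
            math.atan((v0**2 + root) / (g * x))]   # lofted trajectory

low, high = launch_angles(3000, -20, 0, 15000, 50)
print(math.degrees(low), math.degrees(high))  # roughly 0.66 and 89.53 degrees
```

Because $V_0=3000$ m/s is so large compared with what the 15 km range requires, the flat solution is under a degree and the lofted one is nearly vertical.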