I am currently creating a game where archers will have to shoot each other in a 3D space.
I have the correct X and Z directions to shoot the arrow toward the enemy archer, but how do I know how high to shoot it so that it lands in the correct place?
Every update the arrow moves by its velocity, and its Y velocity decreases by 0.1 each update.
I could simulate the arrow's entire flight path and see where it lands, but that is expensive, and I was wondering if there is a mathematical solution.
Certainly, as long as you are ignoring air resistance and arrow-leveling effects from the feathers.
The initial upward velocity is $v_y = s \sin \theta$, where $s$ is the initial speed of the arrow and $\theta$ is the initial angle of the shot (with respect to the ground). This velocity decreases by the gravitational acceleration $g$: 9.8 meters per second less each second.
The initial horizontal velocity is $v_x = s \cos \theta$. This stays constant throughout the flight.
To hit the target, you want the arrow to arrive at the correct horizontal distance at just the moment it comes back down to the height of the target.
If the target is at the same height as the arrow's launch point, at some horizontal distance $d$, this works out in the end to $$\sin 2\theta = \frac{gd}{s^2}.$$
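As a sketch of that level-ground formula (the function name is mine, not anything standard): solve for $\theta$ with an arcsine, remembering that in your discrete setup, where the Y velocity drops by 0.1 each update, the effective $g$ is roughly 0.1 in units per update squared, up to discretization error.

```python
import math

def launch_angle_flat(speed, distance, gravity=9.8):
    """Launch angle (radians) to hit a target at the same height,
    from sin(2*theta) = g*d / s**2.

    Returns the low, flat shot; (pi/2 - theta) is the high lob.
    Returns None when the target is beyond the maximum range s**2/g.
    """
    x = gravity * distance / speed**2
    if x > 1.0:
        return None  # sin(2*theta) can't exceed 1: out of range
    return 0.5 * math.asin(x)
```

For example, `launch_angle_flat(30.0, 50.0)` gives roughly 0.29 radians (about 16.5 degrees); asking for a 100-unit shot at speed 10 returns `None`, since the maximum range there is only about 10.2 units.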
Even if the heights differ, say by $h$, the trajectory equation $h = d \tan \theta - \frac{gd^2}{2s^2}\left(1 + \tan^2 \theta\right)$ (using $1/\cos^2 \theta = 1 + \tan^2 \theta$) is just a quadratic equation in $\tan \theta$, and you can solve it with the quadratic formula.
Notice that in both cases there are generally two solutions (unless you are shooting the arrow exactly as far as it can go at this initial speed $s$). One has you shooting high and falling on the target; the other, shooting lower and hitting the target in the face.
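The general case with a height difference can be sketched the same way (again, the function name is my own): writing $a = \frac{gd^2}{2s^2}$, the quadratic $a\tan^2\theta - d\tan\theta + (h + a) = 0$ gives both solutions, with the discriminant telling you whether the target is reachable at all.

```python
import math

def launch_angles(speed, distance, height_diff, gravity=9.8):
    """Both launch angles (radians) that hit a target `distance` away
    horizontally and `height_diff` above the shooter.

    Solves the quadratic in tan(theta):
        a*tan^2(theta) - d*tan(theta) + (h + a) = 0,  a = g*d^2 / (2*s^2)

    Returns (low, high) -- the flat shot and the lob -- or None when the
    discriminant is negative, i.e. the target is out of range at speed s.
    """
    a = gravity * distance**2 / (2.0 * speed**2)
    disc = distance**2 - 4.0 * a * (height_diff + a)
    if disc < 0.0:
        return None  # unreachable at this initial speed
    root = math.sqrt(disc)
    low = math.atan((distance - root) / (2.0 * a))
    high = math.atan((distance + root) / (2.0 * a))
    return low, high
```

With `height_diff = 0` the low root matches the level-ground formula, and when the discriminant is exactly zero the two roots coincide: that is the maximum-range shot mentioned above.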