I don't have a formal education in theoretical mathematics, so please bear with me if this is an obvious question; on the other hand, I don't believe I am the first one to think of this problem. Let $p, q$ be natural numbers with $q \neq 0$ (think of them as the numerator and denominator of a fraction), and consider the following procedure:
from math import gcd

def f(p, q):
    if gcd(p, q) > 1:
        return f(p // gcd(p, q), q // gcd(p, q))
    if 3 * p > q:    # p/q > 1/3: step towards 1/4
        return f(p + 1, q + 4)
    if 3 * p < q:    # p/q < 1/3: step towards 1
        return f(p + 1, q + 1)
    return "WIN"     # p/q == 1/3
Intuitively the procedure should end after a finite number of steps: when the fraction is larger than $1/3$ we step towards $1/4$, and when it is smaller we step towards $1$. Running a simple script showed that for all pairs of numbers up to five decimal digits in size the procedure does terminate. So my question is: can anyone prove that this procedure finishes after a finite number of steps, and does not loop infinitely, for any $p$ and $q$ as described?
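Not the asker's actual script, but a minimal sketch of such a brute-force check (the function name `wins` and the step cap are my own). It runs the procedure iteratively, using the integer comparison $3p \lessgtr q$ instead of floating-point division, and checks every small pair:

```python
from math import gcd

def wins(p, q, max_steps=10_000):
    """Run the procedure; True if it reaches p/q == 1/3 within max_steps."""
    for _ in range(max_steps):
        g = gcd(p, q)
        p, q = p // g, q // g           # reduce the fraction
        if 3 * p > q:                    # p/q > 1/3
            p, q = p + 1, q + 4
        elif 3 * p < q:                  # p/q < 1/3
            p, q = p + 1, q + 1
        else:
            return True                  # p/q == 1/3: WIN
    return False                         # gave up: possible non-termination

# Exhaustive check over small numerators and denominators.
assert all(wins(p, q) for p in range(1, 200) for q in range(1, 200))
print("all pairs terminate")
```

Comparing $3p$ with $q$ keeps everything in exact integer arithmetic; with floats, `p/q == 1/3` would essentially never hold.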
Let's walk backwards from the final result: the procedure wins exactly when $3x - y = 0$, writing $x$ for the numerator and $y$ for the denominator (both integers).
Starting with any $x$ and $y$, the logic is as follows: when $3x - y < 0$, we add one to both $x$ and $y$, so the expression becomes $3(x+1) - (y+1) = 3x - y + 2$. Increasing by $2$ each step, we either hit $0$ exactly or overshoot it by $1$.
When $3x - y > 0$, we add one to $x$ and four to $y$, so the expression becomes $3(x+1) - (y+4) = 3x - y - 1$. Now we cannot miss zero: the expression decreases by exactly one each step until it reaches zero. In particular, even if the previous case overshot to $1$, this case brings it back down to $0$.
The GCD step only speeds up the process by reducing $p$ and $q$; since it leaves the ratio $p/q$ (and hence the sign of $3p - q$) unchanged, you do not really need it to get the win.