I am having some trouble figuring out the time complexity in big theta notation of the following algorithms. Any help is appreciated.
sum = 0 ;
for ( i = 0 ; i < n ; i++ )
for ( j = 1 ; j < n^2 ; j = 4 * j )
sum ++ ;
and
sum = 0 ;
for ( i = n ; i > 0 ; i = i/4 )
for ( j = 0 ; j < n^2 ; j++ )
sum++;
The first program has two loops: an outer loop that iterates $n$ times and an inner loop that iterates approximately $\log_4(n^2)=2\log_4(n)$ times. Thus there are approximately $2n\log_4(n)$ constant-time instructions (with the ratio tending to $1$ as $n$ tends to infinity), hence the program runs in $\Theta(n\log n)$ time.
For the second program, the outer loop runs approximately $\log_4 n$ times, and the inner loop iterates $n^2$ times. Thus there are approximately $n^2\log_4 n$ constant time instructions (with the ratio tending to $1$ as $n$ tends to infinity), hence the program runs in $\Theta(n^2\log n)$ time.