I am very unfamiliar with Big-$\mathcal{O}$ run time calculation.
I know that for addition the run time is $\mathcal{O}(\log n)$ and for multiplication the run time is $\mathcal{O}(\log^2 n)$ (schoolbook, on numbers of size $n$). How would I generalize this to calculate the run time of algorithms such as the CRT and the extended Euclidean algorithm?
The extended Euclidean algorithm works by applying the Euclidean algorithm once, storing the intermediate results, and then backtracking, i.e. you use twice as many steps as the Euclidean algorithm (and the cost of each additional step is a multiplication and a subtraction).
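To make this concrete, here is a sketch in Python. It uses the standard iterative formulation, which folds the backtracking step into the forward pass by updating the Bézout coefficients as it goes; each iteration costs one division plus a multiplication and a subtraction, matching the per-step cost described above.

```python
def extended_gcd(a, b):
    """Extended Euclidean algorithm.

    Returns (g, x, y) with g = gcd(a, b) and a*x + b*y == g.
    Each loop iteration is one Euclidean step (a division) plus
    a multiplication and a subtraction to update the coefficients.
    """
    x0, x1 = 1, 0  # Bezout coefficients of a
    y0, y1 = 0, 1  # Bezout coefficients of b
    while b != 0:
        q, r = divmod(a, b)        # one Euclidean division step
        a, b = b, r
        x0, x1 = x1, x0 - q * x1   # coefficient update (the "backtracking" work)
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0
```

Since the plain Euclidean algorithm takes $O(\log n)$ division steps, this version does $O(\log n)$ divisions, multiplications, and subtractions in total.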
Assume that you have a system of $k$ modular congruences. Then CRT requires computing products of $k-1$ numbers $k$ times, i.e. $O(k^2)$ multiplications, applying the extended Euclidean algorithm $k$ times, and taking $k$ final products.
Once multiplied, the numbers are at most $O(k\log n)$ bits long, hence this gives you a runtime of $O(k^2 \, M(k\log n) + k \, EE(k\log n))$, where $M$ is the cost of a multiplication, $EE$ is the cost of the extended Euclidean algorithm, and $n$ is the maximum value occurring in those congruences.
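A direct sketch of that CRT scheme, assuming pairwise-coprime moduli (names like `crt` and `residues` are my own choices, not from any particular library):

```python
def extended_gcd(a, b):
    """Returns (g, x, y) with g = gcd(a, b) and a*x + b*y == g."""
    x0, x1, y0, y1 = 1, 0, 0, 1
    while b != 0:
        q, r = divmod(a, b)
        a, b = b, r
        x0, x1 = x1, x0 - q * x1
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0


def crt(residues, moduli):
    """Chinese remainder theorem for pairwise-coprime moduli.

    For each modulus m_i: form M_i, the product of the other k-1
    moduli (O(k^2) multiplications over the whole loop), invert M_i
    mod m_i with the extended Euclidean algorithm (k invocations),
    and accumulate r_i * M_i * inv_i (k final products).
    """
    M = 1
    for m in moduli:
        M *= m
    total = 0
    for r, m in zip(residues, moduli):
        Mi = M // m                       # product of the other moduli
        _, inv, _ = extended_gcd(Mi, m)   # inv * Mi = 1 (mod m)
        total += r * Mi * inv
    return total % M
```

The intermediate products $M_i$ are what grow to $O(k\log n)$ bits, which is why the multiplication and extended-Euclidean costs in the bound are evaluated at that size.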