I am going over the asymptotic runtime of regular matrix multiplication.
Here is a lecture slide I am referencing (too much to type out, shown below), from Algorithms.
Everything makes sense up to the point where the author states that "each addition takes $O(n^2)$ time". Can anyone explain that?
I have a counterexample here. Say I start with a $4 \times 4$ matrix, so each block is $2 \times 2$. The product of blocks $A$ and $E$ would be something like $$ \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ \end{bmatrix} $$ and the product of $B$ and $G$ would be something like $$ \begin{bmatrix} 5 & 6 \\ 7 & 8 \\ \end{bmatrix} $$ The author argues that one addition runs in $O(n^2)$, but I argue that one addition runs in $O(n)$. Here, to add $AE$ and $BG$, you perform the additions $1+5$, $2+6$, $3+7$, and $4+8$, i.e. 4 additions. Since 4 was the original $n$, the runtime of one addition would be $O(n)$.
Do you agree with my counterexample/argument, or did I miss something and the runtime of one addition really is $O(n^2)$?
Your matrices are $2 \times 2$, which have $2^2 = 4$ entries each. In your example $n = 2$ and $n^2 = 4$, so the 4 scalar additions you counted are exactly $n^2$ additions.
Adding two $n \times n$ matrices (each with $n^2$ entries) scales as $\mathcal{O}(n^2)$, since you need to add each of the $n^2$ entries of the first matrix to the corresponding entry of the second matrix, one scalar addition per entry.
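To make the counting concrete, here is a small Python sketch (the names `AE` and `BG` are taken from the question's example; the counter is just for illustration) that adds two $n \times n$ matrices and counts the scalar additions performed:

```python
def add_matrices(X, Y):
    """Add two n x n matrices entry by entry.

    One scalar addition per entry, so n^2 additions total -- O(n^2) time.
    Returns the sum and the number of scalar additions performed.
    """
    n = len(X)
    count = 0
    result = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            result[i][j] = X[i][j] + Y[i][j]
            count += 1
    return result, count

# The 2 x 2 blocks from the question: n = 2, so n^2 = 4 additions.
AE = [[1, 2], [3, 4]]
BG = [[5, 6], [7, 8]]
total, count = add_matrices(AE, BG)
# total == [[6, 8], [10, 12]], count == 4 == 2**2
```

Note that `count` comes out to $n^2$, not $n$: the question's "4 additions" matched the original $n = 4$ only by coincidence, because $2^2 = 4$.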