I was able to show that
$\log(n!) = O(n \log n)$ without any problems.
My question concerns the other direction: proving that $\log (n!) = \Omega(n \log n)$.
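As a quick numeric sanity check of the upper bound (not a proof), each term satisfies $\log(k) \leq \log(n)$, so the sum $\log(n!)$ is at most $n \log n$. The helper `log_factorial` below is just an illustrative name:

```python
import math

def log_factorial(n):
    """log(n!) computed as log(1) + log(2) + ... + log(n)."""
    return sum(math.log(k) for k in range(1, n + 1))

# Each log(k) <= log(n), so the sum of n terms is <= n * log(n).
for n in [2, 10, 100, 1000]:
    assert log_factorial(n) <= n * math.log(n)
```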
I was able to show that:
\begin{align} \log(n!) & = \log(1 \cdot 2 \cdot 3 \cdots n) \\ & = \log(1) + \log(2) + \log(3) + \dots + \log(n) \\ & = \log(1) + \dots + \log\left(\frac{n}{2}\right) + \dots + \log(n) \\ & \geq \log\left(\frac{n}{2}\right)+ \log\left(\frac{n}{2} + 1\right) + \dots + \log(n) \end{align}
(ie, the larger half of the sum)
Note: this is the part I don't fully understand
\begin{align} & \geq \log\left(\frac{n}{2}\right) + \log\left(\frac{n}{2}\right) + \cdots + \log\left(\frac{n}{2}\right) & & \text{(the term $\log(n/2)$ repeated $n/2$ times)} \\ & = \log \left(\frac{n}{2} \cdot \frac{n}{2} \cdots \frac{n}{2}\right) & & \left(\frac{n}{2} \text{ factors}\right) \\ & = \log \left( \left(\frac{n}{2}\right)^{\frac{n}{2}} \right) \\ & = \frac{n}{2} \log \left(\frac{n}{2}\right) & & \text{(by the log exponent rule)} \end{align}
Thus,
$\log(n!) \geq \frac{n}{2}\log\left(\frac{n}{2}\right)$
and since $\frac{n}{2}\log\left(\frac{n}{2}\right) = \frac{n}{2}\log n - \frac{n}{2} \geq \frac{n}{4}\log n$ for $n \geq 4$,
$\log(n!) = \Omega(n \log n)$
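The inequality chain above can also be checked numerically (again, a sanity check under the derivation's assumptions, not a proof; `log_factorial` is an illustrative helper):

```python
import math

def log_factorial(n):
    """log(n!) computed as log(1) + log(2) + ... + log(n)."""
    return sum(math.log(k) for k in range(1, n + 1))

# The derivation's lower bound: the larger half of the sum,
# with each of its n/2 terms replaced by the smallest, log(n/2).
for n in [4, 16, 256, 4096]:
    lower = (n / 2) * math.log(n / 2)
    assert log_factorial(n) >= lower
    # ...which in turn dominates (n/4) * log(n) for n >= 4.
    assert lower >= (n / 4) * math.log(n)
```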
I don't understand why taking the larger half of the sum yields a lower bound for $\log(n!)$. Why is that particular split chosen to show $\Omega(n \log n)$? I feel like it's probably something obvious, but it's the only thing keeping me from fully grasping the proof. If someone can enlighten me, I would appreciate it!
The same question already has an accepted answer at Computer Science SE: https://cs.stackexchange.com/questions/47561/confused-about-proof-that-logn-thetan-log-n
(I am posting this CW answer so that the question does not remain unanswered. I will also add here a link to a post on meta about cross-posting.)