I'm trying to give a formal, comprehensive proof of the complexity of a certain recursive (divide-and-conquer) algorithm. My problem is that all the resources I know make some simplifications, and I cannot manage to find a formal proof that avoids them:
Simplified complexity:
$T(n) = 2T(\frac{n}{2}) + an$, $\quad T(1)=c$, $\quad\exists k: n = 2^k $
$\Longrightarrow T(2^k) = 2T(2^{k-1}) + a 2^k$, $\quad T(2^0) = c$
We define $\hat{T}(k):=T(2^k)$
$\Longrightarrow \hat{T}(k) = 2\hat{T}(k-1) +a2^k$
Telescoping, we easily arrive at:
$\hat{T}(k)= 2\hat{T}(k-1) +a2^k = 2\left[2\hat{T}(k-2) +a2^{k-1}\right] + a2^k = \dots = 2^kc + ka2^k$
$\Longrightarrow T(n) = nc + an\log_2{n} \in \Theta(n\log{n})$
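As a numeric sanity check (not a proof), the closed form $T(2^k) = c2^k + ka2^k$ can be compared against the simplified recurrence directly, for arbitrary sample constants $a$ and $c$:

```python
a, c = 3, 5  # arbitrary sample constants

def T(n):
    """Simplified recurrence T(n) = 2T(n/2) + a*n, T(1) = c; n a power of 2."""
    if n == 1:
        return c
    return 2 * T(n // 2) + a * n

# Check the telescoped closed form T(2^k) = c*2^k + k*a*2^k.
for k in range(11):
    n = 2 ** k
    assert T(n) == c * n + a * k * n
print("closed form matches the recurrence for n = 2^0 .. 2^10")
```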
Pedantic complexity:
In the general case we cannot assume that $\exists k:n=2^k$, and the constants $a$ and $c$ are also simplifications. The "pedantic" or general case looks like this:
$T(n) = T(\lfloor\frac{n}{2}\rfloor) + T(\lceil\frac{n}{2}\rceil) + f(n)$, $\quad T(1)= g(1)$, $\quad f(n)\in \Theta(n)$, $\quad g(n) \in \Theta(1)$
$\dots$
missing steps
$\dots$
$\Longrightarrow T(n) \in \Theta(n\log{n})$
I am not able to deduce the complexity rigorously using these tools: the floor, the ceiling, and the definitions of $f(n) \in \Theta(n)$ and $g(n) \in \Theta(1)$. I would be happy for some help or guidance on how to approach it.
Note: The definition I'm using is $f(n)\in \Theta(h(n)) \Leftrightarrow \exists C_L, C_R \in \mathbb{R}^+, n_0\in \mathbb{N}: C_Lh(n)\leq f(n) \leq C_Rh(n), \quad \forall n\geq n_0$
Note that if $n$ is not a power of $2$, there exists an integer $k$ such that $2^{k} < n < 2^{k+1}$. Run your simplified argument on $2^{k}$ and $2^{k+1}$.
Under the assumption that $T(n)$ is non-decreasing (which is a reasonable/usual assumption in Algorithms, though one that is often not explicitly mentioned), you will obtain that $T(2^{k}) \leq T(n) \leq T(2^{k+1})$. The bound $T(2^{k}) \leq T(n)$ yields the Big-Omega bound, and the bound that $T(n) \leq T(2^{k+1})$ will yield the Big-O bound.
In this sense, bounding arguments are your friend and are perfectly rigorous.
Intuitively, the picture in my head is a recursion tree. If $n$ is not a power of $2$, then level $k$ is complete, while level $k+1$ has some but not all of the leaves. If we remove the leaves we do have at level $k+1$, we are only counting the work at the first $k$ levels. This is the lower bound $T(2^{k}) \leq T(n)$. Similarly, if we complete the $(k+1)$st level, we are over-counting the work done. So $T(n) \leq T(2^{k+1})$.
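The sandwiching argument above can be illustrated numerically (again, an illustration rather than a proof): taking $f(n) = an$ and $T(1) = c$ for sample constants, the general floor/ceiling recurrence turns out non-decreasing on the tested range, and the bound $T(2^{k}) \leq T(n) \leq T(2^{k+1})$ holds for every non-power-of-2 value:

```python
from functools import lru_cache
from math import log2

a, c = 3, 5  # arbitrary sample constants

@lru_cache(maxsize=None)
def T(n):
    """General recurrence T(n) = T(floor(n/2)) + T(ceil(n/2)) + a*n, T(1) = c."""
    if n == 1:
        return c
    return T(n // 2) + T(n - n // 2) + a * n  # n - n//2 == ceil(n/2)

# Monotonicity (assumed in the argument above), checked on a finite range.
vals = [T(n) for n in range(1, 2049)]
assert all(x <= y for x, y in zip(vals, vals[1:]))

# Sandwich: for 2^k < n < 2^(k+1), T(2^k) <= T(n) <= T(2^(k+1)).
for n in range(3, 2048):
    if n & (n - 1):  # skip exact powers of 2
        k = int(log2(n))
        assert T(2 ** k) <= T(n) <= T(2 ** (k + 1))
print("sandwich bound verified up to n = 2047")
```

With the sandwich in hand, applying the power-of-2 closed form to $T(2^k)$ and $T(2^{k+1})$ (and using $2^k > n/2$, $2^{k+1} < 2n$) delivers the two constants $C_L, C_R$ demanded by the $\Theta$ definition.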
I hope this helps!