Assume we have a divide-and-conquer algorithm, e.g. mergesort, that finds the max and min of a set $S$ with $n = 2^k$ elements. The recurrence for its time complexity is
$T(n) = 2T(n/2) + 2$ for $n > 2$, and $T(2) = 1$.
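A sketch of the kind of max/min algorithm this recurrence describes (the function name and the explicit comparison counter are my own, added for illustration):

```python
def max_min(s):
    """Return (max, min, comparisons) for a sequence whose length is a power of two >= 2."""
    if len(s) == 2:                          # base case T(2) = 1: one comparison
        a, b = s
        return (a, b, 1) if a >= b else (b, a, 1)
    mid = len(s) // 2
    hi1, lo1, c1 = max_min(s[:mid])          # T(n/2) comparisons
    hi2, lo2, c2 = max_min(s[mid:])          # T(n/2) comparisons
    hi = hi1 if hi1 >= hi2 else hi2          # +1 comparison
    lo = lo1 if lo1 <= lo2 else lo2          # +1 comparison
    return hi, lo, c1 + c2 + 2               # total: 2*T(n/2) + 2
```

The two recursive calls and the two combining comparisons are exactly the $2T(n/2) + 2$ terms of the recurrence.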
Using a domain transformation, we substitute $n = 2^k$ and let $a_k = T(2^k)$.
Since $T(2^k/2) = T(2^{k-1}) = a_{k-1}$, the recurrence becomes $a_k = 2a_{k-1} + 2$ with $a_1 = T(2) = 1$, which solves to
$a_k = \frac{3}{2} \cdot 2^k - 2$.
Substituting $2^k = n$ back gives $T(n) = \frac{3}{2}n - 2$, which is clearly $O(n)$.
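As a quick numerical sanity check (not part of the derivation), one can unroll the recurrence directly and compare it against the closed form:

```python
def T(n):
    """Comparison count from the recurrence T(n) = 2*T(n/2) + 2, T(2) = 1."""
    return 1 if n == 2 else 2 * T(n // 2) + 2

# The recurrence agrees with the closed form 3n/2 - 2 for every power of two tested.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == 3 * n // 2 - 2
```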
But, from the master theorem, a recurrence of the form $T(n) = aT(n/c) + bn$ with $a = c = 2$ should give $\Theta(n\log{n})$. This is also what most online sources say is the time complexity of mergesort. Why can't I get $n\log{n}$ as the solution?
You are confusing several different things:
Mergesort is a sorting algorithm. Not every divide-and-conquer algorithm is mergesort.
If you apply the master theorem then you get $\Theta(n)$, since your recurrence is not of the form $T(n) = aT(n/c) + bn$ but rather of the form $T(n) = aT(n/c) + b$.
Your recurrence counts only comparisons, which is fine in this case, since for this algorithm the running time is proportional to the number of comparisons.
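The difference between the two recurrences can be seen by unrolling them side by side (a sketch with hypothetical function names; the $+n$ term stands in for mergesort's linear-cost merge step):

```python
def T_maxmin(n):
    """T(n) = 2*T(n/2) + 2, T(2) = 1: constant combine cost -> Theta(n)."""
    return 1 if n == 2 else 2 * T_maxmin(n // 2) + 2

def T_mergesort(n):
    """T(n) = 2*T(n/2) + n, T(1) = 0: linear merge cost -> Theta(n log n)."""
    return 0 if n == 1 else 2 * T_mergesort(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert T_maxmin(n) == 3 * n // 2 - 2     # linear in n
    assert T_mergesort(n) == n * k           # exactly n * log2(n)
```

Only the additive term differs between the two recurrences, and that is what moves the solution from $\Theta(n)$ to $\Theta(n \log n)$.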