For $n \in \mathbb{N}$ we define $Q(n) = M(n,n)$ with:
$$ M(i,j) = M(i-1,j) + M(i-1,j-1) + M(i,j-1) $$ and
$$ M(i,0) := M(0,i) := i \quad \forall i \geq 0 $$
Show that computing $Q(n)$ via the recursion above takes exponential time. Moreover, show that there is an algorithm that takes quadratic time.
My thoughts:
Regarding the first question I think I understand the problem. I choose $n = 2$ and consider the recursion represented as a tree. The subproblems overlap several times (unlike Mergesort, where the subproblems are disjoint). For example, $M(1,1)$ has to be computed three times for $n = 2$. I guess this repeated work is the reason for the exponential time complexity.
For the second question I would say that we can replace the "top-down" strategy of the recursion with a "bottom-up" strategy: we start with subproblems of trivial size and work up to larger ones, recording every result in a table and reusing it whenever a larger subproblem needs it.
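The bottom-up idea can be sketched in Python (a minimal sketch; the function name `Q` and the row-by-row fill order are my choices, not part of the exercise):

```python
# Bottom-up table fill: every subproblem M(i,j) is computed exactly once,
# before any larger subproblem needs it. O(n^2) time and space.
def Q(n):
    M = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        M[i][0] = M[0][i] = i          # base cases M(i,0) = M(0,i) = i
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            M[i][j] = M[i - 1][j] + M[i - 1][j - 1] + M[i][j - 1]
    return M[n][n]
```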
Those were my thoughts on the two questions.
Let $F(i,j)$ be the number of function calls required to compute $M(i,j)$ the naive way. Since the recursion is symmetric in its arguments, so is $F$, and for sufficiently large $n$, $$F(n,n)=F(n,n-1)+F(n-1,n)+F(n-1,n-1)=2F(n,n-1)+F(n-1,n-1)$$ $$=2\bigl(F(n,n-2)+F(n-1,n-1)+F(n-1,n-2)\bigr)+F(n-1,n-1)$$ $$>2F(n-1,n-1)+F(n-1,n-1)=3F(n-1,n-1).$$ This recurrence inequality shows that the number of calls needed for $Q(n)$ at least triples each time $n$ increases by one, so it grows exponentially, with base at least $3$.
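The inequality can be checked empirically by counting the calls directly (a small Python sketch; the helper name `calls` is my own, and the recurrence below charges one call for the node itself plus its three recursive children):

```python
# F(i,0) = F(0,j) = 1 (a single call hits a base case);
# otherwise F(i,j) = 1 + F(i-1,j) + F(i-1,j-1) + F(i,j-1).
def calls(i, j):
    if i == 0 or j == 0:
        return 1
    return 1 + calls(i - 1, j) + calls(i - 1, j - 1) + calls(i, j - 1)

# The ratio F(n,n) / F(n-1,n-1) stays above 3, matching the bound.
for n in range(1, 7):
    print(n, calls(n, n), calls(n, n) / calls(n - 1, n - 1))
```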
To compute $Q(n)$ in quadratic time, compute $M(i,j)$ for increasing values of $i+j$ – from $0$ to $2n$ – and store each result so it never has to be recomputed. There are $O(n)$ entries on each of the $2n+1$ anti-diagonals and each takes constant work, giving $O(n^2)$ time. This time-saving technique is called memoisation (applied bottom-up, as here, it is often called tabulation).
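A minimal sketch of this diagonal-order computation, assuming the definitions above (the names are my own, not from the exercise):

```python
# Compute M(i,j) along anti-diagonals i + j = s for s = 0 .. 2n,
# storing each value once: O(n^2) entries, O(1) work per entry.
def Q(n):
    M = {}
    for s in range(2 * n + 1):
        for i in range(max(0, s - n), min(s, n) + 1):
            j = s - i
            if i == 0 or j == 0:
                M[i, j] = max(i, j)    # base cases M(i,0) = M(0,i) = i
            else:
                M[i, j] = M[i - 1, j] + M[i - 1, j - 1] + M[i, j - 1]
    return M[n, n]
```

By the time entry $(i,j)$ on diagonal $s$ is filled, its three dependencies lie on diagonals $s-1$ and $s-2$ and are already in the table.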