Help to understand the concept of diminishing returns

Suppose I have this function:

$q=f(k,l)=600k^2l^2-k^3l^3$

Then,

$f_l=1200k^2l-3k^3l^2$

$f_k=1200kl^2-3k^2l^3$

$f_{ll}=1200k^2-6k^3l$

$f_{kk}=1200l^2-6kl^3$

$f_{kl}=f_{lk}=2400kl-9k^2l^2$

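(As a sanity check, not part of the original derivation: the five partials above can be verified numerically with central finite differences. The helper names below are my own.)

```python
# Verify the analytic partial derivatives of
# f(k, l) = 600 k^2 l^2 - k^3 l^3 against central finite differences.

def f(k, l):
    return 600 * k**2 * l**2 - k**3 * l**3

# Analytic partials, copied from the derivation above.
def f_l(k, l):  return 1200 * k**2 * l - 3 * k**3 * l**2
def f_k(k, l):  return 1200 * k * l**2 - 3 * k**2 * l**3
def f_ll(k, l): return 1200 * k**2 - 6 * k**3 * l
def f_kk(k, l): return 1200 * l**2 - 6 * k * l**3
def f_kl(k, l): return 2400 * k * l - 9 * k**2 * l**2

def check(k, l, h=1e-4, tol=1e-3):
    # Central-difference approximations of the same derivatives.
    num_fl  = (f(k, l + h) - f(k, l - h)) / (2 * h)
    num_fk  = (f(k + h, l) - f(k - h, l)) / (2 * h)
    num_fll = (f(k, l + h) - 2 * f(k, l) + f(k, l - h)) / h**2
    num_fkk = (f(k + h, l) - 2 * f(k, l) + f(k - h, l)) / h**2
    num_fkl = (f(k + h, l + h) - f(k + h, l - h)
               - f(k - h, l + h) + f(k - h, l - h)) / (4 * h**2)
    pairs = [(num_fl, f_l(k, l)), (num_fk, f_k(k, l)),
             (num_fll, f_ll(k, l)), (num_fkk, f_kk(k, l)),
             (num_fkl, f_kl(k, l))]
    return all(abs(a - b) <= tol * (1 + abs(b)) for a, b in pairs)

print(check(2.0, 3.0))  # True: every formula matches its numeric estimate
```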
Now,

If we assume that:

$f_l>0$

$f_k>0$

$f_{ll} <0$

$f_{kk} < 0$

$f_{kl}=f_{lk}>0$

$RTS = \left. -\dfrac{dk}{dl}\right|_{q=q_0}= \dfrac{f_l}{f_k}$

Then,

$\dfrac{dRTS}{dl}=\dfrac{d \left( \dfrac{f_l}{f_k} \right)}{dl} \lt 0$

when $200<kl<\frac{800}{3}\approx 266.7$, and we say that RTS is diminishing when $kl$ is in this interval.
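(Each assumed sign condition reduces to a bound on the product $kl$: $f_l,f_k>0 \iff kl<400$; $f_{ll},f_{kk}<0 \iff kl>200$; $f_{kl}>0 \iff kl<800/3$. A quick numeric check at sample points, with my own helper name, confirms that all conditions hold only on the stated interval:)

```python
# Check where the assumed sign conditions on the partials hold.
# Each one reduces to a bound on x = k*l:
#   f_l = 3 k^2 l (400 - kl) > 0      <=>  kl < 400   (same for f_k)
#   f_ll = 6 k^2 (200 - kl)  < 0      <=>  kl > 200   (same for f_kk)
#   f_kl = kl (2400 - 9 kl)  > 0      <=>  kl < 800/3

def conditions_hold(k, l):
    f_l  = 1200 * k**2 * l - 3 * k**3 * l**2
    f_k  = 1200 * k * l**2 - 3 * k**2 * l**3
    f_ll = 1200 * k**2 - 6 * k**3 * l
    f_kk = 1200 * l**2 - 6 * k * l**3
    f_kl = 2400 * k * l - 9 * k**2 * l**2
    return f_l > 0 and f_k > 0 and f_ll < 0 and f_kk < 0 and f_kl > 0

print(conditions_hold(10, 15))  # kl = 150, below 200:  False (f_ll, f_kk > 0)
print(conditions_hold(10, 23))  # kl = 230, inside:     True
print(conditions_hold(15, 20))  # kl = 300, above 800/3: False (f_kl < 0)
```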

My question is:

I want to visually see an RTS that is diminishing and an RTS that is not diminishing. What functions and parameters do I plot?

My understanding of RTS is that it is (the negative of) the slope of the level curve $f(k,l)=c$, where $c$ is a constant. So I tried a contour plot for various levels of $c$, but I don't see any significant differences between the contours (all are still downward sloping and slowly become flatter). Here's the plot:

[Contour plot of $f(k,l)=c$ for several levels $c$]
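(One way to see diminishing RTS directly, instead of eyeballing contour shapes: fix an output level $q_0$, solve $f(k,l)=q_0$ for $k$ at several values of $l$, and tabulate $RTS=f_l/f_k$ along that isoquant; a strictly falling sequence means RTS is diminishing there. A sketch, with point choices and helper names of my own; bisection works because $f$ is strictly increasing in $k$ wherever $f_k>0$, i.e. for $kl<400$:)

```python
def f(k, l):
    return 600 * k**2 * l**2 - k**3 * l**3

def f_l(k, l):
    return 1200 * k**2 * l - 3 * k**3 * l**2

def f_k(k, l):
    return 1200 * k * l**2 - 3 * k**2 * l**3

def k_on_isoquant(l, q0, tol=1e-10):
    """Solve f(k, l) = q0 for k by bisection on 0 < k < 400/l,
    the region where f is strictly increasing in k (f_k > 0)."""
    lo, hi = 1e-9, 400.0 / l
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid, l) < q0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Output level through kl = 230, inside the interval 200 < kl < 800/3.
# (f depends on k and l only through the product kl, so this pins q0.)
q0 = 600 * 230**2 - 230**3

rts_values = []
for l in [10, 12, 14, 16]:
    k = k_on_isoquant(l, q0)
    rts_values.append(f_l(k, l) / f_k(k, l))

print(rts_values)  # strictly decreasing: RTS is diminishing on this isoquant
```

Plotting `rts_values` against `l` (or repeating this for several $q_0$) gives the visual comparison asked for: a diminishing RTS shows up as a downward-sloping curve of $f_l/f_k$ along the isoquant.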

Thank you in advance for any help provided.