Why does the partial of $f: \Delta \to \mathbb{R}^2$ fail to exist?


Given $f: \Delta \to \mathbb{R}^2$, where $\Delta = \{x = (x_1,x_2) \mid x_1 + x_2 = 1,\ x_1 \geq 0,\ x_2 \geq 0\}$.

Claim: $\dfrac{\partial f_i}{ \partial x_j}$ doesn't exist.

The statement is true, but how can it be justified?

I have a feeling that this is due to the compactness of the domain of $f$, which conflicts with the formal definition of the partial derivative. But I do not know whether the definition of the partial derivative can be extended to functions defined on compact sets. Moreover, I can produce a large number of examples that seem to contradict the claim:

Example: Consider $f(x_1,x_2) = \begin{bmatrix} x_1 \\ 2 x_2 \end{bmatrix} = \begin{bmatrix} f_1 \\ f_2 \end{bmatrix}$, then clearly $\dfrac{\partial f_1}{\partial x_1} = 1, \dfrac{\partial f_1}{\partial x_2} = 0$, ditto the other partials.
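A quick numeric illustration (a Python sketch; the function name, sample point, and tolerance are assumptions for demonstration) shows that for this example the difference quotient in a coordinate direction is not even defined, because perturbing one coordinate alone leaves $\Delta$:

```python
def f(x1, x2):
    # f is only defined on the simplex Delta = {x1 + x2 = 1, x1 >= 0, x2 >= 0}
    if abs(x1 + x2 - 1) > 1e-12 or x1 < 0 or x2 < 0:
        raise ValueError("point lies outside Delta")
    return (x1, 2 * x2)

a = (0.5, 0.5)   # a point of Delta
h = 1e-6         # step for the difference quotient in the x1 direction

f(*a)            # fine: a is in Delta
try:
    f(a[0] + h, a[1])   # (0.5 + h, 0.5) has coordinate sum 1 + h != 1
except ValueError:
    print("f(a + h*e1) is undefined: the difference quotient cannot be formed")
```

The same failure occurs for every point of $\Delta$ and every $h \neq 0$, in either coordinate direction.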

Can anyone please assist!

2 Answers

Accepted answer

The domain of $f$ is a line segment that contains no open interval in the $x_1$ or $x_2$ direction. So the idea of a derivative in these directions, which is what the partial derivatives of $f$ would be, is dead on arrival.
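Concretely, recall the limit definition of the partial derivative (here $e_1 = (1,0)$):

$$\frac{\partial f_i}{\partial x_1}(a) = \lim_{h \to 0} \frac{f_i(a + h e_1) - f_i(a)}{h}.$$

For any $a = (a_1, a_2) \in \Delta$ and any $h \neq 0$, the perturbed point $a + h e_1 = (a_1 + h, a_2)$ has coordinate sum $1 + h \neq 1$, so it lies outside $\Delta$ and $f_i(a + h e_1)$ is undefined. The difference quotient cannot even be formed, let alone its limit taken; the same applies in the $x_2$ direction.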

Another answer

Since $x_1 + x_2 = 1$ on $\Delta$, we can write $f(x_1,x_2)= (x_1, 2x_2) = (1 - x_2, 2x_2)$ and thus "clearly" $\dfrac{\partial f_1}{\partial x_1} = 0$. Do you see a problem? You need to think about more than just the rule you use to define the function to assess its differentiability - go back to the actual definition of partial derivatives and work out what's really going on here.
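To spell out the contradiction: suppose the partial did exist. Both rules define the same function on $\Delta$, yet

$$f_1(x_1,x_2) = x_1 \implies \frac{\partial f_1}{\partial x_1} = 1, \qquad f_1(x_1,x_2) = 1 - x_2 \implies \frac{\partial f_1}{\partial x_1} = 0,$$

so the "partial" would have to equal both $1$ and $0$ at once. The ambiguity arises because each computed value depends on how the formula extends off $\Delta$, not on $f$ itself: $f$ is defined only on the segment, which contains no two points with $x_2$ fixed and $x_1$ varying.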