What does it mean for a function to be differentiable/continuous when the input is a function?


I have the loss function $$L(h) = \sum_{i=1}^{n}(h(x_i) - y_i)^2$$

$h$ is a function that spits out the predicted value when fed a vector $x_i$. The domain of $h$ is then $\mathbb R^d$ and the codomain is $\mathbb R$. My homework asks whether $L$ is continuous or differentiable. The problem is I don't even know what this means, because the input to $L$ is a function from $\mathbb R^d \to \mathbb R$, not a single real-valued input. How are we defining continuity/differentiability for a map that sends a function $\mathbb R^d \to \mathbb R$ to a real number? I am only familiar with the $\mathbb R \to \mathbb R$ epsilon-delta definition of continuity. This is for my intro ML course, so nothing too rigorous is necessary.
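For concreteness, here is a minimal sketch of evaluating $L(h)$ on a small synthetic dataset, assuming a linear predictor $h(x) = w \cdot x$ (the data and weights here are made up for illustration):

```python
import numpy as np

# Illustrative setup: n = 5 points in R^d with d = 3, plus targets y_i.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = rng.normal(size=5)
w = np.array([1.0, -2.0, 0.5])

def h(x):
    """A concrete choice of predictor: h(x) = w . x."""
    return w @ x

# L(h) = sum_i (h(x_i) - y_i)^2
L = sum((h(x_i) - y_i) ** 2 for x_i, y_i in zip(X, y))
```

The point is that $L$ takes the whole function $h$ as input; picking a parametric family like $h(x) = w\cdot x$ turns it into an ordinary function of $w$.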

1 Answer

As you are not in category theory or something weirder, what they mean in this context is whether the composition is differentiable with respect to $x_i$. If $h$ is differentiable, then the composition certainly is. Differentiating $L$ with respect to $x_i$ via the chain rule, only the $i$-th term of the sum contributes: $$\frac{\partial L}{\partial x_i} = 2\,\bigl(h(x_i) - y_i\bigr)\,\nabla h(x_i),$$ i.e. twice the residual (the term without the exponent) multiplied by the derivative of $h$.
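You can sanity-check this chain-rule formula numerically. The sketch below assumes a differentiable $h(x) = w \cdot x$ (so $\nabla h(x) = w$) and compares the analytic gradient $2\,(h(x_i)-y_i)\,\nabla h(x_i)$ against central finite differences; all names and values are illustrative:

```python
import numpy as np

# Assumed concrete predictor: h(x) = w . x, with gradient w everywhere.
w = np.array([1.0, -2.0, 0.5])
h = lambda x: w @ x
grad_h = lambda x: w

x_i = np.array([0.3, 0.1, -0.7])  # one data point
y_i = 1.2                          # its target

# Analytic gradient of (h(x_i) - y_i)^2 with respect to x_i.
analytic = 2 * (h(x_i) - y_i) * grad_h(x_i)

# Central finite-difference approximation, one coordinate at a time.
eps = 1e-6
numeric = np.zeros_like(x_i)
for j in range(len(x_i)):
    e = np.zeros_like(x_i)
    e[j] = eps
    numeric[j] = ((h(x_i + e) - y_i) ** 2 - (h(x_i - e) - y_i) ** 2) / (2 * eps)
```

The two vectors should agree to within finite-difference error, which is the numerical counterpart of the chain-rule computation above.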