Optimization and Deep Learning: Is ReLU a Closed Function?


The ReLU activation function in deep learning is given by $\text{ReLU}: \mathbb R\rightarrow \mathbb R, x \mapsto \max\left\{0, x\right\}$. I was asking myself whether this function, which is convex, is also closed. This is the general definition of closed:

Definition. A function $J: X\rightarrow \mathbb R_{\infty} := \mathbb R \ \cup \left\{ \pm \infty \right\}$ is closed if its epigraph is closed.

Definition. The epigraph of $J$ is given by $\text{epi}(J) := \left\{ (x, \alpha)\in X\times \mathbb R \ \vert \ J(x) \leq \alpha \right\}$.

I sketched the $\text{ReLU}$ function and its epigraph, and using the fact that a set is closed if and only if it contains all its limit points, it looks like $\text{epi}(\text{ReLU})$ is closed. But I am not sure what a mathematically rigorous proof would look like.
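As a numerical sanity check (not a proof), one can take a sequence of points inside the epigraph converging to a boundary point and verify that the limit still satisfies the defining inequality $\text{ReLU}(x) \leq \alpha$. The helper names `relu` and `in_epi` below are just illustrative:

```python
def relu(x):
    return max(0.0, x)

def in_epi(x, alpha):
    # (x, alpha) lies in epi(ReLU) iff ReLU(x) <= alpha
    return relu(x) <= alpha

# Boundary sequence (x_n, ReLU(x_n)) with x_n -> 0; its limit is (0, 0).
points = [(1.0 / n, relu(1.0 / n)) for n in range(1, 1000)]
limit = (0.0, 0.0)

print(all(in_epi(x, a) for x, a in points))  # every term lies in epi(ReLU)
print(in_epi(*limit))                        # the limit point does too
```

This only illustrates the "contains its limit points" intuition for one sequence; a proof has to cover all convergent sequences at once, as the answers below do.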


There are 2 best solutions below


I don't know if I can give you a rigorous proof, but I don't think you need one. The epigraph of the ReLU function is $\{ (x, y) \colon y \geq x, \ y \geq 0\}$, the intersection of two closed half-planes, which is closed.


A function is closed if and only if its epigraph is closed. But the ReLU function is continuous, which implies closedness of the epigraph.

The epigraph is the set $$ \{ (x,\alpha) : \max(0,x) \le \alpha\}, $$ which is the preimage of the closed set $(-\infty, 0]$ under the continuous map $(x,\alpha)\mapsto \max(x,0)-\alpha$.
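The preimage argument can be made concrete: define $g(x,\alpha) = \max(x,0) - \alpha$, and membership in the epigraph is exactly the condition $g(x,\alpha) \le 0$. A small sketch (the function name `g` matches the map above):

```python
def g(x, alpha):
    # Continuous map whose sublevel set {g <= 0} is exactly epi(ReLU).
    return max(x, 0.0) - alpha

# (2, 3): ReLU(2) = 2 <= 3, so the point is in the epigraph.
print(g(2.0, 3.0) <= 0)   # True
# (-1, 0): ReLU(-1) = 0 <= 0, a boundary point of the epigraph.
print(g(-1.0, 0.0) <= 0)  # True
# (1, 0.5): ReLU(1) = 1 > 0.5, so the point is outside.
print(g(1.0, 0.5) <= 0)   # False
```

Since $g$ is continuous and $(-\infty, 0]$ is closed, $\text{epi}(\text{ReLU}) = g^{-1}\big((-\infty, 0]\big)$ is closed.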