Let $S,Z \in L^\infty(\Omega,\mathcal{F},\mathbb{P})$ and let $\rho \colon L^\infty \rightarrow \mathbb{R}$ be a risk measure. When does the directional derivative (Gâteaux derivative) $$\lim_{\varepsilon \rightarrow 0} \frac{\rho(Z + \varepsilon S) - \rho(Z)}{\varepsilon}$$ exist? How is it linked to the coherence of a risk measure? As an example, can we compute the directional derivative of the expected shortfall $$\mathrm{ES}_\alpha(X) = \frac{1}{1-\alpha} \int_\alpha^1 F^{-1}_X(p) \,\mathrm{d}p \quad ?$$
2026-03-25 05:09:47.1774415387
Directional derivative of risk measure
449 Views Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail At
$\newcommand{\<}{\langle}\newcommand{\>}{\rangle}\renewcommand{\Re}{\mathbb{R}}\newcommand{\ES}{\mathrm{ES}}$This is a very interesting question, and the answer is affirmative under suitable conditions. In fact, we can show that continuous coherent risk measures are Hadamard directionally differentiable (which is stronger than Gâteaux directional differentiability).
About differentiability of risk measures. Let $\mathcal{Z}=\mathcal{L}_p(\Omega, \mathscr{F}, \mathrm{P})$ and let $F:\mathcal{Z}\to\Re\cup \{+\infty\}$ be a convex extended-real-valued mapping. Recall that the convex conjugate of $F$ is the function $F^*:\mathcal{Z}^*\to\Re\cup \{+\infty\}$ defined as $$F^*(Y) = \sup_{Z\in\mathcal{Z}} \{\< Y,Z \> - F(Z)\},$$ and similarly we define the biconjugate $F^{**}:\mathcal{Z}^{**}\to\Re\cup \{+\infty\}$.
Let us also recall that the subgradient of $F$ at a point $X_0\in\mathcal{Z}$ is the set
$$ \partial F(X_0)=\{Y\in\mathcal{Z}^*: F(X) - F(X_0) \geq \< Y, X-X_0\>, \forall X\in\mathcal{Z}\}. $$
Then, by the conjugate subgradient theorem applied to $F^{**}$, for every $X$ at which $F^{**}(X)$ is finite we have $$ \partial F^{**}(X) = \arg\max_{Y\in \mathcal{Z}^{*}} \{ \<Y,X\> - F^*(Y)\}.\tag{1}\label{eq:1} $$ If $\rho:\mathcal{Z}\to\Re\cup\{+\infty\}$ is a lower semicontinuous coherent risk measure (such as the expected shortfall you mentioned), then $\rho = \rho^{**}$ by the Fenchel–Moreau theorem, so by \eqref{eq:1} we have $$ \partial \rho(X) = \partial \rho^{**}(X) = \arg\max_{Y\in \mathcal{Z}^{*}} \{ \<Y,X\> - \rho^*(Y)\},\tag{2}\label{eq:2} $$ for $X\in\operatorname{dom}\rho = \{X\in\mathcal{Z}: \rho(X) < \infty\}$.
Using the fact that coherent risk measures can be written as
$$ \rho(X) = \sup_{Y\in\mathfrak{W}}\< Y,X\>, $$
for some convex, w*-closed set $\mathfrak{W}\subseteq \mathcal{Z}^*$, we see, following \eqref{eq:2}, that the subgradient of $\rho$ at $X$ can be written as
$$ \partial \rho(X) = \arg\max_{Y\in\mathfrak{W}}\<Y, X\>. $$ Let us now recall the Hadamard directional derivative of $F:\mathcal{Z}\to\Re\cup\{+\infty\}$ at $X$ along a direction $D_0$, defined as
$$ F'(X;D_0)=\lim_{t\downarrow 0, D\to D_0} \frac{F(X+tD)-F(X)}{t}. $$
If a function $F:\mathcal{Z}\to\Re\cup\{+\infty\}$ is Hadamard directionally differentiable at $X$ along a direction $D$, then it is also Gâteaux directionally differentiable and the two derivatives coincide.
Furthermore, we know that if a convex mapping $F:\mathcal{Z}\to\Re\cup\{+\infty\}$ is finite-valued and continuous at $X$, then $F$ is subdifferentiable at $X$; the subdifferential $\partial F(X)$ is nonempty, convex, bounded and w*-compact; and $F$ is Hadamard directionally differentiable at $X$ with
$$ F'(X;D) = \sup_{Y\in\partial F(X)}\< Y, D\>. $$
Differentiability of the expected shortfall. (A remark on conventions: below, $\alpha$ denotes the tail probability, i.e., $\ES_\alpha[X]=\frac{1}{\alpha}\int_{1-\alpha}^1 F_X^{-1}(p)\,\mathrm{d}p$; to match the definition in the question, replace $\alpha$ by $1-\alpha$ throughout.) The expected shortfall $\ES_\alpha:\mathcal{Z}\to\mathbb{R}\cup\{+\infty\}$ admits the dual representation
$$ \ES_{\alpha}[X] = \sup_{Y\in\mathfrak{A}_\alpha}\< Y, X\>, $$
where
$$ \mathfrak{A}_\alpha = \{ Y\in\mathcal{Z}^*: \mathbb{E}[Y]=1,\ Y\in [0,\alpha^{-1}] \text{ a.s.} \}. $$ The expected shortfall is subdifferentiable with \begin{align*} \partial(\ES_{\alpha})[Z] &= \arg\max_{Y\in\mathfrak{A}_\alpha} \<Y,Z\> \\ &= \arg\max_{Y\in\mathcal{Z}^*} \left\{ \<Y,Z\>: Y\in [0,\alpha^{-1}] \text{ a.s.},\ \mathbb{E}[Y]=1 \right\}. \end{align*}
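This dual representation can be checked numerically on a finite probability space by solving the maximization $\sup\{\<Y,Z\>: \mathbb{E}[Y]=1,\ 0\le Y\le \alpha^{-1}\}$ as a linear program. A minimal sketch, assuming `numpy` and `scipy` are available (all variable names are mine, and $\alpha$ is the tail probability as above):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, alpha = 200, 0.05                     # uniform atoms P({omega_i}) = 1/n; tail probability alpha
z = rng.standard_normal(n)               # a sample of Z

# Maximize E[YZ] = sum(y * z) / n  subject to  E[Y] = sum(y) / n = 1  and  0 <= y <= 1/alpha.
res = linprog(
    c=-z / n,                            # linprog minimizes, so negate the objective
    A_eq=np.ones((1, n)) / n, b_eq=[1.0],
    bounds=(0.0, 1.0 / alpha),
)
es_dual = -res.fun

# The optimizer puts density 1/alpha on the worst alpha-fraction of outcomes,
# so the optimal value is the average of the top n*alpha order statistics.
k = int(round(n * alpha))
es_tail = np.sort(z)[-k:].mean()
print(es_dual, es_tail)
```

The two printed values agree: the LP concentrates $Y$ at its upper bound $\alpha^{-1}$ on the worst $\alpha$-fraction of atoms, exactly as the subgradient characterization below predicts.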
To proceed further, let us assume that $X$ has a continuous cumulative distribution function $H_X$, define $$ t^* = \inf\{t: H_X(t)\geq 1-\alpha\}, $$ and further assume that $\{t: H_X(t) = 1-\alpha\}$ is a singleton for the given $\alpha$.
Interpretation: by the Rockafellar–Uryasev variational formula, the expected shortfall can equivalently be written as $$\ES_\alpha(X) = \inf_{t\in\Re}\ \phi(t), \qquad \phi(t):=t+\alpha^{-1}\mathbb{E}[X-t]_+,$$ and $\phi$ attains its minimum at $t^*$.
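On an empirical sample this variational formula can be verified directly: with $t^*$ taken as the empirical $(1-\alpha)$-quantile, $\phi(t^*)$ reproduces the tail average exactly. A small sketch (names are mine; `numpy` only):

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha = 10_000, 0.05                  # tail probability alpha, with n * alpha an integer
x = np.sort(rng.standard_normal(n))      # empirical sample of X, sorted ascending

k = int(round(n * alpha))
t_star = x[-k]                           # empirical (1 - alpha)-quantile: the k-th largest value

# phi(t) = t + E[X - t]_+ / alpha, evaluated at the minimizer t*.
phi_at_t_star = t_star + np.maximum(x - t_star, 0.0).mean() / alpha

# Empirical ES: average of the worst alpha-fraction of outcomes.
es_tail = x[-k:].mean()
print(phi_at_t_star, es_tail)            # identical up to rounding
```

The equality is exact here (not just approximate): with $t^*$ an order statistic, $\alpha^{-1}\mathbb{E}[X-t^*]_+$ is precisely the excess of the tail average over $t^*$.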
Relaxing the equality constraint $\mathbb{E}[Y]=1$ we have the Lagrangian \begin{align*} L(Y,\lambda; Z) &= \<Y, Z\> + \lambda(1-\mathbb{E} [Y])\\ &= \<Y, Z\> + \lambda - \<\lambda, Y\>\\ &= \<Y, Z - \lambda \> + \lambda. \end{align*}
We can now introduce the dual function
$$ \begin{aligned} q(\lambda) &= \sup_{Y\in [0,\alpha^{-1}]} L(Y,\lambda; Z)\\ &= \sup_{Y\in [0,\alpha^{-1}]} \<Y, Z - \lambda \> + \lambda\\ &= \sup_{Y\in [0,\alpha^{-1}]} \int Y(Z-\lambda)\,\mathrm{d}\mathrm{P} + \lambda. \end{aligned} $$ The supremum is attained at $Y=\alpha^{-1}1_{[Z-\lambda\geq 0]}$, so $$ q(\lambda) = \alpha^{-1}\mathbb{E}[Z-\lambda]_+ + \lambda. $$ The dual problem is $$ \inf_{\lambda\in\Re}\ \alpha^{-1}\mathbb{E}[Z-\lambda]_+ + \lambda, $$ which is precisely the variational formula above, minimized at $\lambda = t^*$. Its set of minimizers is bounded, so strong duality holds.
Under the above assumptions ($H_X$ being continuous, $\{t: H_X(t)=1-\alpha\}$ being a singleton and for $t^*$ as above), $Y\in\partial \ES_\alpha(X)$ if and only if
$$ \mathbb{E}[Y]=1, \qquad
\begin{cases}
Y = \alpha^{-1} & \text{a.s. on } \{X>t^*\},\\
Y = 0 & \text{a.s. on } \{X<t^*\},\\
Y \in [0,\alpha^{-1}] & \text{a.s. on } \{X=t^*\}.
\end{cases} $$
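For an empirical distribution with no tied sample values, this characterization pins $Y$ down completely: $Y=\alpha^{-1}$ on the tail and on the single atom at $t^*$ (allowed, since $\alpha^{-1}\in[0,\alpha^{-1}]$, and needed to make $\mathbb{E}[Y]=1$). A quick numerical sanity check (a sketch; all names are mine, `numpy` only):

```python
import numpy as np

rng = np.random.default_rng(2)
n, alpha = 10_000, 0.05                  # tail probability alpha, with n * alpha an integer
x = rng.standard_normal(n)               # empirical X with atoms of mass 1/n

k = int(round(n * alpha))
t_star = np.sort(x)[-k]                  # empirical (1 - alpha)-quantile

# Y = 1/alpha above t*, 0 below; the atom {X = t*} also gets 1/alpha,
# which is exactly what makes E[Y] = 1 for the empirical measure.
y = np.where(x >= t_star, 1.0 / alpha, 0.0)

print(y.mean())                          # E[Y] = 1
print((y * x).mean())                    # <Y, X>
print(np.sort(x)[-k:].mean())            # empirical tail average: the same value
```

So the subgradient element pairs with $X$ to give precisely $\ES_\alpha[X]$, as the dual representation requires.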
Finally, the directional derivative of $\ES_\alpha$ at $X$ along $D$ is
$$ \ES'_\alpha(X;D) = \sup_{Y\in\partial\ES_\alpha(X)}\<Y,D\>. $$
In particular, since $H_X$ is continuous we have $\mathbb{P}[X=t^*]=0$, so $\partial\ES_\alpha(X)$ is a singleton and $\ES_\alpha$ is Gâteaux (indeed Hadamard) differentiable at $X$, with $\ES'_\alpha(X;D)=\mathbb{E}[YD]$ for the unique $Y$ above.
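Putting everything together, the derivative formula can be checked against a finite difference on an empirical sample: with no ties, the subgradient is unique and $\ES'_\alpha(X;D)=\mathbb{E}[YD]$. A sketch of such a check (names are mine; the tolerance is loose because the empirical ES is only piecewise linear in the step size):

```python
import numpy as np

rng = np.random.default_rng(3)
n, alpha = 10_000, 0.10                  # tail probability alpha
x = rng.standard_normal(n)               # sample of X
d = rng.standard_normal(n)               # direction D on the same sample space

def es(v: np.ndarray, alpha: float) -> float:
    """Empirical expected shortfall: average of the largest alpha-fraction."""
    k = int(round(len(v) * alpha))
    return float(np.sort(v)[-k:].mean())

# Unique subgradient element (no ties in a continuous sample): 1/alpha on {X >= t*}.
t_star = np.sort(x)[-int(round(n * alpha))]
y = np.where(x >= t_star, 1.0 / alpha, 0.0)
deriv = float((y * d).mean())            # claimed ES'_alpha(X; D) = E[Y D]

h = 1e-3
fin_diff = (es(x + h * d, alpha) - es(x, alpha)) / h
print(deriv, fin_diff)                   # close; they differ only by O(h) reshuffling of the tail set
```

The small residual comes from tail order statistics swapping places as $h$ grows, which is exactly the piecewise-linear structure of $t\mapsto\ES_\alpha(X+tD)$ on an atomic space.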