Consider the problem \begin{align*} \max_{X\in\mathcal{F}} \ &\int_{0}^1 X(t) \phi(t)\,dt \\ \text{subject to } &\int_{0}^1 X(t) \psi(t)\, dt= 0 \end{align*} where $\mathcal{F}$ is a "well-behaved" set of functions. (In my application, $\mathcal{F}$ is the set of CDFs that are second-order stochastically dominated by some given CDF $F_0$.) I am looking for a formal reference/argument relating this problem to the one obtained by writing the Lagrangian $$\max_{X\in\mathcal{F}} \int_{0}^1 X(t) (\phi(t)-\lambda \psi(t))\,dt, $$ and in particular formally showing that the optimal solution of the first problem corresponds to a local maximum of the second. Any ideas/references would be helpful!
constrained optimization in calculus of variations
255 views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) at 2026-03-27 23:31:04
1 answer below
The argument is really not any different from optimizing over $\mathbb R^n$. I will prove that the statement you want holds assuming a form of Slater's condition: $\mathcal F$ is convex, and $0$ is an interior point of $\left\{\int_{0}^1 X(t) \psi(t) dt : X \in \mathcal F\right\}$.
For convenience, let \begin{align} f(X) &= \int_{0}^1 X(t) \phi(t)dt \\ g(X) &= \int_{0}^1 X(t) \psi(t) dt \\ h(\lambda) &= \sup_{X \in \mathcal F} \Big\{f(X) - \lambda g(X)\Big\} \end{align} Let $M(z)$ be the maximum value (or maybe the supremum) of $f(X)$ over all $X \in \mathcal F$ such that $g(X) =z$. Its domain is the set $\{g(X) : X \in \mathcal F\}$. We allow $M(z) = +\infty$ when things work out that way; nothing special happens in this case. We have $M(0) = f(X^*)$, where $X^*$ is the optimal solution to our problem.
Assuming $\mathcal F$ is a convex domain, $M$ is a concave function of $z$. To see this, first suppose for simplicity that $M(z_1) = f(X_1)$ and $M(z_2) = f(X_2)$ with $g(X_i) = z_i$. Then for any $t \in [0,1]$, if $z = tz_1 + (1-t)z_2$, let $X = tX_1 + (1-t)X_2$; by linearity of $g$ we get $g(X) = z$, so $$ M(z) \ge f(X) = t f(X_1) + (1-t) f(X_2) = t M(z_1) + (1-t) M(z_2), $$ proving that $M$ is concave. When $M(z_i)$ can be approached but not attained by any $X$ with $g(X) = z_i$, apply the same argument to a sequence of $X$'s approaching the supremum and pass to the limit.
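To make the value function concrete, here is a toy finite-dimensional sketch (my own example, not from the problem: the box $[0,1]^2$ stands in for $\mathcal F$, with linear $f$ and $g$, so $M$ can be computed in closed form and its concavity checked directly):

```python
# Toy analogue of the value function M(z): maximize f(x) = c.x over the
# box [0,1]^2 subject to g(x) = a.x - b = z. Since the objective is
# linear and the feasible set is a segment, the max sits at an endpoint.
# (Example data c, a, b are my own choices for illustration.)

def M(z, c=(1.0, 2.0), a=(1.0, 1.0), b=1.0):
    """Value function: max c.x over x in [0,1]^2 with a.x = b + z."""
    target = b + z
    # x2 = (target - a[0]*x1)/a[1] must lie in [0,1] -> feasible x1 range
    lo = max(0.0, (target - a[1]) / a[0])
    hi = min(1.0, target / a[0])
    if lo > hi:
        return None  # constraint value z is infeasible

    def val(x1):
        x2 = (target - a[0] * x1) / a[1]
        return c[0] * x1 + c[1] * x2

    # objective is linear in x1, so the optimum is at an endpoint
    return max(val(lo), val(hi))

# midpoint concavity check: M(0) >= (M(-1/2) + M(1/2)) / 2
assert M(0.0) >= 0.5 * (M(-0.5) + M(0.5))
```

In this instance $M(-0.5) = 1$, $M(0) = 2$, $M(0.5) = 2.5$, so $M$ is concave with a kink at $z = 0$, which is exactly the situation where several valid slopes $\lambda^*$ exist.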
By concavity of $M$, there is a slope $\lambda^*$ such that $M(z) \le M(0) + \lambda^* z = f(X^*) + \lambda^* z$ for all $z$ in the domain of $M$... assuming $0$ is an interior point of that domain. To see why this condition is needed, imagine a case where $M(z) = \sqrt{z}$ on $[0,\infty)$: the supporting line at $0$ would have to be vertical, so no finite $\lambda^*$ works. We can take $\lambda^* = M'(0)$ when that derivative exists; if $0$ is a corner point of $M$, any slope between the one-sided derivatives is a valid choice of $\lambda^*$.
For any $X \in \mathcal F$, we have $M(g(X)) \ge f(X)$, because $X$ is feasible for the optimization problem defining $M(g(X))$. Combining this with the supergradient inequality gives $f(X) \le M(g(X)) \le f(X^*) + \lambda^* g(X)$, or equivalently $f(X^*) \ge f(X) - \lambda^* g(X)$.
Since $g(X^*) = 0$, this gives $f(X^*) - \lambda^* g(X^*) \ge f(X) - \lambda^* g(X)$ for every $X \in \mathcal F$. In other words: $X^*$ is a global (and hence in particular local) maximizer of the Lagrangian problem $$\max_{X\in\mathcal{F}} \int_{0}^1 X(t) (\phi(t)-\lambda^* \psi(t))\,dt.$$
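The whole chain can be checked numerically in the same toy finite-dimensional setting (again my own example, with the box $[0,1]^2$ in place of $\mathcal F$ and linear $f$, $g$): the constrained maximizer $x^*$ is also an unconstrained global maximizer of $f - \lambda^* g$ once $\lambda^*$ is a supergradient of $M$ at $0$.

```python
# Toy check (my own example): f(x) = c.x, g(x) = a.x - 1 over the box
# [0,1]^2. The constrained maximizer x* of f on {g = 0} should globally
# maximize the Lagrangian f - lam*g over the whole box, for a suitable lam.

from itertools import product

c = (1.0, 2.0)  # objective: f(x) = c[0]*x1 + c[1]*x2
a = (1.0, 1.0)  # constraint: g(x) = a[0]*x1 + a[1]*x2 - 1 = 0

def f(x):
    return c[0] * x[0] + c[1] * x[1]

def g(x):
    return a[0] * x[0] + a[1] * x[1] - 1.0

grid = [i / 100 for i in range(101)]
points = list(product(grid, grid))       # discretized box [0,1]^2

# Constrained problem: maximize f along the feasible line g = 0.
feasible = [(x1, 1.0 - x1) for x1 in grid]
x_star = max(feasible, key=f)

# Here the one-sided slopes of M at 0 are 2 (left) and 1 (right), so any
# lam in [1, 2] is a supergradient; pick the midpoint.
lam = 1.5

def lagrangian(x):
    return f(x) - lam * g(x)

# Unconstrained maximizer of the Lagrangian over the whole box.
x_lagr = max(points, key=lagrangian)
assert abs(lagrangian(x_star) - lagrangian(x_lagr)) < 1e-9
```

Both problems are maximized at the same point, as the answer's inequality $f(X^*) - \lambda^* g(X^*) \ge f(X) - \lambda^* g(X)$ predicts. Note that the converse direction (a maximizer of the Lagrangian being feasible and optimal for the constrained problem) requires $g$ to vanish at that maximizer, which is not automatic.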