This is not a homework problem, although it wouldn't surprise me if it appears in a textbook somewhere. Is there an explicit solution for the linear program $$\max_x \; c^Tx \quad \text{s.t.} \quad d^Tx = q, \quad \mathbf{1}^Tx = 1, \quad x \geq 0,$$ where $\mathbf{1}$ denotes the vector of all ones? Obviously, we know that there is an optimal solution with at most two nonzero entries $x_i$ (a basic optimal solution has at most as many nonzeros as equality constraints), and if we were to remove the constraint $d^Tx=q$, then the solution would simply have $x_i=1$, where $i$ is the index of the largest element of $c$, and all other entries zero.
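As a quick numerical sanity check of the claim in the question, here is a minimal sketch that solves a small arbitrary instance with SciPy's `linprog` and confirms that the optimal vertex has at most two nonzero entries (the instance data is made up for illustration):

```python
import numpy as np
from scipy.optimize import linprog

# Small illustrative instance (values chosen arbitrarily).
c = np.array([3.0, 1.0, 2.0, 5.0])
d = np.array([0.5, 1.5, 1.0, 2.0])
q = 1.2

# linprog minimizes, so negate c to maximize c^T x.
res = linprog(
    -c,
    A_eq=np.vstack([d, np.ones_like(d)]),  # rows: d^T x = q and 1^T x = 1
    b_eq=[q, 1.0],
    bounds=[(0, None)] * len(c),           # x >= 0
)

x = res.x
print("x =", x)
print("objective =", c @ x)
# For this instance the optimum is unique, so the solver returns a basic
# (vertex) solution with at most two nonzero entries.
print("nonzeros:", int(np.count_nonzero(x > 1e-9)))
```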
Explicit solution for a linear program with two constraints
681 views · Asked by Jennifer Gao (https://math.techqa.club/user/jennifer-gao/detail) at 2026-04-19 21:23:17
There is 1 answer below.
This post captured my imagination for some reason!
I spent a lot of time looking at the dual problem. I managed to simplify it down to this univariate piecewise-linear minimization: $$\operatorname*{minimize}_y \; \max_i \; \big(c_i - (q-d_i)\,y\big)$$ (Take the dual, eliminate the multiplier of the $\mathbf{1}^Tx=1$ constraint, and fold the resulting linear term into the max.) Here $y$ is, up to sign, the Lagrange multiplier for the $d^Tx=q$ constraint.
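The minimizer of this piecewise-linear objective can be found by checking every pairwise crossing of the lines $c_i - (q-d_i)y$, which is an $O(n^2)$ search. A minimal sketch, assuming the problem is feasible and coercive so the minimum sits at such a crossing (the function name and structure are my own, not from the answer):

```python
import numpy as np
from itertools import combinations

def dual_minimize(c, d, q):
    """Minimize g(y) = max_i [c_i - (q - d_i) * y] by evaluating g at every
    pairwise intersection of the lines (plus y = 0 as a fallback); for a
    coercive convex piecewise-linear max, the minimizer lies at a crossing."""
    c = np.asarray(c, dtype=float)
    d = np.asarray(d, dtype=float)

    def g(y):
        return np.max(c - (q - d) * y)

    best_y, best_val = 0.0, g(0.0)
    for i, j in combinations(range(len(c)), 2):
        if d[i] == d[j]:
            continue  # parallel lines never cross
        # Solve c_i - (q - d_i) y = c_j - (q - d_j) y for y.
        y = (c[i] - c[j]) / (d[j] - d[i])
        val = g(y)
        if val < best_val:
            best_y, best_val = y, val
    return best_y, best_val
```

By strong duality, the optimal value returned here matches the optimal value of the original primal LP, and the two indices attaining the max at the minimizer are the support of the primal solution.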
Assuming the original problem is feasible and otherwise nontrivial (i.e., $d\neq q\mathbf{1}$), the solution to the dual problem lies at a vertex: a point at which two of the lines $c_i - (q-d_i) y$ attain the maximum simultaneously. The two indices $i_1$ and $i_2$ that achieve this equality are the indices of the two nonzero elements of $x$. (Ties can be broken arbitrarily.) You can prove this with complementary slackness, if you are so inclined.
So the faster you can solve this piecewise linear minimization, the faster you can solve the problem.
But once it was clear that you could find a solution with no more than two nonzeros (assuming, that is, the problem is feasible in the first place), the simplest thing to do is just to enumerate the pairs. That is, for each pair $1 \le i < j \le n$, solve the two-variable problem analytically, and select the best result over all pairs. In other words, for each pair of indices, consider the linear system $$\begin{bmatrix} d_i & d_j \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x_i \\ x_j \end{bmatrix} = \begin{bmatrix} q \\ 1 \end{bmatrix}$$ There are three cases:
- If $d_i \neq d_j$, the system has the unique solution $x_i = \frac{q-d_j}{d_i-d_j}$, $x_j = \frac{d_i-q}{d_i-d_j}$; the pair yields a candidate only if both values are nonnegative.
- If $d_i = d_j = q$, the two equations coincide, so any $x_i, x_j \geq 0$ with $x_i + x_j = 1$ is feasible; the best candidate puts all the weight on whichever of $c_i$, $c_j$ is larger.
- If $d_i = d_j \neq q$, the system is inconsistent, and the pair yields no candidate.
Once you've scanned all the pairs, the candidate with the best objective value is your answer. (If none of the pairs yielded a candidate, the problem is infeasible.) So you have at most $n(n-1)/2$ candidates to consider, each of which costs $O(1)$ flops to solve. That's pretty darn cheap as algorithms go.
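The pair enumeration above can be sketched as follows (the function name, tolerance handling, and return convention are my own choices, not part of the answer):

```python
import numpy as np
from itertools import combinations

def solve_lp(c, d, q, tol=1e-12):
    """Brute-force max c^T x s.t. d^T x = q, sum(x) = 1, x >= 0 by
    enumerating all index pairs and handling the three cases described
    above. Returns (x, value), or (None, None) if infeasible."""
    c = np.asarray(c, dtype=float)
    d = np.asarray(d, dtype=float)
    n = len(c)
    best, best_val = None, -np.inf
    for i, j in combinations(range(n), 2):
        if abs(d[i] - d[j]) > tol:
            # Case 1: unique solution of the 2x2 system.
            xi = (q - d[j]) / (d[i] - d[j])
            xj = 1.0 - xi
            if xi < -tol or xj < -tol:
                continue  # violates x >= 0
        elif abs(d[i] - q) <= tol:
            # Case 2: d_i = d_j = q; any convex combination is feasible,
            # so put all weight on the larger objective coefficient.
            xi, xj = (1.0, 0.0) if c[i] >= c[j] else (0.0, 1.0)
        else:
            # Case 3: d_i = d_j != q; the system is inconsistent.
            continue
        val = c[i] * xi + c[j] * xj
        if val > best_val:
            best, best_val = (i, j, xi, xj), val
    if best is None:
        return None, None
    i, j, xi, xj = best
    x = np.zeros(n)
    x[i], x[j] = max(xi, 0.0), max(xj, 0.0)
    return x, best_val
```

Each pair costs constant time, so the whole scan is $O(n^2)$, matching the count of $n(n-1)/2$ candidates above.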
You don't have to settle for this brute-force method if you can instead find a way to solve that dual piecewise-linear minimization more quickly. But when I began to look at this, I quickly concluded that it was probably an $O(n^2)$ enterprise as well. Once I realized that, I figured there was little point in spending time on anything but the most obvious approach.