Suppose I am given two vectors, $x, y$ such that $x\prec y$ ($x$ is majorized by $y$). Suppose further that these are infinite dimensional vectors, such that $\sum_i x_i = \sum_i y_i = 1$, and they are already arranged in decreasing order. We know as a consequence of the majorization relation that we can write $x = Dy$, where $D$ is a doubly stochastic matrix ($\sum_i a_{ij} = \sum_j a_{ij} = 1$, $a_{i,j} \geq 0 \ \forall \ i,j$ ). Is it possible to analytically find $D$? If so, is there some continuous-variable method than we can use to achieve this?
2026-03-14 14:14:26
Find Linear Map between Majorized Vectors
74 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 answer below.
I believe there is a general procedure for solving this problem. (I state it here for finite dimension $n$; for the infinite-dimensional case one would need a limiting argument over truncations.) We have two sequences $x = (x_1, x_2, \ldots, x_n)$ and $y = (y_1, y_2, \ldots, y_n)$ such that $x \prec y$. Suppose both are already in descending order (if not, permuting the entries yields the desired ordering). First, we observe that the following two conditions hold:
$$ y_n \leq x_1 \leq y_1 \\ y_k \leq x_1 \leq y_{k-1}, $$
for some index $k$ (take $k$ to be the smallest index with $y_k \leq x_1$; if $x_1 = y_1$, the first step is trivial). We can thus write
$$ x_1 = t_1 y_1 + (1-t_1)y_k, \qquad t_1 = \frac{x_1 - y_k}{y_1 - y_k} \in [0,1], $$
and define new vectors $x^{'}, y^{'}$ such that:
$$ x^{'} = (x_2, \cdots, x_n)\\ y^{'} = (y_2, \cdots, y_{k-1}, (1-t_1)y_1 + t_1y_k, y_{k+1}, \cdots, y_n) $$
Now, we can write
$$ y^{'}_{n-1} \leq x^{'}_1 \leq y^{'}_1 \\ y^{'}_j \leq x^{'}_1 \leq y^{'}_{j-1}, $$
for some index $j$. We can thus write:
$$ x^{'}_1 = t_2y^{'}_1 + (1-t_2)y^{'}_j $$
Now we may define new vectors $x^{''}, y^{''}$ such that:
$$ x^{''} = (x_3, \cdots, x_n)\\ y^{''} = (y^{'}_2, \cdots, y^{'}_{j-1}, (1-t_2)y^{'}_1 + t_2y^{'}_j, y^{'}_{j+1}, \cdots, y^{'}_{n-1}) $$
The procedure continues until we can write $x = Dy$, where $D$ is a doubly stochastic matrix ($\sum_i a_{ij} = \sum_j a_{ij} = 1$, $a_{ij} \geq 0 \ \forall \ i,j$) given as a product of T-transforms, i.e. $D = T_r\cdots T_2 T_1$. A T-transform acts on just two components of a vector and as the identity on all other components; on the two affected components it has the form:
$$ \begin{bmatrix} t&1-t\\ 1-t & t \end{bmatrix} $$
Each T-transform is a convex combination of the identity and a transposition (the permutation swapping the two affected components); the product $D$ is therefore doubly stochastic, and by Birkhoff's theorem it can in turn be written as a convex combination of permutation matrices.
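The reduction above can be sketched in code. The following is my own minimal illustration (the function names are my invention, not from the answer): it uses exact `Fraction` arithmetic, and for simplicity it re-sorts the not-yet-fixed tail at each step, so the accumulated $D$ is a product of T-transforms and permutation matrices — still doubly stochastic; the pure T-transform form follows by relabeling indices.

```python
from fractions import Fraction

def t_transform(n, i, k, t):
    # Identity matrix except for a 2x2 doubly stochastic block on rows/columns i, k.
    T = [[Fraction(int(r == c)) for c in range(n)] for r in range(n)]
    T[i][i] = T[k][k] = t
    T[i][k] = T[k][i] = 1 - t
    return T

def matmul(A, B):
    n = len(A)
    return [[sum(A[r][m] * B[m][c] for m in range(n)) for c in range(n)]
            for r in range(n)]

def matvec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

def majorization_map(x, y):
    """Given x majorized by y (both sorted descending), build a doubly
    stochastic D with x = D y by repeated T-transforms."""
    n = len(x)
    x = [Fraction(v) for v in x]
    z = [Fraction(v) for v in y]          # current image of y
    D = [[Fraction(int(r == c)) for c in range(n)] for r in range(n)]
    for i in range(n - 1):
        # Re-sort the remaining tail so z[i] is its largest entry,
        # recording the permutation in D.
        order = list(range(i)) + sorted(range(i, n), key=lambda m: z[m], reverse=True)
        P = [[Fraction(int(c == order[r])) for c in range(n)] for r in range(n)]
        z = matvec(P, z)
        D = matmul(P, D)
        if z[i] == x[i]:
            continue                       # t = 1: nothing to mix
        # Smallest k > i with z[k] <= x[i], so z[k] <= x[i] <= z[i].
        k = min(m for m in range(i + 1, n) if z[m] <= x[i])
        t = (x[i] - z[k]) / (z[i] - z[k])  # solves x_i = t z_i + (1-t) z_k
        T = t_transform(n, i, k, t)
        z = matvec(T, z)
        D = matmul(T, D)
    return D
```

For the worked example that follows, `majorization_map([5, 3, 2], [6, 3, 1])` reproduces exactly the product $T_2 T_1$ derived there.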
Let's consider an example:
$$ x = (5, 3, 2)\\ y = (6, 3, 1) $$
We have that:
$$ y_n = 1 \leq x_1 = 5 \leq y_1 = 6\\ y_k = y_2 = 3 \leq x_1 = 5 \leq y_{k-1} = y_1 = 6 $$
Therefore, we have that
$$ 5 = x_1 = t_1y_1 + (1-t_1)y_k = 6t_1 + 3(1-t_1)\\ \therefore t_1 = \frac{2}{3} $$
We now consider the reduced vectors:
$$ x^{'} = (3, 2)\\ y^{'} = ((1-t_1)y_1 + t_1y_k, 1) = ((\frac{1}{3})\cdot6 + \frac{2}{3}\cdot3, 1) = (4,1) $$
Note that $x^{'}\prec y^{'}$. Solving for the second t-parameter:
$$ y^{'}_n = 1 \leq x^{'}_1 = 3 \leq y^{'}_1 = 4\\ y^{'}_k = y^{'}_2 = 1 \leq x^{'}_1 = 3 \leq y^{'}_{k-1} = y^{'}_1 = 4\\ 3 = x^{'}_1 = t_2y^{'}_1 + (1-t_2)y^{'}_k = 4t_2 + 1(1-t_2)\\ \therefore t_2 = \frac{2}{3} $$
Now note that:
$$ x^{''} = (2)\\ y^{''} = ((1-t_2)y^{'}_1 + t_2y^{'}_k) = \left(\tfrac{1}{3}\cdot4 + \tfrac{2}{3}\cdot1\right) = (2) $$
Hence, we have solved for the overall transformation matrix:
$$ x = Dy = T_2 T_1 y\\ \begin{pmatrix}5\\3\\2 \end{pmatrix} = \begin{bmatrix} 1 & 0 & 0\\ 0&t_2&1-t_2\\ 0&1-t_2 & t_2 \end{bmatrix} \begin{bmatrix} t_1&1-t_1 &0\\ 1-t_1 & t_1&0\\ 0&0&1 \end{bmatrix} \begin{pmatrix}6\\3\\1 \end{pmatrix} $$
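As a sanity check (my own addition, not part of the original answer), the two displayed T-transforms can be applied to $y$ numerically; `Fraction` keeps the thirds exact:

```python
from fractions import Fraction

t1 = t2 = Fraction(2, 3)
T1 = [[t1, 1 - t1, 0],
      [1 - t1, t1, 0],
      [0, 0, 1]]                     # mixes components 1 and 2
T2 = [[1, 0, 0],
      [0, t2, 1 - t2],
      [0, 1 - t2, t2]]               # mixes components 2 and 3

def matvec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

y = [6, 3, 1]
step1 = matvec(T1, y)                # T1 y = [5, 4, 1]
x = matvec(T2, step1)                # T2 T1 y = [5, 3, 2]
```

The intermediate vector $(5, 4, 1)$ is exactly the $(x_1, y^{'})$ stage of the reduction, confirming the factorization $D = T_2 T_1$.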