I am trying to find a counterexample to the following matrix inequality, which I suspect does not hold: $$P^+(A-B)\leq A+B,$$ where $A,B$ are positive semidefinite matrices and $P^+(X)$ denotes the orthogonal projector onto the positive subspace of $X$ (the span of its eigenvectors with positive eigenvalues).
2026-04-24 12:35:30
Does $A,B>0$ imply that the projection onto the positive subspace of $A-B$ is smaller than $A+B$?
85 views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 answer below.
Let $A=\frac{1}{10}\begin{bmatrix} 2&4\\4&8\end{bmatrix}$ and $B=\frac{1}{10}\begin{bmatrix}9&-3\\-3&1\end{bmatrix}$; both are positive semidefinite of rank one. Then $A-B=\frac{1}{10}\begin{bmatrix}-7&7\\7&7\end{bmatrix},$ which has eigenvalues $\pm \sqrt{\frac{49}{50}},$ with corresponding eigenvectors $\begin{bmatrix}1\\1\pm\sqrt{2}\end{bmatrix}.$ The positive subspace is spanned by $v=\begin{bmatrix}1\\1+\sqrt{2}\end{bmatrix},$ with $\|v\|^{2}=1+(1+\sqrt{2})^{2}=4+2\sqrt{2},$ so $$P^{+}(A-B)=\frac{vv^{*}}{\|v\|^{2}}=\frac{1}{4+2\sqrt{2}}\begin{bmatrix}1&1+\sqrt{2}\\1+\sqrt{2}&3+2\sqrt{2}\end{bmatrix}=\frac{1}{4}\begin{bmatrix}2-\sqrt{2}&\sqrt{2}\\\sqrt{2}&2+\sqrt{2}\end{bmatrix}.$$ Now take the test vector $y=\begin{bmatrix}1\\4\end{bmatrix}.$ Then $$y^{*}P^{+}(A-B)y=\frac{(5+4\sqrt{2})^{2}}{4+2\sqrt{2}}=\frac{34+23\sqrt{2}}{4}\approx 16.63,$$ while $$y^{*}(A+B)y=\frac{1}{10}\begin{bmatrix}1&4\end{bmatrix}\begin{bmatrix}11&1\\1&9\end{bmatrix}\begin{bmatrix}1\\4\end{bmatrix}=\frac{163}{10}=16.3.$$ Therefore, $P^{+}(A-B)\not\leq A+B$ in this case. (Equivalently, $\det\bigl(A+B-P^{+}(A-B)\bigr)=-\frac{1}{50}<0,$ so $A+B-P^{+}(A-B)$ has a negative eigenvalue.)
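The computation above can be double-checked numerically. A minimal sketch in Python (assuming NumPy is available; the variable names are mine, not from the original answer):

```python
import numpy as np

# The matrices from the counterexample above.
A = np.array([[2.0, 4.0], [4.0, 8.0]]) / 10
B = np.array([[9.0, -3.0], [-3.0, 1.0]]) / 10

# Orthogonal projector onto the positive subspace of A - B.
w, V = np.linalg.eigh(A - B)  # eigenvalues in ascending order
P = sum(np.outer(V[:, i], V[:, i]) for i in range(2) if w[i] > 0)

# If P <= A + B held, A + B - P would be positive semidefinite;
# its smallest eigenvalue is (5 - 3*sqrt(3))/10, which is negative.
lam_min = np.linalg.eigvalsh(A + B - P).min()
print(lam_min)  # ≈ -0.0196

# The explicit witness y = (1, 4): y*Py > y*(A+B)y.
y = np.array([1.0, 4.0])
print(y @ P @ y, y @ (A + B) @ y)  # ≈ 16.63 vs 16.3
```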
Admittedly, $A$ and $B$ are only positive semidefinite here, but a slight modification yields a counterexample with positive definite matrices. For $\varepsilon>0,$ set $A'=A+\varepsilon I$ and $B'=B+\varepsilon I$; these are positive definite, and $A'-B'=A-B,$ so $P^{+}(A'-B')=P^{+}(A-B).$ Since $y^{*}(A'+B')y=y^{*}(A+B)y+2\varepsilon\|y\|^{2}=16.3+34\varepsilon,$ we still have $y^{*}P^{+}(A'-B')y>y^{*}(A'+B')y$ whenever $34\varepsilon<\frac{34+23\sqrt{2}}{4}-16.3\approx 0.33,$ e.g. for $\varepsilon<0.009.$
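The perturbed, strictly positive definite pair can be checked the same way (a NumPy sketch with one small admissible choice of $\varepsilon$; again the names are mine):

```python
import numpy as np

# The positive definite perturbation A' = A + eps*I, B' = B + eps*I.
A = np.array([[2.0, 4.0], [4.0, 8.0]]) / 10
B = np.array([[9.0, -3.0], [-3.0, 1.0]]) / 10
eps = 5e-3  # small enough that the gap survives

Ap = A + eps * np.eye(2)
Bp = B + eps * np.eye(2)

# A' - B' = A - B, so the projector onto the positive subspace is unchanged.
w, V = np.linalg.eigh(Ap - Bp)
P = np.outer(V[:, 1], V[:, 1])  # eigenvector of the positive eigenvalue

y = np.array([1.0, 4.0])
print(y @ P @ y, y @ (Ap + Bp) @ y)  # ≈ 16.63 vs 16.47: still violated
```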
Several features make this example work. Both $A$ and $B$ are rank one: each has one eigenvalue equal to $1$ and one equal to $0$ (or $\varepsilon$-shifted, in the perturbed example), and the eigenvectors corresponding to the large eigenvalues, $\begin{bmatrix}1\\2\end{bmatrix}$ and $\begin{bmatrix}3\\-1\end{bmatrix},$ are nearly orthogonal: $\frac{\langle [1,2],[3,-1]\rangle}{\sqrt{5}\sqrt{10}}=\frac{1}{5\sqrt{2}}\approx 0.14.$ Because $A$ and $B$ have the same norm, the positive eigenvector of $A-B$ lies roughly halfway between these two directions, far from both. Along that unit eigenvector $x$ the two sides essentially tie: $x^{*}P^{+}(A-B)x=1$ since $x$ spans the positive subspace, while $x^{*}(A+B)x$ is also (here exactly) $1.$ The strict violation shows up in nearby directions such as $y=[1,4]^{T}$ above, where $y^{*}P^{+}(A-B)y$ stays close to $\|y\|^{2}$ because $y$ is still nearly parallel to the positive subspace, while $y^{*}(A+B)y$ dips below it. (Here is where a bit of tinkering might be necessary to make sure this happens as desired.)
In higher dimensions, I would suggest considering matrices of the form $A=uu^{T}+\varepsilon I$ and $B=vv^{T}+\varepsilon I$ for unit vectors $u,v$ that are close to, but not exactly, orthogonal (note that if they are exactly orthogonal, $A$ and $B$ commute, hence are simultaneously diagonalizable, and the counterexample is impossible). Then $A-B=uu^{T}-vv^{T}$ has exactly one positive and one negative eigenvalue, and unit vectors $y$ near the positive eigenvector should satisfy $y^{*}P^{+}(A-B)y>y^{*}(A+B)y$ when $\varepsilon$ is sufficiently small (this obviously requires proof).
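The higher-dimensional recipe can be tested numerically as well. A sketch in $\mathbb{R}^{3}$, reusing the nearly orthogonal directions $[1,2,0]$ and $[3,-1,0]$ from the two-dimensional example (the embedding, the value of $\varepsilon$, and the $10^{-12}$ eigenvalue tolerance are my choices):

```python
import numpy as np

# A 3-D instance of the suggested construction A = uu^T + eps*I,
# B = vv^T + eps*I with u, v unit and nearly orthogonal (u.v = 1/(5*sqrt(2))).
u = np.array([1.0, 2.0, 0.0]) / np.sqrt(5)
v = np.array([3.0, -1.0, 0.0]) / np.sqrt(10)
eps = 1e-3

A = np.outer(u, u) + eps * np.eye(3)
B = np.outer(v, v) + eps * np.eye(3)

# A - B = uu^T - vv^T: one positive, one negative, one zero eigenvalue.
w, V = np.linalg.eigh(A - B)
P = sum(np.outer(V[:, i], V[:, i]) for i in range(3) if w[i] > 1e-12)

# P <= A + B fails: A + B - P has a negative eigenvalue.
print(np.linalg.eigvalsh(A + B - P).min())  # ≈ -0.0176
```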