I need to prove that $Var(X)-Cov(X,Y)[Var(Y)]^{-1}Cov(Y,X)$ is positive semidefinite, where $X,Y$ are random vectors that may have different dimensions. I believe this is a generalization of the Cauchy–Schwarz inequality, but I know very little about inequalities of this kind.
Is $Var(X)-Cov(X,Y)[Var(Y)]^{-1}Cov(Y,X)$ positive semidefinite?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
I think the easiest way to do this is to use the Schur complement formula, which states that for an invertible matrix $Z$ with block decomposition $$Z=\begin{bmatrix}A & B\\ C & D \end{bmatrix}$$ (where $D$ is also invertible), the upper-left block of $Z^{-1}$ is given by $(A-BD^{-1}C)^{-1}$.
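As a complementary sketch (not part of the original argument): when $Z$ is symmetric, so that $C=B^{T}$, the Schur complement also arises from a block-triangular congruence that diagonalizes $Z$,

$$\begin{bmatrix}I & -BD^{-1}\\ 0 & I\end{bmatrix}\begin{bmatrix}A & B\\ B^{T} & D\end{bmatrix}\begin{bmatrix}I & 0\\ -D^{-1}B^{T} & I\end{bmatrix}=\begin{bmatrix}A-BD^{-1}B^{T} & 0\\ 0 & D\end{bmatrix}.$$

Since a congruence $TZT^{T}$ with invertible $T$ preserves positive (semi)definiteness, $A-BD^{-1}B^{T}$ is positive semidefinite whenever $Z$ is, which covers the case where $Z$ is only semidefinite, provided $D=\text{Var}(Y)$ is invertible.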
Considering $Z$ to be the covariance matrix of the joint vector $(X,Y)$, we see that the quantity $(\text{Var}(X)-\text{Cov}(X,Y)\text{Var}(Y)^{-1}\text{Cov}(Y,X))^{-1}$ coincides with the upper-left block of $Z^{-1}$, with respect to the block decomposition induced by $(X,Y)$. If $Z$ is positive definite, then so is $Z^{-1}$, and since a principal submatrix of a positive definite matrix is positive definite, as is its inverse, we can conclude that $\text{Var}(X)-\text{Cov}(X,Y)\text{Var}(Y)^{-1}\text{Cov}(Y,X)$ is positive definite. (In general a covariance matrix is only guaranteed to be positive semidefinite; this argument assumes $Z$, and in particular $\text{Var}(Y)$, is invertible.)
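As a quick numerical sanity check (not a proof), one can draw correlated samples of random vectors of different dimensions, estimate the joint covariance, and confirm that the matrix in question has no eigenvalue below zero up to rounding. The dimensions, sample size, and mixing matrix below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, N = 3, 2, 10_000  # dim(X), dim(Y), number of samples

# Draw correlated samples: Y is Gaussian, X depends linearly on Y plus noise.
Y = rng.standard_normal((N, n))
A = rng.standard_normal((m, n))              # hypothetical mixing matrix
X = Y @ A.T + 0.5 * rng.standard_normal((N, m))

# Sample covariance of the joint vector (X, Y), partitioned into blocks.
Z = np.cov(np.hstack([X, Y]).T)
var_X, var_Y = Z[:m, :m], Z[m:, m:]
cov_XY = Z[:m, m:]                           # Cov(X, Y); Cov(Y, X) is its transpose

# Var(X) - Cov(X,Y) Var(Y)^{-1} Cov(Y,X), via a linear solve instead of an inverse.
S = var_X - cov_XY @ np.linalg.solve(var_Y, cov_XY.T)
eigs = np.linalg.eigvalsh(S)
print("smallest eigenvalue:", eigs.min())    # nonnegative up to rounding error
```

Using `np.linalg.solve` rather than forming $\text{Var}(Y)^{-1}$ explicitly is the numerically safer way to apply the Schur complement.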